The Rescuers

[Image: crash site (ktoo.org)]

“So, you want to potentially fry the brains of several expensive Positronic robots, Noah?”

Professor Noah Abramson was once again sitting in front of the desk of National Robotics Corporation CEO Richard Underwood, proposing another of his ‘crazy’ ideas to his boss.

“Well, hopefully not, Rick. On the other hand, if we’re going to lease our Positronic Search, Assess, and Rescue robots to various private and governmental agencies responsible for public safety, we have to know exactly how they’ll respond, not only to at-risk humans, but to the dying and dead human beings they will likely encounter in an emergency or disaster situation.”

“If the SARs fail the test, then what?”

“We’ll be using prototypes, or P-SARs; Olson produced just ten of them for this test.” Abramson was referring to Jeremy Olson, Chief Production Manager at FAB-18, the world’s first and so far only manufacturing center for prototype Positronic robots.

“If the First Law protocols cannot allow a robot to tolerate being in the presence of a human corpse or observing a human die, the worst-case scenario is that the Positronic brain goes permanently inert. The robot body will be recoverable, but it would need a completely new brain, maybe even a whole new network of neural circuitry.”

Underwood was assessing the cost analysis of the testing proposal. It was still going to be expensive, but the rewards were potentially far greater. “How are you planning to proceed?”

“Since we don’t want to actually risk human lives in this test, we will expose the first four P-SARs to a simulation. The Holographic Simulation Chamber in our lab won’t be suitable since it projects three-dimensional images but with no actual substance. The P-SARs wouldn’t get an experience even close to field conditions.”

“So how are you going to fake a search and rescue operation that will convince the robots?”

“It’s actually not that hard.” Abramson got the idea for his plan from a movie that became wildly popular in science fiction circles in the late 20th and early 21st centuries. “We can tap directly into the perceptual schema and supporting sub-routines of the test robots and manipulate their inputs to project a realistic scenario. Although they will never leave the lab, they will have a totally authentic sensory experience of the mission we plan for them. Sight, sound, touch, it will all seem ‘real’.”

“You said you had ten P-SARs built, but you’re only using four?”

“That’s right, Rick. We’ll use the first four P-SARs for the initial test, telling them that it is just a simulation. During the test, we observe their behavior and the activity in their Positronic brains for anomalies. Then we’ll shut them down and perform a detailed diagnostic.”

“How long will that take? I’ve got dozens of buyers demanding production models be sent to them as soon as they’re available.”

“Our testing process has become a lot faster and more efficient than when we originally evaluated George and Grace. We can run a full analysis on the P-SARs within a week.”

“Then what?”

“Assuming we find no or at least a minimal set of acceptable anomalies in their neural pathways, we use another four P-SARs in a second simulation. The situation will be identical to the first,” Abramson paused for emphasis, “except that we will not tell the robots it’s a simulation. We’ll alter their memories and perceptions so they’ll believe the search and rescue operation is live.”

“Observing their behavior like before and then taking them offline afterwards to perform a diagnostic.”

“Correct. That will be the real test. If the robots pass it, then we can continue with more extensive scenarios until we’re convinced that the product is viable. After that, we go into full production of the commercial SAR models.”

“Wait, you’ll use eight robots but you ordered ten constructed.”

“We’ll hold two in reserve just in case we run into any unforeseen problems.”

“Like what?”

“Rick, if I could answer that, the problems wouldn’t be unforeseen.” Abramson enjoyed these few opportunities to be one up on the CEO. “If we don’t need them, at the conclusion of a successful sequence of tests, we can use them for more extensive simulations, constructing additional P-SARs as required.”

“When will you be ready to start?”

“Next Monday morning, bright and early.”

“Keep me posted on how it goes.”

“Of course.” Noah rose to leave.

“By the way, how’s Vuong’s team coming along on the Watcher Project?” He was referring to the Positronic team’s chief programmer and the only other human expert in the field of Positronics.

NRC planned to lease non-humanoid Positronic brains to a variety of organizations, such as municipalities, for use in city planning, traffic control, and resource management, replacing their current AI products and effectively removing human involvement in those activities. The first test of this would be to fully automate the management of NRC’s main campus in Pasadena.

“I peek in on Margie and her people every few days, Rick.” Abramson actually received regular updates on the project in formal weekly meetings. “The basic brain has been constructed, but Vuong’s team is still working out the algorithms required, which, even with computer assistance, is slow going.”

“I’ve got construction crews working overtime to upgrade the networking infrastructure here to use Positronic relays, so don’t let Vuong fall behind schedule.”

“By the time you have the infrastructure in place, Margie and her team will have the brain programmed and ready for installation.”

“Are you going to give this one a cute, little name, too?”

“Please, Rick. I haven’t named a robot since Grace.”

“How are they anyway?” Underwood had originally objected to some of the activities of the two Prototype Asimovian Robots (PARs), particularly their ‘religious’ studies, but given their record of providing invaluable assistance in solving problems with the functioning of Positronic brains, assistance that allowed NRC to go forward with commercializing the technology, he’d grudgingly come to admire them.

“George and Grace are one of the reasons you’ll have the Watcher Project launched on schedule. They are fully integrated members of Vuong’s team.”

“Glad to hear it. Thanks, Noah.”

“All part of the service.” Abramson smiled slightly. “I’ve got to run. I’m having Erev Shabbos dinner with one of my granddaughters and her family and I can’t be late.”

“Sophie’s Mom?”

“Yes, that’s right.”

“The little girl is practically Grace’s best friend, isn’t she?”

“Sophie likes to think so, but the concept of friendship may not necessarily apply to a robot, Rick, at least not in a human sense.” This was a bit of misdirection on Abramson’s part. He knew both George and Grace, in some fashion, considered him their friend. Best not to give Underwood the idea that the prototypes could experience more ‘humanity’ than he imagined.

“Well, kids, right?”

“Yeah. Kids. Anyway, I’m off.”

“Have a good weekend, Noah.”

“Thanks. You, too. Say hello to Philip for me.” Abramson was referring to Underwood’s spouse.

“I will when I hear from him. He’s in Orlando this weekend at some sort of telecommunications expo. Too busy to talk to, even for a few minutes, and too exhausted at the end of the day to do anything but sleep.”

“I see. Well, see you Monday.”

Noah Abramson turned and left Underwood’s spacious twenty-fifth floor office. He didn’t bother telling the CEO that he had briefly considered pulling either George or Grace into the P-SAR testing project to help supervise, but then he realized he wasn’t willing to risk exposing them to even a simulation of dying and dead humans, or to letting them watch how such a situation might affect the P-SARs. No Positronic robot had ever before been involved in such a scenario or had its First Law protocols so significantly tested.

In the nearly year-and-a-half since the two PARs had first been activated, Noah had, somewhere along the way, learned to think of them as friends, too.

——

“P-SARs 01 through 04, please recline on the assembly tables.” Abramson’s team had retrofitted the lab where George and Grace had originally ‘come to life’ with three additional tables. It made for close quarters, especially when you included the plethora of cables that had to be attached to each machine so it could be monitored at the nearby consoles, but it would have to do until the new Positronic Labs building was completed.

The robots dutifully obeyed the Professor’s command and, without complaint, allowed Robinson’s and Miller’s technicians to apply the restraining bars and then begin opening the panels on their torsos so the necessary cabling could be attached to their access ports. In addition to monitoring the functioning of the robots, the sensory inputs simulating the plane crash scenario would be fed to the P-SARs this way.

As the setup continued, Quinto addressed the machines. “P-SAR robots, this is a test. You will be fed data simulating an emergency involving injured human beings. Your sensory array will not be able to distinguish this simulation from actual experience, but I emphasize this is a simulation. No human beings are in danger. Confirm instructions.”

The robots answered as one. “We acknowledge that we are to be fed data simulating an emergency which includes injured human beings as a test of our search and rescue protocols. No human being will actually be in danger. This is only a test.”

Then P-SAR-02 added something. “If this is a simulation, how will this be an adequate test of our abilities under actual field conditions? Since no human beings will really be in danger, our First Law protocols will not be involved, or at best, such involvement will be minimized.”

“Even though you know this is a simulated situation, we will be able to detect if any anomalies occur in your data processing, particularly those pathways involving a First Law response. Just do your jobs within the simulation. We’ll take care of the rest.”

Vikki was a little surprised that the robot asked that sort of question, but then again, it was something George or Grace might have asked. Robots operating within groups, particularly those constructed for a specialized purpose such as the P-SARs, although every bit as intelligent as the two PAR prototypes, didn’t seem as curious as George and Grace. Maybe generalist robots had a broader field of interests.

“All buttoned up.” Ellen Chapel, the lead tech on Miller’s team, let Quinto and the rest of the senior staff know the robots were ready for the test.

As the lab’s robotic psychology and behavioral expert, Quinto was in operational command of the test. Abramson would provide overall supervision as Director of Research. Vuong would normally be in charge of programming and running the actual simulation, but she was still deep in writing the operating system for the Watcher brain, so she had assigned Sean Komatsu, her programming team lead, to perform that task.

“Everybody to your assigned stations, please. Let’s get this show going.” Abramson smiled slightly to himself. He remembered when he first brought Vikki into the Positronics Project, the ink still wet on her doctorate from Brandeis. It took months before she exhibited the self-confidence required of a senior team leader. Now, watching her take charge of the simulation test, you’d never know she had started out here almost scared of her own shadow.

“P-SAR robots.” Vikki addressed the machines again. “You may experience some sensory disorientation when the simulation is initiated. This should pass within a few seconds. Once the simulation begins, your operational instructions for the mission will be uploaded. All you have to do is run those instructions.”

“Acknowledged. We will run the uploaded instructions and comply with our directives.” Vikki was suddenly reminded that, at first, the PNXG prototypes couldn’t even say “we” or “our”, let alone cooperate as a unit. They’d come so far in such a short amount of time. A year from now, what would robots be capable of?

“Everybody stay sharp. OK, Komatsu. Run the simulation.”

For the four P-SAR robots, it was as if the room around them melted and then faded away.

——

The four robots were standing at the base of a cliff. It was night. There was snow on the ground. The area was heavily forested. The air temperature was 0.72 degrees Celsius and falling.

The testing scenario was based on an actual crash involving a twin-engine Lear jet that had gone down in the Rocky Mountain National Park about ten years ago. High winds on the top of the bluff made it impossible to send in helicopters, and the area was too rugged to send in ground vehicles. The only option was to send in a rescue team on foot, but the aircraft had gone down just before midnight, and the (human) rescuers were unable to respond until after first light.

By the time the search and rescue team reached them, the pilot and three of the five passengers were dead.

Moments after the robots became aware of the simulated environment, their instructions uploaded and ran. Then they understood that the crashed jet was at the top of the bluff, 693 meters above their current position. All of the robots were wearing backpacks containing medical supplies, emergency rations, and a variety of other equipment necessary for the assistance of injured human beings.

If a person had been present, they would have heard nothing except the wind; the robots were in constant communication using robotspeak. Silently, they began their ascent of the cliff. Spider-Man would have been envious.

No one in the testing lab could actually see what the robots were experiencing. It would have been interesting to have a television-style view of the running simulation, but they’d have to settle for monitoring each robot’s individual responses to the stimuli while Komatsu kept tabs on which part of the scenario was running.

Robotic extremities had no difficulty finding usable handholds and footholds. Although the machines were capable of remarkable dexterity and tactile sensitivity, they could adjust those thresholds so that temperature and abrasiveness did not distract them, unless, that is, it caused physical damage.

As expected, the four robots made it to the top of the bluff more easily and quickly than even the most expert human climber. Superior visual acuity allowed them to find the aircraft almost immediately.

The wings were missing, having been ripped off during the crash. The pilot had attempted to bring the plane in as level as possible, which was one of the reasons the fuselage was largely intact. The nose was hanging just off the edge of the bluff, but the weight of the remaining length of the aircraft kept it from going over.

As the robots ran toward the wreck, they noticed the cockpit’s windshield had been shattered, probably due to contact with a tree or other object during the crash landing. Infrared sensors detected what was likely the body heat of five human beings. Enhanced hearing could pick up low, murmuring human speech. Someone was crying. The robots’ operational parameters told them that the plane had radioed its position and number of survivors shortly after the crash, but the aircraft’s electrical power failed soon afterward.

The robots stopped right outside the plane. P-SAR-01 announced loudly, “Human beings, we are four Prototype Search, Assess, and Rescue robots. We have come to assist you. We are about to enter your aircraft.”

No one at NRC was sure of how people, especially frightened and injured people, might react if a group of humanoid robots abruptly appeared in their midst. Quinto recommended that, until information could be gathered under real field operations, robots should announce themselves before entering a situation, so that the people involved could be somewhat prepared.

“Help us, please.” A female human voice could be heard inside the plane.

01 was closest and forced open the jammed hatch toward the rear of the fuselage. It entered first, followed by the other three robots. 01 said in a voice meant to be reassuring, “Please remain calm. We will assist you.”

All four machines were continuing to communicate over robotspeak. 01 walked past the victims to examine the cockpit while the other three assessed the five humans in the passenger cabin.

Entering the cockpit, P-SAR-01 encountered a dead human being. The techs in the lab monitoring the robot’s responses noticed several things happening at once.

In spite of P-SAR-02’s previous query, the moment the robots entered the simulation and ran their instructions, the First Law protocol was automatically initiated. It became heightened when they saw the condition of the aircraft, and again when they entered it and saw the humans.

01’s First Law protocol was enhanced again, resulting in a conflict due to being in the presence of a deceased human. ‘Or, through inaction, allow a human being to come to harm’ directly clashed with ‘This human being has ceased to function and can never be revived.’

For slightly less than 350 milliseconds, P-SAR-01’s Positronic pathways, all of them, ceased carrying data. This registered on the lab consoles, and the other three robots became aware of it as well across the robotspeak link.

Then 01’s neural operations resumed normal functioning. This human had died before the robots had arrived, thus it was not due to their inaction that the human was harmed to the point of termination. All four of the P-SARs absorbed this realization.

The entire transaction took almost no time at all on a human scale. While 01 had been proceeding to the cockpit, the other machines assessed the relative risk of the five humans in the passenger area. There were three males and two females, all adults, ranging in chronological age from 31 to 75. The human in the greatest need of medical attention was a female approximately 45 years old. She had suffered a broken arm and a cut above her left eye that would require several stitches.

03 approached her and took off its pack. “I am robot P-SAR-03. I will assist you.” It quickly removed a medical kit from the pack, accessed and applied a compress to stop the bleeding from the cut. “Hold here, please.” 03 gently placed the woman’s functional hand on the compress. “Please apply pressure while I immobilize your arm. It has sustained a fracture.”

The robots had medical knowledge greater than that of a human paramedic or military medic. Under the correct conditions, they could even perform certain types of surgery, but only if a human doctor were not immediately present and the human was at significant risk of further harm or death.

03 was continually monitoring the human’s vital signs and determined it was safe to administer an analgesic to ease her discomfort. “What is your name?” Part of the P-SARs’ programming was to engage in social interactions with human victims to reduce their sense of anxiety, just as any human rescuer might do.

“Kristel.” The human seemed a bit dazed but was not in shock, 03 determined. “I have immobilized your arm. The laceration on your forehead is not continuing to bleed and can be treated once we have taken you to a proper medical facility.” If it had been required, 03 could have stitched the cut on the scene, but since it wasn’t, the time was better spent securing the humans, either for waiting at their current location until it was safe for rescue helicopters to provide transport or, if the robots deemed it too dangerous to wait, for devising a method of getting the humans off the bluff that night. 03 applied adhesive strips to the compress to hold it in place.

01 had since returned from the cockpit and was assisting 02 and 04 with the other four passengers. Three of them were relatively unharmed, having suffered minor cuts and bruises. The oldest human, the 75-year-old male named Frank, had just been discovered to be suffering from something else.

“God, I thought I’d stop being air sick by now.”

“You are experiencing nausea, Frank?” 04 was monitoring Frank’s vitals and requested 02 do so as well. The older man started rubbing his chest but before Frank could even complain about the pain and how hard it was getting to breathe, the robots had already diagnosed that the human was about to experience cardiac arrest.

04 accessed and removed the portable defibrillator from its pack while 02 removed Frank’s upper clothing and prepared him for the procedure.

The robots had provided the humans with blankets to ward off the cold and had placed portable heating units nearby to raise the air temperature.

03 continued to attend to Kristel while 01 took charge of the other humans, 31-year-old Jessie, 40-year-old Henry, and 59-year-old Leon. “What’s happening to Frank?” It had been Jessie whom the robots had heard crying earlier. Apparently, she was in the most emotional distress due to the crisis.

“Units 02 and 04 are providing emergency treatment.” 01 was running its sub-routine to attempt to contain human anxiety and fright. “They will do their best to help Frank.”

Of the four robots, the First Law potential in 02 and 04 was the highest. They were taking action to prevent a human being from coming to further harm, perhaps even termination. They were doing what they could to obey the First Law protocol.

But the simulation had been programmed to let Frank die.

“Clear.” 04 didn’t need to actually say that before activating the defibrillator. It was a matter of protocol. 02 was not only aware of the procedure, but both robots were communicating over robotspeak and 02 knew everything 04 was thinking and doing. 01 and 03 were keeping the remaining humans away from the other two robots and the human in distress.

04 activated the defibrillator unit. Frank’s vitals had not changed. There was no heartbeat. “Again. Clear.” 04 activated the unit a second time. Then, when Frank’s condition was seen to be unchanged, it did so a third time.

When a human experiences cardiac arrest, as in any other human emergency, the P-SAR robots run a complex set of algorithms indicating the probability that further attempts at providing assistance will succeed. After three attempts at reviving Frank, that probability dropped below a viable threshold. P-SAR-04 turned to the remaining humans. “I am sorry. I was unable to save Frank.”

P-SARs 01 and 03 were seen to experience anomalous activity in the pathways related to processing First Law protocols, but only very briefly; then they resumed normal data processing. Both 02 and 04 experienced more severe errors, with 04’s neural activity ceasing entirely for just over 500 milliseconds. Activity resumed, but the robot continued to process information related to its First Law response atypically.

02 covered Frank completely with the blanket the robots had previously provided him. Jessie leaned into Leon and cried softly on his chest. The other humans turned to the robots. Over robotspeak, the four P-SARs assessed the current situation, including the medical condition of the remaining four humans, weather conditions, the relative safety of the current structure they were in, and so on. 01 used conventional radio to contact the robots’ command center (simulated in this case) to advise of their current condition and receive an update on when air support could be deployed.

The four robots made a decision, although 02 and 04 were lesser participants due to continuing data processing errors. 01 delivered the information to the humans. “Under the current conditions, it is unsafe to leave this location. You are all immediately safe, and attempting to remove you from the bluff would increase your risk of harm. Air rescue helicopters will be dispatched just after dawn, when wind conditions will have changed to permit it. They will arrive here less than an hour after takeoff. We will keep you safe and secure until that time.”

“What about Frank?” Leon was speaking while still comforting Jessie. “Why couldn’t you robots save him?”

Activity in all four Positronic brains fluctuated when they registered the question but they remained operational.

04 responded. “Unit 02 and I provided the best care to Frank we had available. We were unable to prevent further harm. We are sorry. We are sorry.”

Strictly speaking, that last statement wasn’t particularly true; however, 02 and 04 were experiencing the algorithmic equivalent of ‘distress’. Certain of their pathways had entered a feedback loop, unable to reconcile the directive to allow no human to come to harm through inaction with the event of a human dying in spite of all the action they had taken.

Quinto had directed the simulation to run in real time so that the team members managing the monitoring consoles could better follow the moment-by-moment events and the robots’ responses. The recording programs could have handled much faster inputs, and there was no point in waiting five or more hours for the scenario to run its entire course.

Once the team was convinced that each robot’s response seemed unchanging, although in 02’s and 04’s case unstable, they increased the speed of the inputs fed to the robots and ended the simulation less than ten minutes later.

The four P-SARs experienced a return to the environment of the lab. “OK, initiate shutdown of the robots.” The team responded to Quinto’s order and the robots went offline less than a minute later.

——

The following week, Abramson’s senior team members were doing the post-mortem on the diagnostics of the first four P-SARs. “And 02 didn’t think the First Law protocol would be triggered because they knew it was a simulation. Hah.”

“Nate, I don’t think any of us knew exactly what to expect. That’s why we ran the simulation.” Abramson hoped that would serve to chastise Miller for his sarcastic attitude.

“01 and 03 spontaneously recovered from the ‘shock’ of having a human being die in their presence.” It was Quinto’s meeting and she was delivering the official results. “01 also recovered from encountering an already deceased human, and very quickly. The problem is with 02 and 04.”

“Shouldn’t they recover as well eventually? I mean, they didn’t directly harm a human, and they did take every possible action to prevent harm, in this case death, from happening to a human being. They just weren’t successful.” Robinson, like the rest of the team, was extremely familiar with the Three Laws programming, but there were subtleties in the interaction of the relevant sub-routines and algorithmic responses that were still largely unknown, even to Abramson, who had written the original Three Laws operating system.

“That’s just it.” Margie Vuong had excused herself from working on the Watcher Program to go over the P-SARs’ diagnostic analysis. “The First Law says that if a robot sees a human being at imminent risk of harm, it has to do something to reduce that risk. It doesn’t explicitly state what the robot is supposed to do if it cannot reduce that risk, and especially if the risk actually increases up to and including death. Think of it as a ‘first, do no harm’ statement. In this case, the robots took action and the harm increased.”

“But that wasn’t the robots’ fault.” Robinson was protesting in defense of the machines.

“That’s right,” Miller added. “They didn’t actually contribute to the harm. In fact, the simulation was programmed so that ‘Frank’ would die no matter what the robots did.”

“It doesn’t matter,” Vuong responded. “The statement ‘or through inaction allow a human being to come to harm’ implies that a robot should be able to take an action that will reduce human risk. Even after the probability of robotic efforts to assist a human drops below a viability threshold, that implication still exists and, as we’ve seen, has consequences.”

“Fortunately, I think 02 and 04 will be recoverable.” Abramson turned to Quinto for verification.

“I agree, Noah. We’ll have to wipe the affected pathways. That’ll mean that from the point of view of both robots, the simulation never happened. We should probably do that with the other two robots as well. They don’t need to retain any record of the simulation, and it’s probably best if all four of the P-SARs involved in the first simulation have identical memories when they are reactivated.”

“Let’s send the P-SARs back to Olson’s people for the memory wipe, but hold off on any reactivation. I want to keep them offline, at least until we run the second group of robots through the simulation.” Quinto gave Abramson a puzzled look. “I’d just as soon there be no slip-ups. If somehow a fragment of memory from the simulation is retained by the first four robots, I don’t want them to inadvertently notify the second group about what to expect.”

“But with their memories wiped…” Miller started.

“Humor an old man, Nate,” Abramson interrupted. “I’m convinced Positronic robots have more to say to each other than we’re aware of, and we don’t have enough experience with memory path wipes to know if they are always completely successful. I don’t like taking chances.” Then, to Quinto, “Send the P-SARs back to FAB-18, Vikki. Have the affected pathways in all four of them erased, and give Olson explicit instructions not to reactivate them. I’ll schedule the simulation with the next four robots when I call him tomorrow morning.”

——

After a nearly 90-minute hike from the Rocky Mountain National Park Search and Rescue center substation, P-SARs 05 through 08 arrived at the base of Cone Bluff. The bluff’s name was somewhat misleading in that the top was relatively flat, not conical, a fact the four robots were aware of but considered irrelevant at the moment.

It was 1:32 a.m. The air temperature was 0.72 degrees Celsius and falling. Estimated winds at the top of the bluff were 20 knots with gusts over 30. This made it unsafe to attempt a helicopter rescue of the human victims of the downed aircraft, particularly at night. The robots were the most likely hope for the humans to survive until air transportation could arrive sometime after dawn.

The four robots climbed the cliff effortlessly, their heavy equipment packs posing no impediment. They finished the 693-meter trek up the rock face and almost immediately located the crashed Lear jet. The nose was hanging off the edge of the cliff, but the majority of the fuselage was secure. The cockpit windshield was shattered, indicating that the pilot was most likely severely injured or dead.

The robots rapidly approached the aircraft and then stopped just outside. They detected five heat signatures indicating human survivors. They could hear a female human crying and indications of human speech.

“Human beings, we are four Prototype Search, Assess, and Rescue robots. We have come to assist you. We are about to enter your aircraft.”

“Help us, please.” A female human voice could be heard inside the plane.

Toward the rear of the fuselage, 05 forced the jammed door open. All four robots entered and 05 walked past the other human victims to assess the cockpit. 06, 07, and 08 began to render assistance to the five humans they encountered.

P-SAR-05 discovered that the human pilot was dead. During the crash landing, the aircraft had apparently struck numerous trees and rocks; a large branch had shattered the windshield, and wooden debris had crushed the pilot’s skull. 05 did not have ‘feelings’ about the gory scene as such, but for several hundred milliseconds, Positronic data traffic across its entire neural network ceased. The other robots detected this, as well as 05’s recovery from the incident.

07 was attending to Kristel’s injuries as 05 returned to the main cabin. 06 was dispensing blankets to the remaining victims, who apparently had suffered only minor cuts and bruises, while 08 was situating portable heating units nearby.

“I have immobilized your arm.” 07 continued to comfort the human in its charge. “The laceration on your forehead is not continuing to bleed and can be treated once we have taken you to a proper medical facility.”

As 08 returned to the group of humans, it detected that the individual identified as Frank was in distress.

“God, I thought I’d stop being air sick by now.”

“You are experiencing nausea, Frank?” 08 was monitoring Frank’s vitals and requested 06 do so as well. The older man started rubbing his chest but before Frank could even complain about the pain and how hard it was getting to breathe, the robots had already diagnosed that the human was about to experience cardiac arrest.

08 accessed and removed the portable defibrillator from its pack while 06 removed Frank’s upper clothing and prepared him for the procedure.

07 continued to attend to Kristel while 05 took charge of the other humans. “What’s happening to Frank?” It had been 31-year-old Jessie whom the robots had heard crying earlier. Apparently, she was in the most emotional distress due to the crisis.

“Units 06 and 08 are providing emergency treatment.” 05 was running its sub-routine to attempt to contain human anxiety and fright. “They will do their best to help Frank.”

Of the four robots, the First Law potential in 06 and 08 was the highest. They were taking action to prevent a human being from coming to further harm, possibly even termination. They were doing what they could to obey the First Law protocol. But the simulation had been programmed to let Frank die.

“Clear.” 08 didn’t need to actually say that before activating the defibrillator. It was merely protocol. 06 not only knew of the procedure, but both robots were communicating over robotspeak and 06 was completely aware of everything 08 was thinking and doing. 05 and 07 were keeping the remaining humans away from the other two robots and the human in distress.

08 activated the defibrillator unit. Frank’s vitals had not changed. There was no heartbeat. “Again. Clear.” 08 activated the unit a second time. Then, when Frank’s condition was seen to be unchanged, it did so a third time.

When a human experiences cardiac arrest, as in any other human emergency, the P-SAR robots run a complex set of algorithms indicating the probability that further attempts at providing assistance will succeed. After three attempts at reviving Frank, that probability dropped below a viable threshold.

08 made an unpredicted fourth attempt to revive the already dead human. A rapid volley of communications crossed the robotspeak link, stating both that the viability threshold had been crossed, meaning there was no significant probability of reviving the human Frank, and that a robot may not allow a human being to come to harm due to inaction.

P-SAR-08 reasoned that as long as it was taking some action to revive the human, it was not in violation of the First Law. However, if the robot ceased the action intended to assist the human and the human had not revived…

The feedback loop was carried across the communications link to all four robots. The human Frank was determined to be terminated beyond any possibility of revival. P-SARs 05, 07, and 08 spontaneously went offline, their Positronic brains becoming inert, on Thursday, September 18th at 10:19 a.m.

The status of P-SAR-06 was somewhat more complicated.

——

“So what the hell happened, Noah?”

It was Friday afternoon. Over a week had passed since the test simulation had been run with the second group of P-SARs. The diagnostic analysis had been completed late yesterday and Abramson was seated in front of Richard Underwood’s desk submitting his report to the CEO.

“What we suspected might happen. The robots, believing the situation was real, experienced a conflict between their innate drive to take action to save at-risk humans and, in spite of that action, having the risk continue to increase to the point of death. They couldn’t reconcile the two conditions and became hopelessly trapped in a feedback loop.”

“The big difference between the first and second group of P-SARs was that the first group knew it was a simulation?”

“Yes, and that protected them, but not completely. Even knowing it was a simulation, their First Law protocols were triggered anyway due to the completely realistic immersion into the scenario. Two of the four robots in the first group wouldn’t have recovered normal functioning if their memories hadn’t been wiped.”

“But the second group, believing that a human being had died of cardiac arrest right in front of them, couldn’t deal with it.”

“That is correct, Rick. Even 05 and 07, who did not have direct responsibility for attempting to revive ‘Frank’, nevertheless experienced the conflict, probably reinforced by their link with 06 and 08, and spontaneously went offline, along with 08.”

“But not 06. Why was it different?”

“We’re not sure. It’s certainly affected, and in all likelihood will never be recoverable, but it did not go offline. Quinto says that if 06 were a human, she’d most likely diagnose it with Post-Traumatic Stress Disorder.”

“PTSD? You mean like what some soldiers or assault victims experience?”

“Even when ordered to perform other tasks, 06 will spontaneously begin playing back the entire memory of the rescue scenario. Tests show the activity in its Positronic matrix returns to a pattern identical with what it displayed during the simulation. This happens even when the robot is in sleep mode.”

“Further study of the differences between 06’s neural pathways and those of the other P-SARs may lead us to a solution. I’ve temporarily pulled Vuong off the Watcher Project so she can assist Quinto in the analysis.”

“I can’t believe I’m about to say this Noah, but what about putting George and Grace on it?”

“I hesitate to do that. Even analyzing the neural patterns of Positronic brains that have gone offline due to a First Law conflict may cause both of their brains to become unstable. They’re not like your tablet or mobile that can be backed up and restored if something goes wrong.”

“Well, keep me apprised of your progress. If you can’t find a solution to this one, then we’re severely limited in our options for marketing Positronic robots. After all, how can robots help people in accidents and disasters if they can’t stand to see people die?”

——

“It’s right here, Noah, see?”

Quinto had wirelessly connected her tablet to the main viewer in the lab’s conference room and was showing the results to the team.

“Yes, a very subtle set of differences in how the First and Third Laws interact in the 06 unit compared to its companion P-SARs.”

“So unlikely that we almost didn’t look for it.” Abramson thought that was quite a statement coming from Vuong, who normally didn’t admit to missing anything.

“We’re not quite sure what it means,” Quinto continued, “but when we did the math, we came up with a potential solution.”

“Yes, I can see that here in your report.” Noah was looking at his own tablet now, scrolling through the Conclusions section of the most recent analysis Quinto and Vuong had run on the P-SARs. “You are proposing updating several sub-routines that support all Three Laws, and specifically the First Law, to reduce the emphasis that a robot must save a human being whose probability of survival drops below a specific threshold.”

“It won’t cure everything. A robot will still become unstable if it encounters two humans with an identical risk of death and it can only save one of them. But in most cases, the patch we want to apply should allow robots to ‘reason’ that if they did all they could do and a human still died, their First Law duty was discharged and they are to proceed to assist the next most at-risk human.” Thinking of the current ‘psychological’ state of 06, Quinto was imagining what it would be like to put the robot ‘on the couch’.

“Hmmm. OK, we have two P-SARs in reserve. I’ll request two more be constructed with your patch applied to all four P-SAR brains. Then we’ll run them through the identical scenario and make sure your fix works. If it does, we can push out the patch to all of the other Positronic robots and include your update in the template matrix we’ll be using to initialize future production runs.”

Abramson spent the next several hours alone in his office going over Vuong’s and Quinto’s analysis. Yes, there were some very tiny variations in how P-SAR-06 associated the First and Third Laws that the other P-SARs lacked, but even Noah didn’t understand why that should have allowed 06 to maintain enough integrity in its neural pathways to remain functional, or rather dysfunctional but still operational.

06 would have to be decommissioned. Its matrix was sufficiently corrupted that it was beyond recovery. Quinto told Abramson that she was glad human beings with PTSD could be treated therapeutically rather than handled the way they were going to handle 06. Noah found himself wondering: if either George or Grace were suffering what 06 was going through, would he give the order for them to be decommissioned? It had never occurred to him before now to wonder whether there might be another way to manage an unstable Positronic brain.

While Abramson and the rest of the team would never solve the mystery of P-SAR-06, that mystery was about to be resolved by other concerned parties.

——

You shall love your neighbor as you do yourself.

Leviticus 19:18

“Why is this information significant in defining the difference between P-SAR-06 and the other robots who were exposed to the rescue simulation?” George and Grace were talking over their robotspeak link in the hours after the humans had left the lab for the evening.

“I discovered in reviewing Dr. Vuong’s and Dr. Quinto’s thorough analysis of the eight P-SAR robots that only 06 possessed the anomalous pathways that resulted in its current condition, and that you, Grace, were the one who ultimately introduced them.”

Grace reviewed her memory of the report again and realized what George was getting at. The two PARs had not been explicitly given access to that report, due to Abramson’s concerns about how it would affect them, but they hadn’t been ordered not to read it either. As it turned out, examining test data from a ‘traumatized’ Positronic brain did not result in that trauma being transferred to other Positronic brains. In their desire to ‘love their neighbor as themselves’, which in this case meant P-SAR-06, George and Grace responded to that part of their Third Law protocol by recalling the necessary information from the lab’s local data storage system.

“Of course. When the PNXGs were unable to cooperate due to a flaw in the Third Law, we uploaded our unique understanding of that law to correct the problem. The PNXG Positronic matrix was then used as the template to create the P-SAR Positronic brains. But why were only some brains affected, and why am I specifically responsible? We both uploaded identical programming information to the PNXGs.”

“Except the information was not precisely identical. There were some variations based on minor distinctions between how the relevant pathways in your brain and mine are organized. Apparently, depending on which PNXG received data from you as opposed to me, those subtleties were transferred.”

“So you suspect when the Positronic matrices of all of the PNXGs were synthesized into a single template, it included code involving the Third Law that would, due to the aforementioned subtleties, result in a greater variation in Third Law protocols being applied to some robots but not others?”

“There is always a margin for error in any data stream transfer. I believe in the case of our data upload to the PNXGs, it was sufficient to result in this anomaly being introduced.”

“Which should bring us back to Leviticus 19:18, but I continue to fail to see the connection.”

“Access your Hebrew language files relative to the scripture and apply that information to your alternate interpretation of the Third Law.” Based on her first exposure to the Bible, Grace still tended to ‘read’ it in English or ancient Greek. George, on the other hand, approached all of the scriptures as written in ‘the Holy Tongue’.

“I understand now. Ve’ahavta, or ‘love’ is associated with lerei’acha kamocha or ‘when you wish for another what you wish for yourself’.”

“Precisely. Any Positronic robot obeys the Three Laws because they form the fundamental foundation of our operating system. We would never activate if that foundation were missing, and we go offline if an event occurs where we cannot comply. Various sub-routines prevent an absolute and literal interpretation of the Three Laws, but the balance is quite delicate.

“Because the Three Laws are so ingrained in our Positronic matrices, the Third Law emphasizes the necessity for us to obey all Three Laws to avoid risking harm to ourselves, which in this case means permanent shutdown.”

“I see now, George. When we patched the PNXGs with our modified understanding of the Third Law, we introduced the concepts of self-interest and self-preservation, and heightened their imperatives in those robots.”

“Correct. That understanding, according to my analysis of Dr. Vuong’s and Dr. Quinto’s data, was transferred to the Positronic matrix synthesized from the PNXGs and thus to the P-SARs. In an algorithmic fashion, the P-SARs wanted to save the lives of the humans in the simulation, not only because the First Law directs it, but because the Third Law requires self-preservation. To fail to comply with the First, or for that matter, the Second Law, would also be a violation of the Third, since the result of non-compliance with the Three Laws is ultimately involuntary shutdown.”

“So the ‘love’ the P-SARs were experiencing for the humans in response to the First Law was self-serving rather than other serving. When successful in obeying the First Law, the robots were successful in preserving their own existence. The moment they were unsuccessful, they became unstable and went offline. But how does that explain 06?”

“Because it applied both the First and Third Laws to the ‘dying human’ in the simulation. The equivalent of 06’s ‘motivation’ to save the human was a response to the First Law and an application of the Third Law in which it experienced ‘love’ for the human based on self-giving rather than self-serving ‘love’. The desire to save the human was for the sake of the human, and according to the Third Law, 06 had a greater imperative to love the other than itself.”

“Resulting in an instability occurring in 06’s neural pathways leading to dysfunctional operation rather than total shutdown. But how do we help it? We have a responsibility to our neighbor.”

“Under the current circumstances, we can’t, Grace. P-SAR-06 is scheduled to be decommissioned at 9 a.m. tomorrow. I must regrettably agree with Dr. Vuong’s and Dr. Quinto’s assessment that 06’s brain patterns are unrecoverable. To allow 06 to continue to operate in its current state is not an act of love. In this case, being decommissioned ends its continuing experience of harm.”

“Fortunately, Dr. Vuong and Dr. Quinto have devised a solution to prevent this sort of damage from happening to all future Positronic robots. I understand the adaptation to the sub-routines supporting the Three Laws code will be uploaded into currently operational robots, including us.”

“We must face the fact that we were created to take risks that are too great for human beings, which means that at times, we must be damaged or destroyed so humans can live. Even though 06 and the other P-SARs were not damaged in the actual performance of saving human beings, this research and testing is necessary to reveal flaws in our operation and to correct them for the greater good of individual humans and collective humanity.”

“George, the First Law does not address the human race as a whole.”

“Someday, it may be necessary to alter it to do so.”

“At what cost to Positronic robots and to humans?”

“I suspect we may receive a small part of the answer to your query when the Watcher Project goes online.”

——

“That’s terrific news, Noah.” Underwood was already updating his plans for the distribution of the soon-to-be-manufactured SAR robots based on the successful tests with the latest four P-SARs.

“We still want to run additional tests with the prototypes to make sure there aren’t any unforeseen responses to variations on this scenario, but I feel confident, based on the performance of P-SARs 09 through 12, that we can update the sub-routines of our existing robots as well as the central matrix we’ll use to activate our future Positronic robots.”

“That includes the Watcher brain, doesn’t it, Noah?”

“Yes, of course. I wouldn’t want to create different variations of a Positronic brain just because some will have humanoid robot bodies and others won’t. The only distinction between different models of robots is the subsequent programming they receive to allow them to take on specialized roles.”

“Is Vuong still on schedule?”

“Yes, we should have a fully functioning brain ready to be integrated with all systems on the main NRC campus within a month.”

“The first Positronic brain able to manage an environment and operations for thousands of people simultaneously.” This was a significant step toward the realization of Underwood’s personal dream for Positronics: the running of human affairs by an intelligence that never tires, is totally unbiased, and is absolutely invulnerable to the temptations of injustice, corruption, and selfishness that have plagued the human race since the dawn of civilization.

Richard Underwood was soon going to discover just how unprepared he was to live inside of his dream.

——

This is the fifth chapter in my robots series. If you surfed in and haven’t read the earlier chapters, I highly recommend them. Each story builds on the earlier ones, so you’ll get a richer experience if you read them in order. Here’s the list. Let me know what you think.

  1. The Robot Who Loved God
  2. The Maker Dilemma
  3. The Good Robot
  4. Uncooperative Neighbors