“By all means, go on.”
“The robot, according to the tale, could read minds. And when asked questions, he read the questioner's mind and told the questioner what he wanted to hear. Now the First Law of Robotics states quite clearly that a robot may not injure a human being or, through inaction, allow a person to come to harm, but to robots generally that means physical harm. A robot who can read minds, however, would surely decide that disappointment or anger or any violent emotion would make the human being feeling those emotions unhappy and the robot would interpret the inspiring of such emotions under the heading of ‘harm.’ If, then, a telepathic robot knew that the truth might disappoint or enrage a questioner or cause that person to feel envy or unhappiness, he would tell a pleasing lie, instead. Do you see that?”
“Yes, of course.”
“So the robot lied even to Susan Calvin herself. The lies could not long continue, for different people were told different things that were not only inconsistent among themselves but unsupported by the gathering evidence of reality. You see, Susan Calvin discovered she had been lied to and realized that those lies had led her into a position of considerable embarrassment. What would have disappointed her somewhat to begin with had now, thanks to false hopes, disappointed her unbearably. —You never heard the story?”
“I give you my word.”
“Astonishing! Yet it certainly wasn't invented on Aurora, for it is equally current on all the worlds. —In any case, Calvin took her revenge. She pointed out to the robot that, whether he told the truth or told a lie, he would equally harm the person with whom he dealt. He could not obey the First Law, whatever action he took. The robot, understanding this, was forced to take refuge in total inaction. If you want to put it colorfully, his positronic pathways burned out. His brain was irrecoverably destroyed. The legend goes on to say that Calvin's last word to the destroyed robot was ‘Liar!’ ”
Baley said, “And something like this, I take it, was what happened to Jander Panell. He was faced with a contradiction in terms and his brain burned out?”
“It's what appears to have happened, though that is not as easy to bring about as it would have been in Susan Calvin's day. Possibly because of the legend, roboticists have always been careful to make it as difficult as possible for contradictions to arise. As the theory of positronic brains has grown more subtle and as the practice of positronic brain design has grown more intricate, increasingly successful systems have been devised to have all situations that might arise resolve into non-equality, so that some action can always be taken that will be interpreted as obeying the First Law.”
“Well, then, you can't burn out a robot's brain. Is that what you're saying? Because if you are, what happened to Jander?”
“It's not what I'm saying. The increasingly successful systems I speak of are never completely successful. They cannot be. No matter how subtle and intricate a brain might be, there is always some way of setting up a contradiction. That is a fundamental truth of mathematics. It will remain forever impossible to produce a brain so subtle and intricate as to reduce the chance of contradiction to zero. Never quite to zero. However, the systems have been made so close to zero that to bring about a mental freeze-out by setting up a suitable contradiction would require a deep understanding of the particular positronic brain being dealt with—and that would take a clever theoretician.”
“Such as yourself, Dr. Fastolfe?”
“Such as myself. In the case of humaniform robots, only myself.”
“Or no one at all,” said Baley, heavily ironic.
“Or no one at all. Precisely,” said Fastolfe, ignoring the irony. “The humaniform robots have brains—and, I might add, bodies—constructed in conscious imitation of the human being. The positronic brains are extraordinarily delicate and they take on some of the fragility of the human brain, naturally. Just as a human being may have a stroke, through some chance event within the brain and without the intervention of any external effect, so a humaniform brain might, through chance alone—the occasional aimless drifting of positrons—go into mental freeze.”
“Can you prove that, Dr. Fastolfe?”
“I can demonstrate it mathematically, but of those who could follow the mathematics, not all would agree that the reasoning was valid. It involves certain suppositions of my own that do not fit into the accepted modes of thinking in robotics.”
“And how likely is spontaneous mental freeze-out?”
“Given a large number of humaniform robots, say a hundred thousand, there is an even chance that one of them might undergo spontaneous mental freeze-out in an average Auroran lifetime. And yet it could happen much sooner, as it did to Jander, although then the odds would be very greatly against it.”
“But look here, Dr. Fastolfe, even if you were to prove conclusively that a spontaneous mental freeze-out could take place in robots generally, that would not be the same as proving that such a thing happened to Jander in particular at this particular time.”
“No,” admitted Fastolfe, “you are quite right.”
“You, the greatest expert in robotics, cannot prove it in the specific case of Jander.”
“Again, you are quite right.”
“Then what do you expect me to be able to do, when I know nothing of robotics?”
“There is no need to prove anything. It would surely be sufficient to present an ingenious suggestion that would make spontaneous mental freeze-out plausible to the general public.”
“Such as—”
“I don't know.”
Baley said harshly, “Are you sure you don't know, Dr. Fastolfe?”
“What do you mean? I have just said I don't know.”
“Let me point out something. I assume that Aurorans, generally, know that I have come to the planet for the purpose of tackling this problem. It would be difficult to manage to get me here secretly, considering that I am an Earthman and this is Aurora.”
“Yes, certainly, and I made no attempt to do that. I consulted the Chairman of the Legislature and persuaded him to grant me permission to bring you here. It is how I've managed to win a stay of judgment. You are to be given a chance to solve the mystery before I go on trial. I doubt that they'll give me a very long stay.”
“I repeat, then—Aurorans, in general, know I'm here and I imagine they know precisely why I am here—that I am supposed to solve the puzzle of the death of Jander.”
“Of course. What other reason could there be?”
“And from the time I boarded the ship that brought me here, you have kept me under close and constant guard because of the danger that your enemies might try to eliminate me—judging me to be some sort of wonderman who just might solve the puzzle in such a way as to place you on the winning side, even though all the odds are against me.”
“I fear that as a possibility, yes.”
“And suppose someone who does not want to see the puzzle solved and you, Dr. Fastolfe, exonerated should actually succeed in killing me. Might that not swing sentiment in your favor? Might not people reason that your enemies felt you were, in actual fact, innocent or they would not fear the investigation so much that they would want to kill me?”
“Rather complicated reasoning, Mr. Baley. I suppose that, properly exploited, your death might be used to such a purpose, but it's not going to happen. You are being protected and you will not be killed.”
“But why protect me, Dr. Fastolfe? Why not let them kill me and use my death as a way of winning?”
“Because I would rather you remained alive and succeeded in actually demonstrating my innocence.”
Baley said, “But surely you know that I can't demonstrate your innocence.”
“Perhaps you can. You have every incentive. The welfare of Earth hangs on your doing so and, as you have told me, your own career.”
“What good is incentive? If you ordered me to fly by flapping my arms and told me further that if I failed, I would be promptly killed by slow torture and that Earth would be blown up and all its population destroyed, I would have enormous incentive to flap my arms and fly—and yet still be unable to do so.”
Fastolfe said uneasily, “I know the chances are small.”
“You know they are nonexistent,” said Baley violently, “and that only my death can save you.”
“Then I will not be saved, for I am seeing to it that my enemies cannot reach you.”
“But you can reach me.”
“What?”
“I have the thought in my head, Dr. Fastolfe, that you yourself might kill me in such a way as to make it appear that your enemies have done the deed. You would then use my death against them—and that that is why you have brought me to Aurora.”
For a moment, Fastolfe looked at Baley with a kind of mild surprise and then, in an excess of passion both sudden and extreme, his face reddened and twisted into a snarl. Sweeping up the spicer from the table, he raised it high and brought his arm down to hurl it at Baley.
And Baley, caught utterly by surprise, barely managed to cringe back against his chair.
5. DANEEL AND GISKARD
18
If Fastolfe had acted quickly, Daneel had reacted far more quickly still.
To Baley, who had all but forgotten Daneel's existence, there seemed a vague rush, a confused sound, and then Daneel was standing to one side of Fastolfe, holding the spicer, and saying, “I trust, Dr. Fastolfe, that I did not in any way hurt you.”
Baley noted, in a dazed sort of way, that Giskard was not far from Fastolfe on the other side and that every one of the four robots at the far wall had advanced almost to the dining room table.
Panting slightly, Fastolfe, his hair quite disheveled, said, “No, Daneel. You did very well, indeed.” He raised his voice. “You all did well, but remember, you must allow nothing to slow you down, even my own involvement.”
He laughed softly and took his seat once more, straightening his hair with his hand.
“I'm sorry,” he said, “to have startled you so, Mr. Baley, but I felt the demonstration might be more convincing than any words of mine would have been.”
Baley, whose moment of cringing had been purely a matter of reflex, loosened his collar and said, with a touch of hoarseness, “I'm afraid I expected words, but I agree the demonstration was convincing. I'm glad that Daneel was close enough to disarm you.”
“Any one of them was close enough to disarm me, but Daneel was the closest and got to me first. He got to me quickly enough to be gentle about it. Had he been farther away, he might have had to wrench my arm or even knock me out.”
“Would he have gone that far?”
“Mr. Baley,” said Fastolfe. “I have given instructions for your protection and I know how to give instructions. They would not have hesitated to save you, even if the alternative was harm to me. They would, of course, have labored to inflict minimum harm, as Daneel did. All he harmed was my dignity and the neatness of my hair. And my fingers tingle a bit.” Fastolfe flexed them ruefully.
Baley drew a deep breath, trying to recover from that short period of confusion. He said, “Would not Daneel have protected me even without your specific instruction?”
“Undoubtedly. He would have had to. You must not think, however, that robotic response is a simple yes or no, up or down, in or out. It is a mistake the layman often makes. There is the matter of speed of response. My instructions with regard to you were so phrased that the potential built up within the robots of my establishment, including Daneel, is abnormally high, as high as I can reasonably make it, in fact. The response, therefore, to a clear and present danger to you is extraordinarily rapid. I knew it would be and it was for that reason that I could strike out at you as rapidly as I did—knowing I could give you a most convincing demonstration of my inability to harm you.”
“Yes, but I don't entirely thank you for it.”
“Oh, I was entirely confident in my robots, especially Daneel. It did occur to me, though, a little too late, that if I had not instantly released the spicer, he might, quite against his will—or the robotic equivalent of will—have broken my wrist.”
Baley said, “It occurs to me that it was a foolish risk for you to have undertaken.”
“It occurs to me, as well—after the fact. Now if you had prepared yourself to hurl the spicer at me, Daneel would have at once countered your move, but not with quite the same speed, for he has received no special instructions as to my safety. I can hope he would have been fast enough to save me, but I'm not sure—and I would prefer not to test that matter.” Fastolfe smiled genially.
Baley said, “What if some explosive device were dropped on the house from some airborne vehicle?”
“Or if a gamma beam were trained upon us from a neighboring hilltop. —My robots do not represent infinite protection, but such radical terrorist attempts are unlikely in the extreme here on Aurora. I suggest we do not worry about them.”
“I am willing not to worry about them. Indeed, I did not seriously suspect that you were a danger to me, Dr. Fastolfe, but I needed to eliminate the possibility altogether if I were to continue. We can now proceed.”
Fastolfe said, “Yes, we can. Despite this additional and very dramatic distraction, we still face the problem of proving that Jander's mental freeze-out was spontaneous chance.”
But Baley had been made aware of Daneel's presence and he now turned to him and said uneasily, “Daneel, does it pain you that we discuss this matter?”
Daneel, who had deposited the spicer on one of the farther of the empty tables, said, “Partner Elijah, I would prefer that past-friend Jander were still operational, but since he is not and since he cannot be restored to proper functioning, the best of what is left is that action be taken to prevent similar incidents in the future. Since the discussion now has that end in view, it pleases rather than pains me.”
“Well, then, just to settle another matter, Daneel, do you believe that Dr. Fastolfe is responsible for the end of your fellow-robot Jander? —You'll pardon my inquiring, Dr. Fastolfe?”
Fastolfe gestured his approval and Daneel said, “Dr. Fastolfe has stated that he was not responsible, so he, of course, was not.”
“You have no doubts on the matter, Daneel?”
“None, Partner Elijah.”
Fastolfe seemed a little amused. “You are cross-examining a robot, Mr. Baley.”
“I know that, but I cannot quite think of Daneel as a robot and so I have asked.”
“His answers would have no standing before any Board of Inquiry. He is compelled to believe me by his positronic potentials.”
“I am not a Board of Inquiry, Dr. Fastolfe, and I am clearing out the underbrush. Let me go back to where I was. Either you burned out Jander's brain or it happened by random circumstance. You assure me that I cannot prove random circumstance and that leaves me only with the task of disproving any action by you. In other words, if I can show that it is impossible for you to have killed Jander, we are left with random circumstance as the only alternative.”
“And how can you do that?”
“It is a matter of means, opportunity, and motive. You had the means of killing Jander—the theoretical ability to so manipulate him that he would end in a mental freeze-out. But did you have the opportunity? He was your robot, in that you designed his brain paths and supervised his construction, but was he in your actual possession at the time of the mental freeze-out?”
“No, as a matter of fact. He was in the possession of another.”
“For how long?”
“About eight months—or a little over half of one of your years.”
“Ah. It's an interesting point. Were you with him—or near him—at the time of his destruction? Could you have reached him? In short, can we demonstrate the fact that you were so far from him—or so out of touch with him— that it is not reasonable to suppose that you could have done the deed at the time it is supposed to have been done?”
Fastolfe said, “That, I'm afraid, is impossible. There is a rather broad interval of time during which the deed might have been done. There are no robotic changes after destruction equivalent to rigor mortis or decay in a human being. We can only say that, at a certain time, Jander was known to be in operation and, at a certain other time, he was known not to be in operation. Between the two was a stretch of about eight hours. For that period, I have no alibi.”
“None? During that time, Dr. Fastolfe, what were you doing?”
“I was here in my establishment.”
“Your robots, surely, were aware that you were here and could, perhaps, bear witness.”
“They were certainly aware, but they cannot bear witness in any legal sense and on that day Fanya was off on business of her own.”
“Does Fanya share your knowledge of robotics, by the way?”
Fastolfe indulged in a wry smile. “She knows less than you do. —Besides, none of this matters.”
“Why not?”
Fastolfe's patience was clearly beginning to stretch to the cracking point. “My dear Mr. Baley, this was not a matter of close-range physical assault, such as my recent pretended attack on you. What happened to Jander did not require my physical presence. As it happens, although not actually in my establishment, Jander was not far away geographically, but it wouldn't have mattered if he were on the other side of Aurora. I could always reach him electronically and could, by the orders I gave him and the responses I could educe, send him into mental freeze-out. The crucial step would not even necessarily require much in the way of time—”
Baley said at once, “It's a short process, then, one that someone else might move through by chance, while intending something perfectly routine?”
“No!” said Fastolfe. “For Aurora's sake, Earthman, let me talk. I've already told you that's not the case. Inducing mental freeze-out in Jander would be a long and complicated and tortuous process, requiring the greatest understanding and wit, and could be done by no one accidentally, without incredible and long-continued coincidence. There would be far less chance of accidental progress over that enormously complex route than of spontaneous mental freeze-out, if my mathematical reasoning were only accepted.