Baley said, “What ought I to call you, boy?”
“I am Giskard, sir.”
“R. Giskard?”
“If you wish, sir.”
“Do you have a library on this ship?”
“Yes, sir.”
“Can you get me book-films on Aurora?”
“What kind, sir?”
“Histories—political science—geographies—anything that will let me know about the planet.”
“Yes, sir.”
“And a viewer.”
“Yes, sir.”
The robot left through the double door and Baley nodded grimly to himself. On his trip to Solaria, it had never occurred to him to spend the useless time crossing space in learning something useful. He had come along a bit in the last two years.
He tried the door the robot had just passed through. It was locked and utterly without give. He would have been enormously surprised at anything else.
He investigated the room. There was a hyperwave screen. He handled the controls idly, received a blast of music, managed to lower the volume eventually, and listened with disapproval. Tinkly and discordant. The instruments of the orchestra seemed vaguely distorted.
He touched other contacts and finally managed to change the view. What he saw was a space-soccer game that was played, obviously, under conditions of zero-gravity. The ball flew in straight lines and the players (too many of them on each side—with fins on backs, elbows, and knees that must serve to control movement) soared in graceful sweeps. The unusual movements made Baley feel dizzy. He leaned forward and had just found and used the off-switch when he heard the door open behind him.
He turned and, because he thoroughly expected to see R. Giskard, he was aware at first only of someone who was not R. Giskard. It took a blink or two to realize that he saw a thoroughly human shape, with a broad, high-cheekboned face and with short, bronze hair lying flatly backward, someone dressed in clothing with a conservative cut and color scheme.
“Jehoshaphat!” said Baley in a nearly strangled voice.
“Partner Elijah,” said the other, stepping forward, a small grave smile on his face.
“Daneel!” cried Baley, throwing his arms around the robot and hugging tightly. “Daneel!”
7
Baley continued to hold Daneel, the one unexpected familiar object on the ship, the one strong link to the past. He clung to Daneel in a gush of relief and affection.
And then, little by little, he collected his thoughts and knew that he was hugging not Daneel but R. Daneel—Robot Daneel Olivaw. He was hugging a robot and the robot was holding him lightly, allowing himself to be hugged, judging that the action gave pleasure to a human being and enduring that action because the positronic potentials of his brain made it impossible to repel the embrace and so cause disappointment and embarrassment to the human being.
The insurmountable First Law of Robotics states: “A robot may not injure a human being”—and to repel a friendly gesture would do injury.
Slowly, so as to reveal no sign of his own chagrin, Baley released his hold. He even gave each upper arm of the robot a final squeeze, so that there might seem to be no shame to the release.
“Haven’t seen you, Daneel,” said Baley, “since you brought that ship to Earth with the two mathematicians. Remember?”
“Of a certainty, Partner Elijah. It is a pleasure to see you.”
“You feel emotion, do you?” said Baley lightly.
“I cannot say what I feel in any human sense, Partner Elijah. I can say, however, that the sight of you seems to make my thoughts flow more easily, and the gravitational pull on my body seems to assault my senses with lesser insistence, and that there are other changes I can identify. I imagine that what I sense corresponds in a rough way to what it is that you may sense when you feel pleasure.”
Baley nodded. “Whatever it is you sense when you see me, old partner, that makes it seem preferable to the state in which you are when you don’t see me, suits me well—if you follow my meaning. But how is it you are here?”
“Giskard Reventlov, having reported you—” R. Daneel paused.
“Purified?” asked Baley sardonically.
“Disinfected,” said R. Daneel. “I felt it appropriate to enter then.”
“Surely you would not fear infection otherwise?”
“Not at all, Partner Elijah, but others on the ship might then be reluctant to have me approach them. The people of Aurora are sensitive to the chance of infection, sometimes to a point beyond a rational estimate of the probabilities.”
“I understand, but I wasn’t asking why you were here at this moment. I meant why are you here at all?”
“Dr. Fastolfe, of whose establishment I am part, directed me to board the ship that had been sent to pick you up for several reasons. He felt it desirable that you have one immediate item of the known in what he was certain would be a difficult mission for you.”
“That was a kindly thought on his part. I thank him.”
R. Daneel bowed gravely in acknowledgment. “Dr. Fastolfe also felt that the meeting would give me”—the robot paused—“appropriate sensations—”
“Pleasure, you mean, Daneel.”
“Since I am permitted to use the term, yes. And as a third reason—and the most important—”
The door opened again at that point and R. Giskard walked in.
Baley’s head turned toward it and he felt a surge of displeasure. There was no mistaking R. Giskard as a robot and its presence emphasized, somehow, the robotism of Daneel (R. Daneel, Baley suddenly thought again), even though Daneel was far the superior of the two. Baley didn’t want the robotism of Daneel emphasized; he didn’t want himself humiliated for his inability to regard Daneel as anything but a human being with a somewhat stilted way with the language.
He said impatiently, “What is it, boy?”
R. Giskard said, “I have brought the book-films you wished to see, sir, and the viewer.”
“Well, put them down. Put them down.—And you needn’t stay. Daneel will be here with me.”
“Yes, sir.” The robot’s eyes—faintly glowing, Baley noticed, as Daneel’s were not—turned briefly to R. Daneel, as though seeking orders from a superior being.
R. Daneel said quietly, “It will be appropriate, friend Giskard, to remain just outside the door.”
“I shall, friend Daneel,” said R. Giskard.
It left and Baley said with some discontent, “Why does it have to stay just outside the door? Am I a prisoner?”
“In the sense,” said R. Daneel, “that it would not be permitted for you to mingle with the ship’s company in the course of this voyage, I regret to be forced to say you are indeed a prisoner. Yet that is not the reason for the presence of Giskard.—And I should tell you at this point that it might well be advisable, Partner Elijah, if you did not address Giskard—or any robot—as boy.”
Baley frowned. “Does it resent the expression?”
“Giskard does not resent any action of a human being. It is simply that ‘boy’ is not a customary term of address for robots on Aurora and it would be inadvisable to create friction with the Aurorans by unintentionally stressing your place of origin through habits of speech that are nonessential.”
“How do I address it, then?”
“As you address me, by the use of his accepted identifying name. That is, after all, merely a sound indicating the particular person you are addressing—and why should one sound be preferable to another? It is merely a matter of convention. And it is also the custom on Aurora to refer to a robot as ‘he’—or sometimes ‘she’—rather than as ‘it.’ Then, too, it is not the custom on Aurora to use the initial ‘R.’ except under formal conditions where the entire name of the robot is appropriate and even then the initial is nowadays often left out.”
“In that case—Daneel,” (Baley repressed the sudden impulse to say “R. Daneel”) “how do you distinguish between robots and human beings?”
“The distinction is usually self-evident, Partner Elijah. There would seem to be no need to emphasize it unnecessarily. At least that is the Auroran view and, since you have asked Giskard for films on Aurora, I assume you wish to familiarize yourself with things Auroran as an aid to the task you have undertaken.”
“The task which has been dumped on me, yes. And what if the distinction between robot and human being is not self-evident, Daneel? As in your case?”
“Then why make the distinction, unless the situation is such that it is essential to make it?”
Baley took a deep breath. It was going to be difficult to adjust to this Auroran pretense that robots did not exist. He said, “But then, if Giskard is not here to keep me prisoner, why is it—he—outside the door?”
“That is according to the instructions of Dr. Fastolfe, Partner Elijah. Giskard is to protect you.”
“Protect me? Against what?—Or against whom?”
“Dr. Fastolfe was not precise on that point, Partner Elijah. Still, as human passions are running high over the matter of Jander Panell—”
“Jander Panell?”
“The robot whose usefulness was terminated.”
“The robot, in other words, who was killed?”
“Killed, Partner Elijah, is a term that is usually applied to human beings.”
“But on Aurora distinctions between robots and human beings are avoided, are they not?”
“So they are! Nevertheless, the possibility of distinction or lack of distinction in the particular case of the ending of functioning has never arisen to my knowledge. I do not know what the rules are.”
Baley pondered the matter. It was a point of no real importance, purely a matter of semantics. Still, he wanted to probe the manner of thinking of the Aurorans. He would get nowhere otherwise.
He said slowly, “A human being who is functioning is alive. If that life is violently ended by the deliberate action of another human being, we call that ‘murder’ or ‘homicide.’ ‘Murder’ is, somehow, the stronger word. To be witness, suddenly, to an attempted violent end to the life of a human being, one would shout ‘Murder!’ It is not at all likely that one would shout ‘Homicide!’ It is the more formal word, the less emotional word.”
R. Daneel said, “I do not understand the distinction you are making, Partner Elijah. Since ‘murder’ and ‘homicide’ are both used to represent the violent ending of the life of a human being, the two words must be interchangeable. Where, then, is the distinction?”
“Of the two words, one screamed out will more effectively chill the blood of a human being than the other will, Daneel.”
“Why is that?”
“Connotations and associations; the subtle effect, not of dictionary meaning, but of years of usage; the—nature of the sentences and conditions and events in which one has experienced the use of one word as compared with that of the other.”
“There is nothing of this in my programming,” said Daneel, with a curious sound of helplessness hovering over the apparent lack of emotion with which he said this (the same lack of emotion with which he said everything).
Baley said, “Will you accept my word for it, Daneel?”
Quickly, Daneel said, almost as though he had just been presented with the solution to a puzzle, “Without doubt.”
“Now, then, we might say that a robot that is functioning is alive,” said Baley. “Many might refuse to broaden the word so far, but we are free to devise definitions to suit ourselves if it is useful. It is easy to treat a functioning robot as alive and it would be unnecessarily complicated to try to invent a new word for the condition or to avoid the use of the familiar one. You are alive, for instance, Daneel, aren’t you?”
Daneel said, slowly and with emphasis, “I am functioning!”
“Come. If a squirrel is alive, or a bug, or a tree, or a blade of grass, why not you? I would never remember to say—or to think—that I am alive but that you are merely functioning, especially if I am to live for a while on Aurora, where I am to try not, to make unnecessary distinctions between a robot and myself. Therefore, I tell you that we are both alive and I ask you to take my word for it.”
“I will do so, Partner Elijah.”
“And yet can we say that the ending of robotic life by the deliberate violent action of a human being is also ‘murder’? We might hesitate. If the crime is the same, the punishment should be the same, but would that be right? If the punishment of the murder of a human being is death, should one actually execute a human being who puts an end to a robot?”
“The punishment of a murderer is psychic-probing, Partner Elijah, followed by the construction of a new personality. It is the personal structure of the mind that has committed the crime, not the life of the body.”
“And what is the punishment on Aurora for putting a violent end to the functioning of a robot?”
“I do not know, Partner Elijah. Such an incident has never occurred on Aurora, as far as I know.”
“I suspect the punishment would not be psychic-probing,” said Baley. “How about ‘roboticide’?”
“Roboticide?”
“As the term used to describe the killing of a robot.”
Daneel said, “But what about the verb derived from the noun, Partner Elijah? One never says ‘to homicide’ and it would therefore not be proper to say ‘to roboticide.’”
“You’re right. You would have to say ‘to murder’ in each case.”
“But murder applies specifically to human beings. One does not murder an animal, for instance.”
Baley said, “True. And one does not murder even a human being by accident, only by deliberate intent. The more general term is ‘to kill.’ That applies to accidental death as well as to deliberate murder—and it applies to animals as well as human beings. Even a tree may be killed by disease, so why may not a robot be killed, eh, Daneel?”
“Human beings and other animals and plants as well, Partner Elijah, are all living things,” said Daneel. “A robot is a human artifact, as much as this viewer is. An artifact is ‘destroyed,’ ‘damaged,’ ‘demolished,’ and so on. It is never ‘killed.’”
“Nevertheless, Daneel, I shall say ‘killed.’ Jander Panell was killed.”
Daneel said, “Why should a difference in a word make any difference to the thing described?”
“That which we call a rose by any other name would smell as sweet. Is that it, Daneel?”
Daneel paused, then said, “I am not certain what is meant by the smell of a rose, but if a rose on Earth is the common flower that is called a rose on Aurora, and if by its ‘smell’ you mean a property that can be detected, sensed, or measured by human beings, then surely calling a rose by another sound combination and holding all else equal—would not affect the smell or any other of its intrinsic properties.”
“True. And yet, changes in name do result in changes in perception, where human beings are concerned.”
“I do not see why, Partner Elijah.”
“Because human beings are often illogical, Daneel. It is not an admirable characteristic.”
Baley sank deeper into his chair and fiddled with his viewer, allowing his mind, for a few minutes, to retreat into private thought. The discussion with Daneel was useful in itself, for while Baley played with the question of words, he managed to forget that he was in space, to forget that the ship was moving forward until it was far enough from the mass centers of the Solar System to make the Jump through hyperspace; to forget that he would soon be several million kilometers from Earth and, not long after that, several light-years from Earth.
More important, there were positive conclusions to be drawn. It was clear that Daneel’s talk about Aurorans making no distinction between robots and human beings was misleading. The Aurorans might virtuously remove the initial “R.,” the use of “boy” as a form of address, and the use of “it” as the customary pronoun, but from Daneel’s resistance to the use of the same word for the violent ends of a robot and of a
human being (a resistance inherent in his programming which was, in turn, the natural consequence of Auroran assumptions about how Daneel ought to behave) one had to conclude that these were merely superficial changes. In essence, Aurorans were as firm as Earthmen in their belief that robots were machines that were infinitely inferior to human beings.
That meant that his formidable task of finding a useful resolution of the crisis (if that were possible at all) would not be hampered by at least one particular misperception of Auroran society.
Baley wondered if he ought to question Giskard, in order to confirm the conclusions he reached from his conversation with Daneel—and, without much hesitation, decided not to. Giskard’s simple and rather unsubtle mind would be of no use. He would “Yes, sir” and “No, sir” to the end. It would be like questioning a recording.
Well, then, Baley decided, he would continue with Daneel, who was at least capable of responding with something approaching subtlety.
He said, “Daneel, let us consider the case of Jander Panell, which I assume, from what you have said so far, is the first case of roboticide in the history of Aurora. The human being responsible—the killer—is, I take it, not known.”
“If,” said Daneel, “one assumes that a human being was responsible, then his identity is not known. In that, you are right, Partner Elijah.”
“What about the motive? Why was Jander Panell killed?”
“That, too, is not known.”
“But Jander Panell was a humaniform robot, one like yourself and not one like, for instance, R. Gis—I mean, Giskard.”
“That is so. Jander was a humaniform robot like myself.”
“Might it not be, then, that no case of roboticide was intended?”
“I do not understand, Partner Elijah.”
Baley said, a little impatiently, “Might not the killer have thought this Jander was a human being, that the intention was homicide, not roboticide?”
Slowly, Daneel shook his head. “Humaniform robots are quite like human beings in appearance, Partner Elijah, down to the hairs and pores in our skin. Our voices are thoroughly natural, we can go through the motions of eating, and so on. And yet, in our behavior there are noticeable differences. There may be fewer such differences with time and with refinement of technique, but as yet they are many. You—and other Earthmen not used to humaniform robots—may not easily note these differences, but Aurorans would. No Auroran would mistake Jander—or me—for a human being, not for a moment.”