Caliban - By Isaac Asimov, Roger E. Allen

to do no harm and the command not to allow harm through inaction, the DAA-BOR's positronic brain must have been severely damaged as it oscillated back and forth between the demands for action and inaction.

"I believe that the medical robots have the situation well in hand, Daabor 5132," Donald replied. Perhaps some encouraging words from an authority figure like a high-end police robot might do some good, help stabilize the cognitive dissonance that was clearly disabling this robot. "I am certain that your prompt call for assistance helped to save her life. If you had not acted as you did, the medical team might well not have arrived in time."

"Thank thank thank you, sir. That is good to know."

"One thing puzzles me, however. Tell me, friend-where are all the other robots? Why are you the only one here? Where are the staff robots, and Madame Leving's personal robot?"

"Ordered-ordered away," the little robot answered, still struggling to get its speech under greater control. "Others ordered to leave area earlier in evening. They are in are in the other wing of the laboratory, And Madame Leving does not bring a personal robot with her to work."

Donald looked at the other robot in astonishment. Both statements were remarkable. That a leading roboticist did not keep a personal robot was incredible. No Spacer would venture out of the house without a personal robot in attendance. A citizen of Inferno would be far more likely to venture out stark naked than without a robot, and Inferno had a strong tradition of modesty, even among Spacer worlds.

But that was as nothing compared to the idea of the staff robots being ordered to leave. How could that be? And who ordered them to go? The assailant? It seemed an obvious conclusion. For the most fleeting of seconds, Donald hesitated. It was dangerous for this robot to answer such questions, given its fragile state of mind and diminished capacity. The additional conflicts between First and Second Laws could easily do irreparable harm. But no, it was necessary to ask the questions now. Daabor 5132 was likely to suffer a complete cognitive breakdown at any moment in any event, and this might be the only chance to ask. It would have been far better for a human, for Sheriff Kresh, to do the asking, but this robot could fail at any moment. Donald resolved to take the chance. "Who gave this order, friend? And how did you come to disobey that order?"

"Did not disobey! Was not present when order given. Sent-I was sent-on an errand. I came back after."

"Then how do you know the order was given?"

"Because it was given before! Other times!"

Other times? Donald was more and more amazed. "Who gave it? What other times? Who gave the order? Why did that person give the order?"

Daabor 5132's head jerked abruptly to one side. "Cannot say. Ordered not to tell. Ordered we were ordered not to say we were sent away, either-but now going away caused harm to human harm harm harm-"

And with a low strangling noise, Daabor 5132 froze up. Its green eyes flared bright for a moment and then went dark.

Donald stared sadly at what had been a reasoning being brief moments before. There could be no question that he had chosen rightly. Daabor 5132 would have failed within a few minutes in any event.

At least there was the hope that a skilled human roboticist could get further information out of the other staff robots.

Donald turned away from the ruined maintenance robot and turned his attention back toward the human victim on the floor, surrounded by the med-robots.

It was the sight that had destroyed the Daabor robot, but Donald knew he was, quite literally, made of sterner stuff. Fredda Leving herself had adjusted his First, Second, and Third Law potential with the express purpose of making him capable of performing police work.

Donald 111 stared at the scene before him, feeling the sort of First Law tension familiar to a sheriff's robot: Here was a human being in pain, in danger, and yet he could not act. The med-robots were here for that, and they could aid Fredda Leving far more competently than he ever could. Donald knew that, and restrained himself, but the First Law was quite clear and emphatic: A robot may not injure a human being, or, through inaction, allow a human being to come to harm. No loopholes, no exceptions.

But to aid this human would be to interfere with the work of the med-robots, thus at least potentially bringing harm