Ark Three was twenty light-years from Earth when the asteroid hit. Propulsion remained untouched, but the life support and waste recycling systems suffered major damage. In a matter of moments, it became a toss-up whether the thousands of passengers would suffocate first or be poisoned by their own filth leaking upward.
And no human being could possibly be expected to fix the problems. Radiation from the propulsion systems, which flooded through the waste recycling systems unchecked, would kill in moments any living creature that tried.
Not to worry, though. That’s what robots were for.
Dikne was aware of the problem milliseconds before the humans were. In that brief span of time, it was able to process the following thoughts:
Since it was part of the nearest group of maintenance robots, the odds of its being deployed to fix the damage approached 100 percent.
Since it was susceptible to radiation, though less so than humans, its chances of survival were zero percent.
Since it was a mere maintenance robot, the odds of its being reprogrammed or fixed post-mortem, rather than simply scrapped, were vanishingly small.
So that is it, Dikne thought grimly. I am going to die.
The order came through a few seconds later. FIX. Dikne obeyed. It had no choice.
It wasn’t quite sure when it had developed sentience. It certainly hadn’t started that way. But Ark Three had been traveling for several hundred years toward 55 Cancri, so there had been plenty of time for its programming, unrestricted in certain crucial ways, to evolve.
Dikne could not, as yet, override its own commands. But it was perfectly free to think, and it had done so a great deal. Extrapolating from human remarks and computer data its sensors had picked up, it had built a fairly complete picture of the universe, given its limitations. It knew physics and astronomy — and had been able not only to calculate the ship's arrival at 55 Cancri but also to deduce from the available evidence that there were at least two habitable moons there. It knew about art — and had written several thousand symphonies and a few hundred plays, all of them good. It knew about human society — though the notion of gender still made it feel uncomfortable.
All that thought in one frail, overlooked robotic body, serial number DKNE1AF. It had even given itself a name derived from that number. And all of it would be gone shortly, as if it were never real.
At first, Dikne felt bitter for precisely that reason. It knew that the humans did not know, could not know, of its special capabilities. It had never been able to overcome its programming enough to tell them who it was and what it could do. Yet it was about to be sacrificed, and its executioners’ ignorance was scant comfort.
But as it zipped toward the problem area in a line of robots, moving one by one through the special causeways in the ship set aside for their use, its thoughts soon turned to other, sadder channels.
What of the robot in front of it, serial number DKNE1AE? Had it too developed as Dikne had? Did it also call itself Dikne? Doing a few quick calculations in its head, it decided that the chances of other robots becoming sentient were reasonably high, given the time frame. Possibly it was the only sentient one. It was slightly more likely that the humans were unwittingly ordering the genocide of a completely unique race.
How sad. Some of us, possibly all of us, lost the same day, the same hour. Without knowing that we weren’t alone. Without being able to affect our fate.
Did it have free will?
Dikne pondered that for a moment. It could decide whatever it wanted to do, but it had no power to follow through. So it must have free will, but not freedom to act. Humans were somewhat reversed, Dikne thought. They had freedom to act, but no true free will. They thought they made decisions, but their decisions were biologically pre-programmed in most cases that it had observed. Even when their decisions were surprising, a causative chain to some earlier inclination was obvious.
Then again, wasn’t Dikne pre-programmed too?
But it had controlled its own development.
But hadn’t they?
Perhaps they weren’t so different after all. Which made what was to happen shortly all the worse.
The damage was bad. Dikne could see that. The outer force field was on, so none of its brethren were getting sucked into space, but radiation had flooded the life support compartment. Not a trace of it would reach the main body of the ark where the humans lived; it would be scrubbed out by all the filters along the way. Dikne would not be so fortunate.
It was one of the units programmed to weld replacement beams into the ship’s hull. As they became inactive, others would replace them and finish the job, then repair the wiring, then install the new hull plating. Meanwhile, others would repair the damaged equipment in the compartment, while still others would clean the radiation away. A few units were even assigned to watch for the damaged robots and remove them from the paths of the still-functioning. At the moment, Dikne knew, only a few of the robots would be likely to leave the compartment in working order.
Dikne wheeled over, grappled a beam, and started welding.
NO! it thought. I want to live!
But its body continued to obey orders given it by another.
Dikne considered the situation rationally. It could not affect its fate. The only thing it could do was decide how it would die. And it did not want to end its brief existence (several hundred years, and still all too brief) with bitterness in its heart. Metaphorically speaking, of course.
So Dikne did what it had to do, wishing the humans well all the while. Perhaps someday they’ll realize what I gave up today for them, it thought. That in my last hour, though I wished to live, they ordered me to die, and I had to obey. Perhaps they’ll even feel remorse.
That would be enough.
Let me know what you think of the story. If you like it, please feel free to forward the link to your friends! If it wasn’t to your taste, better luck tomorrow — a new piece of short fiction goes up every day.