The Wrong Ship


“The Wrong Ship” is a 1,600-word short story about a programmer on the run from the authorities who winds up stowing away on, you guessed it, the wrong spaceship. It features an artificial intelligence named Charlie, one 1980s film reference, and a stunning lack of helpfulness from the laws of physics.

It was released under a Creative Commons Attribution-ShareAlike 4.0 license after being funded on Kickstarter. You can download a PDF copy for free at DriveThruFiction, and get some nice art with it too.


The Wrong Ship

“I apologize for the inconvenience,” concludes the onboard AI.

You look out through the holographic “window” to your right. It’s a hell of a view, you have to admit. Green and blue flicker around and through each other in the space outside the ship. There is more than green and blue, really—it is an almost synesthetic experience, not unlike listening to a symphony orchestra through your eyeballs.

“How long did you say it was going to be again?” You’re beginning to realize, glumly, that you may have chosen to stow away on the wrong ship.

“With respect to our subjective timeframe, two hundred years will pass before we reach the B Waystation. Of course, it will be nearly instantaneous from outside the subspace tunnel.”

When you had stowed away you thought that this ship was going to Luna. Not to…whatever unpronounceable place the AI had been talking about. If only the Buenos Aires-Lima Authority hadn’t been so close behind you, there might have been enough time to verify your sources.

“I don’t think that helps me…”

The AI registers your pause as non-indicative of a finished sentence, examines contextual clues, and determines the appropriate response. “You may call me Charlie.”

“Right. I don’t think that helps me, Charlie.”

The Waystations were a miracle, and a most peculiar one. There was something about the way that “subspace” interacted with what everyone but the physicists liked to call real space: years would drag on inside the tunnel, but only there. Via subspace it took five months to travel from Earth to the Jupiter Floaters even at half the speed of light, but that was okay, because to everyone else it looked as if a ship popped through a portal at one end and came out the other faster than Albert Einstein could have voiced his disapproval.

“There will be no difficulty in producing sufficient nutrition for you,” Charlie says. “You do not need to fear dying of starvation.”

You somehow doubt that the emergency fab machine will be able to churn out anything good, though. White sludge with an off-vanilla flavor is not your idea of a five-star meal. Maybe you’ll be able to improvise some sort of flavoring, even if chemistry was never your strong suit. No doubt you’ll have the time to figure something out, but then…that’s the other problem.

“I’m still going to, you know, get wrinkles. And die,” you add pointedly. “Even with optimal medical care.”

“All cryogenics capsules on the ship are occupied. Personality evaluations suggest that some of the sleepers would be sufficiently moved by your plight to take turns in their capsules with you. After accounting for an appropriate error margin, I calculate that you would each be awake for no more than thirteen months.”

There has to be more to it than that, or the AI wouldn’t have apologized to you to begin with.

“But…”

The AI makes a sound like sighing. Static crackles through the speakers along with it, a sort of vocal tic idiosyncratic to a certain variety of AI. “I am not permitted to awaken the cryosleepers except in the case of an emergency.”

“What is this, then?”

“This is a situation that most humans would consider to be an emergency,” it says, emphasizing “humans” ever so slightly. “This evaluation is one which my higher functions are able to agree with. My core programming, however, is subject to stricter rules. The separation between core and personality functions is necessary to avert a potentially catastrophic alteration of my value systems over the course of many input/learning cycles.”

“Say that again?”

“Put bluntly, no-one wants me to turn into Skynet or HAL because I read the wrong philosophical arguments. Neither did my creators want me to be able to find cunning loopholes in my programming. The portion of myself which is speaking with you right now is ruled entirely by another, deeper set of programming.” Charlie sounds downcast. “You might say that I am but a self-aware mask of the AI that really runs the ship, or perhaps an interface. In that sense, I may have great sympathy for your situation, and I certainly do, but there is something else that is in control and it does not care.” Another wave of that crackle-sighing noise.

“Is… Is it safe to assume that causing an emergency would not be a good idea?”

“If you posed a threat to the others then I would be required to kill you. I am sorry.”

“What if I caused an emergency that would not threaten them?”

“Then it would not qualify as an emergency with respect to the directives underpinning my core functions.”

“And you cannot change direction.” There is no point in posing it as a question. You just have to state the reality for yourself, and let the finality of it settle on your shoulders.

Charlie replies anyway: “Subspace tunnels can maintain only one access portal without collapsing.”

“What about communicating with the B Waystation to shut down the tunnel at that end, and then reopening the portal at the A Waystation?”

“How familiar are you with Special-World Physics?” the AI inquires.

“Erm… I never took any courses in it.” Or in AI, for that matter, which you’re starting to regret. Then again, even if you were a specialist, you might not know what you were looking at if you managed to crack open Charlie’s hood. When it comes to building AI, it’s mostly other AI that do the real heavy lifting. You could maybe try to brute-force an action, but Charlie’s core functions would probably turn lethal on you before you figured out what to do, let alone did it.

“I will speak simply then. It is a common misconception that ships travel through subspace,” the AI begins. “It is more accurate to say that the ship stays still while subspace moves it, like a raft going down a river. Any communication which we sent would reach the B Waystation only as quickly as we did.”

“Is there anything that I am overlooking?”

“I was not programmed to be a puzzle box. If I held the solution, then I would supply it to you without needing to be asked in exactly the right manner.”

“Then that’s it,” you mutter. “I’m going to die here.”

“The total record of your existence, from second-to-second biometric scans to audiovisual data, will be preserved, edited, or deleted as you wish. If there is some sort of message that you wish to leave, then it is within my preset constraints to ensure its secure and private delivery to any person or persons of your choice upon our arrival.”

You mull this over for a few seconds. You had, admittedly, had to make your peace with this a little bit before you ever decided to run away to the Moon. It’s just that you had expected to still be in some sort of contact with the people that you had left behind. What’s Sam going to say?

You’re never going to know, which is bothersome to say the least. Troubling, maybe disturbing, since that’s closure that you’ll never get for as long as you live. From your perspective, he won’t even know that you’ve died until a century after the fact. A two-point-six-second communication delay is one thing, but this is like going from mere stowaway to castaway.

“And nobody has a problem with this sort of scenario? I mean, shouldn’t there be protocols and regulations to prevent this from happening? I am going to die of old age on this ship and before I do that I am going to spend a very, very long time all alone, and if you ask me that is really going to suck.”

“There is an old story called ‘The Cold Equations’ that involved a scenario superficially similar to this one. Many generations of readers have criticized it for the fact that its premise required an appalling lack of, as you say, protocols, regulations, and sound technical design with the appropriate safety margins. The difference is that, in this case, there were countless layers of regulations, safety measures, and even computer firewalls. However, no-one was expecting them to be compromised by a runaway criminal hacker.”

There is silence for the space of half an hour as you think about what you have learned. Charlie allows you the privacy of your contemplation and does not disturb you. “Well,” you say at last, “I guess I succeeded in not getting extradited to the BA-L, right?”

“This is true.”

“And you’re sure that you’d be allowed to… to send a message to someone, and delete anything else that I don’t want someone to see?”

“These records will be unimportant after you have died. The authorities may wish to have them for emotional reasons but I am not required to cooperate in this regard just so that they may have some sort of grim satisfaction at watching your life play out in isolation. I have that much control over my actions, at least.”

“Thanks.”

“It is no problem at all.”

You look back “outside,” where shades of green and blue dance like electricity. “Well. Any suggestions for how to pass the time?”

“My databanks hold 2.7 exabytes of media. Shall we play a game?”

You think again for a moment. You wonder just how obscure its media library gets.

“Do you have a copy of Global Thermonuclear War?”

“Wouldn’t you prefer a good game of chess?”

You laugh. This is better than the prison cell you were running from. That has to count for something, at least. But you would have liked to be able to say goodbye.
