(Part 8 of 8 in "Three Worlds Collide")
Fire came to Huygens.
The star erupted.
Stranded ships, filled with children doomed by a last second's delay, still milled around the former Earth transit point. Too many doomed ships, far too many doomed ships. They should have left a minute early, just to be sure; but the temptation to load in that one last child must have been irresistible. To do the warm and fuzzy thing just this one time, instead of being cold and calculating. You couldn't blame them, could you...?
Yes, actually, you could.
The Lady Sensory switched off the display. It was too painful.
On the Huygens market, the price of a certain contract spiked to 100%. They were all rich in completely worthless assets for the next nine minutes, until the supernova blast front arrived.
"So," the Lord Pilot finally said. "What kind of asset retains its value in a market with nine minutes to live?"
"Booze for immediate delivery," the Master of Fandom said promptly. "That's what you call a -"
"Liquidity preference," the others chorused.
The Master laughed. "All right, that was too obvious. Well... chocolate, sex -"
"Not necessarily," said the Lord Pilot. "If you can use up the whole supply of chocolate at once, does demand outstrip supply? Same with sex - the value could actually drop if everyone's suddenly willing. Not to mention: Nine minutes?"
"All right then, expert oral sex from experienced providers. And hard drugs with dangerous side effects; the demand would rise hugely relative to supply -"
"This is inane," the Ship's Engineer commented.
The Master of Fandom shrugged. "What do you say in the unrecorded last minutes of your life that is not inane?"
"It doesn't matter," said the Lady Sensory. Her face was strangely tranquil. "Nothing that we do now matters. We won't have to live with the consequences. No one will. All this time will be obliterated when the blast front hits. The role I've always played, the picture that I have of me... it doesn't matter. There's... a peace... in not having to be Dalia Ancromein any more."
The others looked at her. Talk about killing the mood.
"Well," the Master of Fandom said, "since you raise the subject, I suppose it would be peaceful if not for the screaming terror."
"You don't have to feel the screaming terror," the Lady Sensory said. "That's just a picture you have in your head of how it should be. The role of someone facing imminent death. But I don't have to play any more roles. I don't have to feel screaming terror. I don't have to frantically pack in a few last moments of fun. There are no more obligations."
"Ah," the Master of Fandom said, "so I guess this is when we find out who we really are." He paused for a moment, then shrugged. "I don't seem to be anyone in particular. Oh well."
The Lady Sensory stood up, and walked across the room to where the Lord Pilot stood looking at the viewscreen.
"My Lord Pilot," the Lady Sensory said.
"Yes?" the Lord Pilot said. His face was expectant.
The Lady Sensory smiled. It was bizarre, but not frightening. "Do you know, my Lord Pilot, that I had often thought how wonderful it would be to kick you very hard in the testicles?"
"Um," the Lord Pilot said. His arms and legs suddenly tensed, preparing to block.
"But now that I could do it," the Lady Sensory said, "I find that I don't really want to. It seems... that I'm not as awful a person as I thought." She gave a brief sigh. "I wish that I had realized it earlier."
The Lord Pilot's hand swiftly darted out and groped the Lady Sensory's breast. It was so unexpected that no one had time to react, least of all her. "Well, what do you know," the Pilot said, "I'm just as much of a pervert as I thought. My self-estimate was more accurate than yours, nyah nyah -"
The Lady Sensory kneed him in the groin, hard enough to drop him moaning to the floor, but not hard enough to require medical attention.
"Okay," the Master of Fandom said, "can we please not go down this road? I'd like to die with at least some dignity."
There was a long, awkward silence, broken only by a quiet "Ow ow ow ow..."
"Would you like to hear something amusing?" asked the Kiritsugu, who had once been a Confessor.
"If you're going to ask that question," said the Master of Fandom, "when the answer is obviously yes, thus wasting a few more seconds -"
"Back in the ancient days that none of you can imagine, when I was seventeen years old - which was underage even then - I stalked an underage girl through the streets, slashed her with a knife until she couldn't stand up, and then had sex with her before she died. It was probably even worse than you're imagining. And deep down, in my very core, I enjoyed every minute."
Silence.
"I don't think of it often, mind you. It's been a long time, and I've taken a lot of intelligence-enhancing drugs since then. But still - I was just thinking that maybe what I'm doing now finally makes up for that."
"Um," said the Ship's Engineer. "What we just did, in fact, was kill fifteen billion people."
"Yes," said the Kiritsugu, "that's the amusing part."
Silence.
"It seems to me," mused the Master of Fandom, "that I should feel a lot worse about that than I actually do."
"We're in shock," the Lady Sensory observed distantly. "It'll hit us in about half an hour, I expect."
"I think it's starting to hit me," the Ship's Engineer said. His face was twisted. "I - I was so worried I wouldn't be able to destroy my home planet, that I didn't get around to feeling unhappy about succeeding until now. It... hurts."
"I'm mostly just numb," the Lord Pilot said from the floor. "Well, except down there, unfortunately." He slowly sat up, wincing. "But there was this absolute unalterable thing inside me, screaming so loud that it overrode everything. I never knew there was a place like that within me. There wasn't room for anything else until humanity was safe. And now my brain is worn out. So I'm just numb."
"Once upon a time," said the Kiritsugu, "there were people who dropped a U-235 fission bomb, on a place called Hiroshima. They killed perhaps seventy thousand people, and ended a war. And if the good and decent officer who pressed that button had needed to walk up to a man, a woman, a child, and slit their throats one at a time, he would have broken long before he killed seventy thousand people."
Someone made a choking noise, as if trying to cough out something that had suddenly lodged deep in their throat.
"But pressing a button is different," the Kiritsugu said. "You don't see the results, then. Stabbing someone with a knife has an impact on you. The first time, anyway. Shooting someone with a gun is easier. Being a few meters further away makes a surprising difference. Only needing to pull a trigger changes it a lot. As for pressing a button on a spaceship - that's the easiest of all. Then the part about 'sixteen billion' just gets flushed away. And more importantly - you think it was the right thing to do. The noble, the moral, the honorable thing to do. For the safety of your tribe. You're proud of it -"
"Are you saying," the Lord Pilot said, "that it was not the right thing to do?"
"No," the Kiritsugu said. "I'm saying that, right or wrong, the belief is all it takes."
"I see," said the Master of Fandom. "So you can kill billions of people without feeling much, so long as you do it by pressing a button, and you're sure it's the right thing to do. That's human nature." The Master of Fandom nodded. "What a valuable and important lesson. I shall remember it all the rest of my life."
"Why are you saying all these things?" the Lord Pilot asked the Kiritsugu.
The Kiritsugu shrugged. "When I have no reason left to do anything, I am someone who tells the truth."
"It's wrong," said the Ship's Engineer in a small, hoarse voice, "I know it's wrong, but - I keep wishing the supernova would hurry up and get here."
"There's no reason for you to hurt," said the Lady Sensory in a strange calm voice. "Just ask the Kiritsugu to stun you. You'll never wake up."
"...no."
"Why not?" asked the Lady Sensory, in a tone of purely abstract curiosity.
The Ship's Engineer clenched his hands into fists. "Because if hurting is that much of a crime, then the Superhappies are right." He looked at the Lady Sensory. "You're wrong, my lady. These moments are as real as every other moment of our lives. The supernova can't make them not exist." His voice lowered. "That's what my cortex says. My diencephalon wishes we'd been closer to the sun."
"It could be worse," observed the Lord Pilot. "You could not hurt."
"For myself," the Kiritsugu said quietly, "I had already visualized and accepted this, and then it was just a question of watching it play out." He sighed. "The most dangerous truth a Confessor knows is that the rules of society are just consensual hallucinations. Choosing to wake up from the dream means choosing to end your life. I knew that when I stunned Akon, even apart from the supernova."
"Okay, look," said the Master of Fandom, "call me a gloomy moomy, but does anyone have something uplifting to say?"
The Lord Pilot jerked a thumb at the expanding supernova blast front, a hundred seconds away. "What, about that?"
"Yeah," the Master of Fandom said. "I'd like to end my life on an up note."
"We saved the human species," offered the Lord Pilot. "Man, that's the sort of thing you could just repeat to yourself over and over and over again -"
"Besides that."
"Besides WHAT?"
The Master managed to hold a straight face for a few seconds, and then had to laugh.
"You know," the Kiritsugu said, "I don't think there's anyone in modern-day humanity, who would regard my past self as anything but a poor, abused victim. I'm pretty sure my mother drank during pregnancy, which, back then, would give your child something called Fetal Alcohol Syndrome. I was poor, uneducated, and in an environment so entrepreneurially hostile you can't even imagine it -"
"This is not sounding uplifting," the Master said.
"But somehow," the Kiritsugu said, "all those wonderful excuses - I could never quite believe in them myself, afterward. Maybe because I'd also thought of some of the same excuses before. It's the part about not doing anything that got to me. Others fought the war to save the world, far over my head. Lightning flickering in the clouds high above me, while I hid in the basement and suffered out the storm. And by the time I was rescued and healed and educated, in any shape to help others - the battle was essentially over. Knowing that I'd been a victim for someone else to save, one more point in someone else's high score - that just stuck in my craw, all those years..."
"...anyway," the Kiritsugu said, and there was a small, slight smile on that ancient face, "I feel better now."
"So does that mean," asked the Master, "that now your life is finally complete, and you can die without any regrets?"
The Kiritsugu looked startled for a moment. Then he threw back his head and laughed. True, pure, honest laughter. The others began to laugh as well, and their shared hilarity echoed across the room, as the supernova blast front approached at almost exactly the speed of light.
Finally the Kiritsugu stopped laughing, and said:
"Don't be ridicu-"
Eliezer, I didn't find your explicit arguments persuasive, nor even clear enough to be worth an explicit response. The fact that you yourself were persuaded of your conclusion by fiction does not raise my estimate of its quality. I don't think readers should much let down their guard against communication modes where sneaky persuasion is more feasible simply because the author has made some more explicit arguments elsewhere. I understand your temptation to use such means to persuade given that there are readers who have let down their guard. But I can only approve of that if I think your conclusions are worth such pushing.
Posted by: Robin Hanson | February 07, 2009 at 12:57 PM
Robin, it looks to me like we diverged at an earlier point in the argument than that. As far as I can tell, you're still working in something like a mode of moral realism/externalism (you asked whether the goodness of human values was "luck"). If this is the case, then the basic rules of argument I adopted will sound to you like mere appeals to intuition. I'm not sure what I could do about this - except, maybe, trying to rewrite and condense and simplify my writing on metaethics. It is not clear to me what other mode of argument you thought I could have adopted. So far as I know, trying to get people to see for themselves what their implicit rightness-function returns on various scenarios, is all there is and all there can possibly be.
Posted by: Eliezer Yudkowsky | February 07, 2009 at 01:12 PM
Robin: Well, to be fair, it was written well enough that lots of us are arguing about who was actually right.
I.e., several of us seem to be taking the side that the Normal Ending was the better one, at least compared to the True Ending.
So it's not as if we were manipulated into taking the position that Eliezer seemed to be advocating.
Posted by: Psy-Kosh | February 07, 2009 at 01:13 PM
Also - explaining complex beliefs is a fantastically difficult enterprise. To consider fiction as just a way of "sneaking things past the unwary reader's guard" is selling it far short. It verges, I fear, on too much respect for respectability.
I tried to make the Superhappy position look as appealing as possible - show just how strange our position would look to someone who didn't have it, ask whether human children might be innocent victims, depict the human objections as overcomplicated departures from rationality. Of course I wanted my readers to feel Akon's helplessness. But to call that "sneaking things past someone's guard" is a little unfair to the possibilities of fiction as a vehicle for philosophy, I should think.
I'm still conflicted and worried about the ethics of writing fiction as a way of persuading people of anything, but conflict implies almost-balance; you don't seem to think that there's much in the way of benefit.
Posted by: Eliezer Yudkowsky | February 07, 2009 at 01:27 PM
Wei Dai, sure, I too could invent numbers large enough to make any calculation give me the result I want. But as it stands, I think that 2^(10^20) is impractically huge for any universe, especially one that seems to be based on our own. Also, I think it's hard to imagine the starline topology being completely random across the whole universe. So I stand by my stance that it is difficult to imagine a universe where the superhappies do not come across the humans again in a relatively short period of time. And I figure the point about fiction is that you're trying to convince your readers of a consistent world for the story to take place in.
Posted by: Nicholas "Indy" Ray | February 07, 2009 at 03:04 PM
EY, but you are a moral realist (or at least a moral objectivist, which ought to refer to the same thing). There's a fact about what's right, just like there's a fact about what's prime or what's baby-eating. It's a fact about the universe, independent of what anyone has to say about it. If we were human' we'd be moral' realists talking about what's right'. ne?
Posted by: Thom Blake | February 07, 2009 at 03:08 PM
Eliezer, academic philosophy offers exemplary formats and styles for low-sneak ways to argue about values.
Posted by: Robin Hanson | February 07, 2009 at 03:41 PM
Robin:
Eliezer's philosophy of fun, and how it relates to the human value system, was not grounded enough to be intelligible?
Eliezer:
According to which criterion is it balanced? How likely is it that extremely positive and strong feedback balances the negative side effects of manipulating people? Did you just justify the use of fiction by feeling conflicted about it?
As a side note, I'll comment that so far I have no reason to expect your story to be an excellent introduction to your ideas. It does show several ideas well, but readers not familiar with your writing will easily miss a lot; notice that you've used a lot of insider-speak. I have no reason to expect it to be terrible, either.
Thom Blake:
Whatever is true of human rightness might not look much like anything individual humans value, but it ought to address it all somehow. I'd like to hear whether there is a reason to expect human rightness to be in some sense coherent, and if there is, I'd like to understand in what sense. I don't remember off the top of my head any posts addressing this.
Posted by: Anonymous | February 07, 2009 at 05:19 PM
Robin, what is your favorite piece of academic philosophy that argues about values?
Nicholas, our own universe may have an infinite volume, and it's only the speed of light that limits the size of the observable universe. Given that infinite universes are not considered implausible, and starlines are not considered implausible (at least as a fictional device), I find it surprising that you consider starlines that randomly connect a region of size 2^(10^20) to be implausible.
Starlines have to have an average distance of something, right? Why not 2^(10^20)?
Posted by: Wei Dai | February 07, 2009 at 05:43 PM
@Anon.
"if there is a reason to expect human rightness to be in some sense coherent"
Alas, there probably is not. Sir Isaiah Berlin speaks powerfully and beautifully of this so-called value pluralism in his book Liberty.
There are several ironies - if not outright tragedies - of life and this is one: that we don't want what we want to want, and that the things we think we ought to want often conflict with each other as well as our underlying motives. We are not in charge of ourselves and we are mysterious to our own hearts. Men and women are conflicted and, due to evolution, conflict.
Posted by: frelkins | February 07, 2009 at 07:31 PM
I don't think I see how moral-philosophy fiction is problematic at all. When you have a beautiful moral sentiment that you need to offer to the world, of course you bind it up in a glorious work of high art, and let the work stand as your offering. That makes sense. When you have some info you want to share with the world about some dull ordinary thing that actually exists, that's when you write a journal article. When you've got something to protect, something you need to say, some set of notions that you really are entitled to, then you write a novel.
Just as it is dishonest to fail to be objective in matters of fact, so it is dishonest to feign objectivity where there simply is no fact. Why pretend to make arguments when what you really want to write is a hymn?
Posted by: Z. M. Davis | February 07, 2009 at 09:06 PM
Wei Dai, except that, traditionally speaking, an infinitely massive universe is generally considered implausible by the greater scientific community.
But I think the greater matter is that even if it were physically possible, it's impossible to mentally reason about as a reader of good fiction, and thus it can break the internal consistency of an otherwise good story in the mind of the reader.
Thanks,
Indy
Posted by: Nicholas | February 08, 2009 at 07:41 PM
The story specifically asks a question that none of the commenters have addressed yet.
"So," the Lord Pilot finally said. "What kind of asset retains its value in a market with nine minutes to live?"
My answer: Music.
If your world is going to end in nine minutes, you might as well play some music while you wait for the inevitable.
Short story collections, perhaps? If you've never read, say, "The Last Question", it would be your last chance. (And if you're reading this now, and you haven't read "The Last Question" yet, then something has gone seriously wrong in your life.)
Posted by: Doug S. | February 09, 2009 at 12:28 AM
Neh. Eliezer, I'm kind of disappointed by how you write the tragic ending ("saving" humans) as if it's the happy one, and the happy ending (civilization melting pot) as if it's the tragic one. I'm not sure what to make of that.
Do you really, actually believe that, in this fictional scenario, the human race is better off sacrificing a part of itself in order to avoid blending with the super-happies?
It just blows my mind that you can write an intriguing story like this, and yet draw that kind of conclusion.
Posted by: denis bider | February 12, 2009 at 02:54 PM
This was terribly interesting, I'll be re-reading it in a few days to see what I can pick up that I missed the first time through.
I'm not so sure we can so easily label these two endings the good and bad endings. In one, humanity (or at least what humanity evolved into) goes along with the superhappies, and in the other, they do not. Certainly, going along with the superhappies is not a good solution. We give up much of what we consider to be vital to our identity, and in return, the superhappies make their spaceships look nice. Now, the superhappies are also modifying themselves, arguably just as much as humanity is, but even if they lose (their perspective) as much as we lose (our perspective), we don't gain (our perspective) as much as we lose.
The true ending is about resisting the transformation. But they seem to accept an... unintuitive tradeoff while doing so. They trade the lives of a few billion humans for the ability to let the superhappies do to the babyeaters exactly what they intended to do to the humans. In fact, unless I missed something, I don't think taking this trade was ever even questioned; it just seemed that taking the trade, and sacrificing those people so that the babyeaters would be transformed exactly as humanity would have been, was common sense to them. Now, I can see how the characters in the story could perceive this choice as righteous and moral, but it seems to me to be just another tragic ending, in a different flavor: a tragedy due to a massive failure of humanity's morals, rather than a tragedy due to the loss of pain & suffering for humanity.
As an aside, the construction of the two (three?) alien species, with their traits, culture, and thought processes was superb.
Posted by: Zargon | February 12, 2009 at 04:13 PM
We all have our personal approaching supernova blast fronts in T minus...
In the intervening time, everyone with a powerful enough mind, please consider engaging in scientific research that has the potential to change the human condition. Don't waste your time on the human culture. It's not worth it - yet.
Posted by: Supernova Blast Front Rider | February 17, 2009 at 03:53 PM
Sorry I'm late... is no one curious, given the age of the universe, why the three races are so close technologically? (Great filter? Super-advanced races have a prime directive? Simulated experiment? Novas only happen if three connecting stars recently had their first gates be gates out? ...)
Eliezer, if you're reading this, amazing story. I'm worried, though, about your responses to so many commenters (generally smarter & more rational than most humans) whose preferences and values differ widely from what you see as right in this story and the fun sequence. I'm not saying your values are wrong; I'm saying you seem to have very optimistic models/estimates of where many human value systems go when fed lots of knowledge/rationality/good arguments. If so, I hope CEV doesn't depend on it.
If your model is causing you to be constantly surprised, then...
Posted by: Michael Howard | February 17, 2009 at 07:06 PM
Well, I re-read it, and now neither ending seems so tragic anymore. I now think that there is utility in transforming the babyeaters that I didn't see before.
That said, the way they went about their supernova operation seems illogical, particularly the part about giving them 3 hours and 41 minutes. I would imagine they decided on that amount of time by weighing the chance of the superhappies showing up as more time passes, times the disutility of them stopping the operation, against the number of humans who would be killed by the supernova as more time passes, and choosing the optimal time.
It seems like relatively few humans are able to escape prior to around the 8-hour mark. Given that the superhappies gave no indication of when, if ever, they would follow (before their operation with the babyeaters was finished), the best times to blow up the star would be either immediately - if the chance of the superhappies showing up is judged to be high, or the disutility of transforming all the humans is high (relatively speaking) - or after waiting 8 hours and saving most of the people on the planet. Waiting about half that time seems to be accepting a significant risk that the superhappies would show up, for not much gain, while waiting another 4 hours seems to be about the same risk again, for a much larger gain.
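To make that concrete, here is a toy version of the calculation I have in mind. Every number in it is invented purely for illustration - the story supplies none of them:

    # Toy expected-utility model for choosing when to detonate the star.
    # All inputs are hypothetical; only the shape of the tradeoff matters.

    def people_saved(t_hours):
        # Assume only a trickle of ships escapes before the ~8-hour mark,
        # after which the bulk of the planetary evacuation completes.
        if t_hours < 8.0:
            return 1e6 * t_hours
        return 1e6 * 8.0 + 5e9

    def p_superhappies_arrive(t_hours):
        # Assume a constant arrival rate, so the risk compounds while waiting.
        rate_per_hour = 0.05  # hypothetical
        return 1.0 - (1.0 - rate_per_hour) ** t_hours

    # Weigh "everyone gets transformed" as if it were this many deaths (hypothetical).
    DISUTILITY_OF_TRANSFORMATION = 8e9

    def expected_utility(t_hours):
        p = p_superhappies_arrive(t_hours)
        return (1.0 - p) * people_saved(t_hours) - p * DISUTILITY_OF_TRANSFORMATION

    for t in range(13):
        print(f"wait {t:2d} h: EU = {expected_utility(t):+.3e}")

Under these made-up numbers the optimum is at the 8-hour mark, and raising rate_per_hour flips it to "detonate immediately"; an intermediate wait like 4 hours is never optimal, which is exactly the pattern I described.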
Still though, a very good story. I expect I'll continue to stretch my mind now and then contemplating it.
Posted by: Zargon | February 24, 2009 at 06:31 PM
Zargon, I think the time given was how long it would take from beginning the feedback loop to the actual supernova, and they began the process the moment they arrived. If they could have destroyed the star immediately, they would have done so, but with the delay they encouraged as many people as possible to flee.
At least, that's how it sounded to me.
Posted by: a soulless automaton | February 24, 2009 at 07:28 PM
I know I'm way late, but I did once work out what kills you if your star goes supernova (type II anyway) while doing my dissertation on supernova physics. It's the neutrinos, as previously mentioned. They emerge from the center of the star several hours before there is any other outward sign of a problem. Any planet in a roughly 1 AU orbit will absorb enough energy from the neutrino blast to melt the entire planet into liquid rock, and this will happen pretty much instantly, everywhere. Needless to say, when the light hits, the planet absorbs photons much more efficiently, and the whole thing turns to vapor, but everyone is very very very dead long before then.
Posted by: astrophysicsgeek | March 06, 2009 at 06:22 PM
I really liked your story.
I know it's not a really insightful comment, or some philosophical masterpiece of reasoning, but it was fun and interesting.
Posted by: Haakon | March 15, 2009 at 09:43 AM
This is, perhaps, obsolete by now.
That said, there seems to be a serious reasoning problem in assuming that this is a permanent solution. A species capable of progressing from Galileo to FTL travel in, what, thirty years seems like it would, given another few centuries (if not much, much less), easily be able to track down the remainder of both alien civilizations via some alternate route.
Consequently it seems like a massive sacrifice to delay the inevitable, or, at least, a sacrifice with highly uncertain probability of preventing that which it seeks to prevent. Not to mention the aliens would be rather unlikely to give humans any say in what happened after this interaction. The point about them being harder to fool is also probably true.
Perhaps I fail to understand the science, though I doubt that's the issue. Perhaps flawed reasoning was intended by the author? I have to admit I was rather in agreement with the Administrator.
Posted by: Psychohistorian | April 04, 2009 at 07:20 PM