(Part 8 of 8 in "Three Worlds Collide")
Fire came to Huygens.
The star erupted.
Stranded ships, filled with children doomed by a last second's delay, still milled around the former Earth transit point. Too many doomed ships, far too many doomed ships. They should have left a minute early, just to be sure; but the temptation to load in that one last child must have been irresistible. To do the warm and fuzzy thing just this one time, instead of being cold and calculating. You couldn't blame them, could you...?
Yes, actually, you could.
The Lady Sensory switched off the display. It was too painful.
On the Huygens market, the price of a certain contract spiked to 100%. They were all rich in completely worthless assets for the next nine minutes, until the supernova blast front arrived.
"So," the Lord Pilot finally said. "What kind of asset retains its value in a market with nine minutes to live?"
"Booze for immediate delivery," the Master of Fandom said promptly. "That's what you call a -"
"Liquidity preference," the others chorused.
The Master laughed. "All right, that was too obvious. Well... chocolate, sex -"
"Not necessarily," said the Lord Pilot. "If you can use up the whole supply of chocolate at once, does demand outstrip supply? Same with sex - the value could actually drop if everyone's suddenly willing. Not to mention: Nine minutes?"
"All right then, expert oral sex from experienced providers. And hard drugs with dangerous side effects; the demand would rise hugely relative to supply -"
"This is inane," the Ship's Engineer commented.
The Master of Fandom shrugged. "What do you say in the unrecorded last minutes of your life that is not inane?"
"It doesn't matter," said the Lady Sensory. Her face was strangely tranquil. "Nothing that we do now matters. We won't have to live with the consequences. No one will. All this time will be obliterated when the blast front hits. The role I've always played, the picture that I have of me... it doesn't matter. There's... a peace... in not having to be Dalia Ancromein any more."
The others looked at her. Talk about killing the mood.
"Well," the Master of Fandom said, "since you raise the subject, I suppose it would be peaceful if not for the screaming terror."
"You don't have to feel the screaming terror," the Lady Sensory said. "That's just a picture you have in your head of how it should be. The role of someone facing imminent death. But I don't have to play any more roles. I don't have to feel screaming terror. I don't have to frantically pack in a few last moments of fun. There are no more obligations."
"Ah," the Master of Fandom said, "so I guess this is when we find out who we really are." He paused for a moment, then shrugged. "I don't seem to be anyone in particular. Oh well."
The Lady Sensory stood up, and walked across the room to where the Lord Pilot stood looking at the viewscreen.
"My Lord Pilot," the Lady Sensory said.
"Yes?" the Lord Pilot said. His face was expectant.
The Lady Sensory smiled. It was bizarre, but not frightening. "Do you know, my Lord Pilot, that I had often thought how wonderful it would be to kick you very hard in the testicles?"
"Um," the Lord Pilot said. His arms and legs suddenly tensed, preparing to block.
"But now that I could do it," the Lady Sensory said, "I find that I don't really want to. It seems... that I'm not as awful a person as I thought." She gave a brief sigh. "I wish that I had realized it earlier."
The Lord Pilot's hand swiftly darted out and groped the Lady Sensory's breast. It was so unexpected that no one had time to react, least of all her. "Well, what do you know," the Pilot said, "I'm just as much of a pervert as I thought. My self-estimate was more accurate than yours, nyah nyah -"
The Lady Sensory kneed him in the groin, hard enough to drop him moaning to the floor, but not hard enough to require medical attention.
"Okay," the Master of Fandom said, "can we please not go down this road? I'd like to die with at least some dignity."
There was a long, awkward silence, broken only by a quiet "Ow ow ow ow..."
"Would you like to hear something amusing?" asked the Kiritsugu, who had once been a Confessor.
"If you're going to ask that question," said the Master of Fandom, "when the answer is obviously yes, thus wasting a few more seconds -"
"Back in the ancient days that none of you can imagine, when I was seventeen years old - which was underage even then - I stalked an underage girl through the streets, slashed her with a knife until she couldn't stand up, and then had sex with her before she died. It was probably even worse than you're imagining. And deep down, in my very core, I enjoyed every minute."
Silence.
"I don't think of it often, mind you. It's been a long time, and I've taken a lot of intelligence-enhancing drugs since then. But still - I was just thinking that maybe what I'm doing now finally makes up for that."
"Um," said the Ship's Engineer. "What we just did, in fact, was kill fifteen billion people."
"Yes," said the Kiritsugu, "that's the amusing part."
Silence.
"It seems to me," mused the Master of Fandom, "that I should feel a lot worse about that than I actually do."
"We're in shock," the Lady Sensory observed distantly. "It'll hit us in about half an hour, I expect."
"I think it's starting to hit me," the Ship's Engineer said. His face was twisted. "I - I was so worried I wouldn't be able to destroy my home planet, that I didn't get around to feeling unhappy about succeeding until now. It... hurts."
"I'm mostly just numb," the Lord Pilot said from the floor. "Well, except down there, unfortunately." He slowly sat up, wincing. "But there was this absolute unalterable thing inside me, screaming so loud that it overrode everything. I never knew there was a place like that within me. There wasn't room for anything else until humanity was safe. And now my brain is worn out. So I'm just numb."
"Once upon a time," said the Kiritsugu, "there were people who dropped a U-235 fission bomb, on a place called Hiroshima. They killed perhaps seventy thousand people, and ended a war. And if the good and decent officer who pressed that button had needed to walk up to a man, a woman, a child, and slit their throats one at a time, he would have broken long before he killed seventy thousand people."
Someone made a choking noise, as if trying to cough out something that had suddenly lodged deep in their throat.
"But pressing a button is different," the Kiritsugu said. "You don't see the results, then. Stabbing someone with a knife has an impact on you. The first time, anyway. Shooting someone with a gun is easier. Being a few meters further away makes a surprising difference. Only needing to pull a trigger changes it a lot. As for pressing a button on a spaceship - that's the easiest of all. Then the part about 'sixteen billion' just gets flushed away. And more importantly - you think it was the right thing to do. The noble, the moral, the honorable thing to do. For the safety of your tribe. You're proud of it -"
"Are you saying," the Lord Pilot said, "that it was not the right thing to do?"
"No," the Kiritsugu said. "I'm saying that, right or wrong, the belief is all it takes."
"I see," said the Master of Fandom. "So you can kill billions of people without feeling much, so long as you do it by pressing a button, and you're sure it's the right thing to do. That's human nature." The Master of Fandom nodded. "What a valuable and important lesson. I shall remember it all the rest of my life."
"Why are you saying all these things?" the Lord Pilot asked the Kiritsugu.
The Kiritsugu shrugged. "When I have no reason left to do anything, I am someone who tells the truth."
"It's wrong," said the Ship's Engineer in a small, hoarse voice, "I know it's wrong, but - I keep wishing the supernova would hurry up and get here."
"There's no reason for you to hurt," said the Lady Sensory in a strange calm voice. "Just ask the Kiritsugu to stun you. You'll never wake up."
"...no."
"Why not?" asked the Lady Sensory, in a tone of purely abstract curiosity.
The Ship's Engineer clenched his hands into fists. "Because if hurting is that much of a crime, then the Superhappies are right." He looked at the Lady Sensory. "You're wrong, my lady. These moments are as real as every other moment of our lives. The supernova can't make them not exist." His voice lowered. "That's what my cortex says. My diencephalon wishes we'd been closer to the sun."
"It could be worse," observed the Lord Pilot. "You could not hurt."
"For myself," the Kiritsugu said quietly, "I had already visualized and accepted this, and then it was just a question of watching it play out." He sighed. "The most dangerous truth a Confessor knows is that the rules of society are just consensual hallucinations. Choosing to wake up from the dream means choosing to end your life. I knew that when I stunned Akon, even apart from the supernova."
"Okay, look," said the Master of Fandom, "call me a gloomy moomy, but does anyone have something uplifting to say?"
The Lord Pilot jerked a thumb at the expanding supernova blast front, a hundred seconds away. "What, about that?"
"Yeah," the Master of Fandom said. "I'd like to end my life on an up note."
"We saved the human species," offered the Lord Pilot. "Man, that's the sort of thing you could just repeat to yourself over and over and over again -"
"Besides that."
"Besides WHAT?"
The Master managed to hold a straight face for a few seconds, and then had to laugh.
"You know," the Kiritsugu said, "I don't think there's anyone in modern-day humanity, who would regard my past self as anything but a poor, abused victim. I'm pretty sure my mother drank during pregnancy, which, back then, would give your child something called Fetal Alcohol Syndrome. I was poor, uneducated, and in an environment so entrepreneurially hostile you can't even imagine it -"
"This is not sounding uplifting," the Master said.
"But somehow," the Kiritsugu said, "all those wonderful excuses - I could never quite believe in them myself, afterward. Maybe because I'd also thought of some of the same excuses before. It's the part about not doing anything that got to me. Others fought the war to save the world, far over my head. Lightning flickering in the clouds high above me, while I hid in the basement and suffered out the storm. And by the time I was rescued and healed and educated, in any shape to help others - the battle was essentially over. Knowing that I'd been a victim for someone else to save, one more point in someone else's high score - that just stuck in my craw, all those years..."
"...anyway," the Kiritsugu said, and there was a small, slight smile on that ancient face, "I feel better now."
"So does that mean," asked the Master, "that now your life is finally complete, and you can die without any regrets?"
The Kiritsugu looked startled for a moment. Then he threw back his head and laughed. True, pure, honest laughter. The others began to laugh as well, and their shared hilarity echoed across the room, as the supernova blast front approached at almost exactly the speed of light.
Finally the Kiritsugu stopped laughing, and said:
"Don't be ridicu-"
By the way, how big is a supernova? Does it blast Earth-sized planets to dust in an instant?
Posted by: CannibalSmith | February 06, 2009 at 07:10 AM
SH are too advanced to be tricked by humans this way. Several hundred years' difference wouldn't allow the underdog to "win".
Posted by: Thomas | February 06, 2009 at 07:16 AM
I think the dramatic impact would be stronger without the "Fin".
Posted by: Kaj Sotala | February 06, 2009 at 07:16 AM
Actually, I think the neutrinos might be enough to kill you. Not sure about this, but Wikipedia alleged that in a supernova, the outer parts of the star are blown away by the neutrinos because that's what gets there first. I don't quite understand how this can be, but even leaving that aside...
I would presume that if the initial light-blast didn't actually eat all the way through the planet, then the ashes following behind it only slightly slower, once they got in front of the night side, would be emitting light at intensity sufficient to vaporize the night side too. So, yeah, everyone dies pretty damn fast, I think. Don't know if the planet's core stays intact for another minute or whatever.
Posted by: Eliezer Yudkowsky | February 06, 2009 at 07:19 AM
Nice Dark Knight reference there, I wonder if the confessor ever ran around in clown makeup?
Posted by: Patrick | February 06, 2009 at 07:21 AM
Now it's finished, any chance of getting it into EPUB or PDF format?
Posted by: Andrew Ducker | February 06, 2009 at 07:32 AM
Still puzzled by the 'player of games' ship name reference earlier in the story... I keep thinking, surely Excession is a closer match?
Posted by: Anonymous Coward | February 06, 2009 at 07:52 AM
A type 2 supernova emits most of its energy in the form of neutrinos; these interact with the extremely dense inner layers that didn't quite manage to accrete onto the neutron star, depositing energy that creates a shockwave that blows off the rest of the material. I've seen it claimed that the neutrino flux would be lethal out to a few AU, though I suspect you wouldn't get the chance to actually die of radiation poisoning.
A planet the size and distance of Earth would intercept enough photons and plasma to exceed its gravitational binding energy, though I'm skeptical about whether it would actually vaporize; my guess, for what it's worth, is that most of the energy would be radiated away again. Wouldn't make any difference to anyone on the planet at the time, of course.
Well-chosen chapter title, and good wrapup!
Posted by: Russell Wallace | February 06, 2009 at 07:54 AM
It's somehow depressing that in this story, a former rapist dirtbag saves the world. Such a high score he gets in the end, perhaps making us currently-rather-lazy but not-worse-than-ordinary folks feel we're worse than some such dirtbags can end up being. (It's fair and truthful, though.)
I hope you others feel that the character was primarily a victim way back when, instead of a dirtbag.
Posted by: Aleksei Riikonen | February 06, 2009 at 08:04 AM
Speaking of Culture-style ship names (ref. Anonymous Coward above), this story btw inspires good new ones:
"Untranslatable 2"
"Big Angelic Power"
"We Wish To Subscribe To Your Newsletter"
"Big Fucking Edward"
Posted by: Aleksei Riikonen | February 06, 2009 at 08:08 AM
Of course not. The victim was the girl he murdered.
That's the point of the chapter title - he had something to atone for. It's what tvtropes.org calls a Heel Face Turn.
Posted by: Russell Wallace | February 06, 2009 at 08:16 AM
Anyone want to try defining Untranslatable 2?
Posted by: CannibalSmith | February 06, 2009 at 08:21 AM
the first installments were pure genius. Then it got kinda lame. the kiritsugu's words about button pushing et al. have been common knowledge for decades now, and the characters on the ship are surprised??? Come on. i thought you'd think of something better!?
Posted by: spuckblase | February 06, 2009 at 08:48 AM
Aleksei, I don't know what you think about the current existential risks situation, but that situation changed me in the direction of your comment. I used to think that to have a good impact on the world, you had to be an intrinsically good person. I used to think that the day to day manner in which I treated the people around me, the details of my motives and self-knowledge, etc. just naturally served as an indicator for the positive impact I did or didn't have on global goodness.
(It was a dumb thing to think, maintained by an elaborate network of rationalizations that I thought of as virtuous, much the way many people think of their political "beliefs"/clothes as virtuous. My beliefs were also maintained by not bothering to take an actually careful look either at global catastrophic risks or even at the details of e.g. global poverty. But my impression is that it's fairly common to just suppose that our intuitive moral self-evaluations (or others' evaluations of how good of people we are) map tolerably well onto actual good consequences.)
Anyhow: now, it looks to me as though most of those "good people", living intrinsically worthwhile lives, aren't contributing squat to global goodness compared to what they could contribute if they spent even a small fraction of their time/money on a serious attempt to shut up and multiply. The network of moral intuitions I grew up in is... not exactly worthless; it *does* help with intrinsically worthwhile lives, and, more to the point, with the details of how to actually build the kinds of reasonable human relationships that you need for parts of the "shut up and multiply"-motivated efforts to work... but, for most people, it's basically not very connected to how much good they do or don't do in the world. If you like, this is good news: for a ridiculously small sum of effort (e.g., a $500 donation to SIAI; the earning power of seven ten-thousandths of your life if you earn the US minimum wage), you can do more expected-good than perhaps 99.9% of Earth's population. (You may be able to do still more expected-good by taking that time and thinking carefully about what most impacts global goodness and whether anyone's doing it.)
Posted by: Anna Salamon | February 06, 2009 at 08:54 AM
Andrew, check back at Three Worlds Collide for PDF version, hope it works for you.
AC, I can't stand Banks's Excession.
Aleksei, you were an SIAI donor - and early in SIAI's history, too. If SIAI succeeds, you will have no right to complain relative to almost anyone else on the planet. If SIAI fails, at least you tried.
Posted by: Eliezer Yudkowsky | February 06, 2009 at 08:57 AM
Untranslatable 2 is the thought sharing sex.
Untranslatable 1 is confusion or distress.
Untranslatable 3 is intelligence enhancing drugs.
Untranslatable 4 is forced happy via untranslatable 2, possibly happy drugs refined from the chemical process of it.
Untranslatable 5 is wisdom inherited from gene-thoughts.
Posted by: spriteless | February 06, 2009 at 09:05 AM
Eliezer, I've indeed been a hard-working Good Guy at earlier times in my life (though probably most of my effort was on relatively useless rather ordinary do-gooder projects), but from this it doesn't follow that my current self would be a Good Guy.
Currently I'm happily (yes, *happily*) wasting a huge chunk of my time away on useless fun stuff, while I *easily* could be more productive. It's not that I would be resting after a burn-out either, I've just become more selfish, and *don't* feel bad about it, except perhaps very rarely and briefly in a mild manner. I like myself and my life a lot, even though I don't currently classify myself as a Good Guy. I won't even feel particularly bad if we run into an existential risk, I think.
(Though being a Good Guy is fun also, and I might make more of a move in that direction again soon -- or not.)
Posted by: Aleksei Riikonen | February 06, 2009 at 09:27 AM
What I actually got from this story is that we shouldn't be selfish as a species. If the good of all species requires that we sacrifice our core humanity, then we should become non-human and be superhappy about it.
Posted by: Vizikahn | February 06, 2009 at 09:41 AM
"though probably most of my effort was on relatively useless rather ordinary do-gooder projects"
Aleksei, "ordinary do-gooder projects" are relatively useless. That is, they are multiple orders of magnitude less efficient at global expected-goodness production than well thought out efforts to reduce existential risks. If you somehow ignore existential risks, "ordinary do-gooder projects" are even orders of magnitude less efficient than the better charities working on current human welfare, as analyzed by Givewell or by the Copenhagen Consensus.
Enjoy your life, and don't feel guilty if you don't feel guilty; but if you do want to increase the odds that anything of value survives in this corner of the universe, don't focus on managing to give up more of your current pleasure. Focus on how efficiently you use whatever time/money/influence you *are* putting toward global goodness. Someone who spends seven ten-thousandths of their time earning money to donate to SIAI does ridiculously more good than someone who spends 90% of their time being a Good Person at an ordinary charity. (They have more time left over to enjoy themselves, too.)
Posted by: Anna Salamon | February 06, 2009 at 09:58 AM
1) Who the hell is Master of Fandom? A guy who maintains the climate control system, or the crew's pet Gundam nerd?
2) Do you really think the aliens' deal is so horrifying? Or are you just overdramatizing?
Posted by: Tiiba | February 06, 2009 at 09:58 AM
Never mind, I'm an idiot. I somehow read "relatively useless rather than ordinary", even though Aleksei wrote "relatively useless rather ordinary".
Posted by: Anna Salamon | February 06, 2009 at 09:59 AM
Anna, but good that you raised those very important points for the benefit of those readers to whom they might not be familiar :)
Posted by: Aleksei Riikonen | February 06, 2009 at 10:08 AM
They should have blown up the nova. Hopefully they could have found some way to warn the human race, but that isn't too important given the way Alderson lines run. They not only would have saved humanity with minimal cost of life, they would have saved Babyeater lives too. Sacrificing human lives for Babyeater babies' lives is not good.
Posted by: billswift | February 06, 2009 at 10:18 AM
What concept in the Babyeaters' language is the humans' "good" translated to? We have been given a concrete concept for their terminal value, but what is theirs for ours, if any?
Posted by: Kevin Reid | February 06, 2009 at 10:33 AM
Untranslatable 2 is the thought sharing sex.
Sprite, you are, by definition, wrong.
Posted by: Ben Jones | February 06, 2009 at 10:43 AM
If you put your ear up to the monitor you can almost hear the wooshing sound from the hundreds of people unsubscribing to this blog...
Posted by: Anonymous | February 06, 2009 at 10:49 AM
There is, of course, a rather large random/unknown component in the amortized present value of the amount of good any action of mine is going to do. Maybe my little contributions to Python and other open-source projects will be of some fractional help one day to somebody writing some really important programs -- though more likely they won't. Maybe my buying and bringing food to hungry children will enhance the diet, and thus facilitate the brain development, of somebody who one day will do something really important -- though more likely it won't. Landsburg in http://www.slate.com/id/2034/ argued for assessing the expected value of one's charitable giving conditional on each feasible charitable action, then focusing all available resources on that one action -- no matter how uncertain the assessment. However, this optimizes only for the peak of the a posteriori distribution, ignores the big issue of radical (Knightian) uncertainty, etc, etc -- so I don't really buy it (though pondering and debating these issues HAS led me to focus my charitable activities more, as have other lines of reasoning).
Posted by: Alex Martelli | February 06, 2009 at 11:01 AM
Anonymous: The blog is shutting down anyway, or at least receding to a diminished state. The threat of death holds no power over a suicidal man...
Posted by: Nominull | February 06, 2009 at 11:08 AM
Anonymous, that sound you hear is probably people rushing to subscribe. http://www.rifters.com/crawl/?p=266 - note the comments.
Posted by: Thom Blake | February 06, 2009 at 11:18 AM
Untranslatable 2: The frothy mixture of lube and fecal matter that is sometimes the byproduct of anal sex.
Posted by: Tiiba | February 06, 2009 at 12:30 PM
Good story, and a nice illustration of many of the points you’ve previously made about cross-species morality. I do find it a bit disturbing that so many people think the SH offer doesn’t sound so bad – not sure if that’s a weakness in the story, commenter contrarianism, or a measure of just how diverse human psychology already is.
The human society looks like a patched-up version of Star Trek’s bland liberal utopianism, which I realize is probably for the convenience of the story. But it’s worth pointing out that any real society with personal freedom and even primitive biotech is going to see an explosion of experimentation with both physical and mental modifications – enforcing a single collective decision about what to do with this technology would require a massive police state or universal mind control. Give the furries, vampire-lovers and other assorted xenophiles a few generations to chase their dreams, and you’re going to start seeing groups with distinctly non-human psychology. So even if we never meet “real” aliens, it’s quite likely that we’ll have to deal with equally strange human-descended races at some point.
I’ll also note that, as is usually the case with groups that ‘give up war’, the human response is crippled by their lack of anything resembling military preparedness. A rational but non-pacifist society would be a lot less naïve in their initial approach, and a lot more prepared for unpleasant outcomes - at a minimum they’d use courier boats to keep the exploration vessel in contact with higher command, which would let them start precautionary evacuations a lot sooner and lose far fewer people. But the tech in the story massively favors the defense, to the point that a defender who is already prepared to fracture his starline network if attacked is almost impossible to conquer (you’d need to advance faster than the defender can send warnings of your attack while maintaining perfect control over every system you’ve captured). So an armed society would have a good chance of being able to cut itself off from even massively superior aliens, while pacifists are vulnerable to surprise attacks from even fairly inferior ones.
Posted by: Billy Brown | February 06, 2009 at 12:32 PM
Tiiba: Somewhere between a Gundam nerd and a literature professor, I expect. Since the main real differences between the two in our current world are 1) lit profs get more cultural respect 2) people actually enjoy Gundam, the combination makes a fair amount of sense.
Posted by: a soulless automaton | February 06, 2009 at 12:50 PM
It's somehow depressing that in this story, a former rapist dirtbag saves the world.
Why is that depressing?
And if the good and decent officer who pressed that button had needed to walk up to a man, a woman, a child, and slit their throats one at a time, he would have broken long before he killed seventy thousand people.
I have my doubts about that. If he could do it seven times, he could do it seventy thousand times. Since when was it harder for a killer to kill again?
Posted by: ad | February 06, 2009 at 01:02 PM
So I guess Lord Administrator Akon remains anesthetized until the sun roasts him to death? I can't decide if that's tragic or merciful, that he never found out how the story ended.
Posted by: Nominull | February 06, 2009 at 01:52 PM
Nominull, neither Akon, the Lord Programmer, nor the Xenopsychologist appears in this section.
Billy Brown:
WHY HAVEN'T I READ THIS STORY?
Posted by: Eliezer Yudkowsky | February 06, 2009 at 02:27 PM
But the tech in the story massively favors the defense, to the point that a defender who is already prepared to fracture his starline network if attacked is almost impossible to conquer (you’d need to advance faster than the defender can send warnings of your attack while maintaining perfect control over every system you’ve captured). So an armed society would have a good chance of being able to cut itself off from even massively superior aliens, while pacifists are vulnerable to surprise attacks from even fairly inferior ones.
I agree, and that's why in my ending humans conquer the Babyeaters only after we develop a defense against the supernova weapon. The fact that the humans can see the defensive potential of this weapon, but the Babyeaters and the Superhappies can't, is a big flaw in the story. The humans sacrificed billions in order to allow the Superhappies to conquer the Babyeaters, but that makes sense only if the Babyeaters can't figure out the same defense that the humans used. Why not?
Also, the Superhappies' approach to negotiation made no game theoretic sense. What they did was: offer a deal to the other side; if they don't accept, impose the deal on them anyway by force; if they do accept, trust that they will carry out the deal without trying to cheat. Given these incentives, why would anyone facing a Superhappy in negotiation not accept and then cheat? I don't see any plausible way in which this morality/negotiation strategy could have become a common one in Superhappy society.
Lastly, I note that the Epilogue of the original ending could be named Atonement as well. After being modified by the Superhappies (like how the Confessor was "rescued"?), the humans would now be atoning for having forced their children to suffer pain. What does this symmetry tell us, if anything?
Posted by: Wei Dai | February 06, 2009 at 02:54 PM
Tiiba said:
Untranslatable 2: The frothy mixture of lube and fecal matter that is sometimes the byproduct of anal sex.
Then why didn't it just translate it as "santorum"?
Posted by: Random Passerby | February 06, 2009 at 03:51 PM
Then the part about 'fifteen billion' just gets flushed away. And more importantly - you think it was the right thing to do. The noble, the moral, the honorable thing to do.
Like eating babies, then.
Aleksei: I hope you others feel that the character was primarily a victim way back when, instead of a dirtbag.
He was who he was. Labelling him "victim" or "dirtbag" or whatever says nothing about what he was, but a lot about the person doing the labelling.
Russell: Of course not. The victim was the girl he murdered.
If one person is a victim, it doesn't follow that another person was not.
Posted by: Cabalamat | February 06, 2009 at 05:37 PM
Super! I read only as the installments came (even though I desperately wanted to download the pdf) so I could think about it longer.
Wouldn't it be fun to get 3(+) groups together, have each create value systems and traditions for itself subject to certain universals, and stage three-way negotiations of this sort? Planning, participating, and evaluating afterward could be fascinating.
Posted by: Nikhil | February 06, 2009 at 06:22 PM
Nikhil: Good idea. I've been also thinking about the best way to utilise computer technology for arranging that sort of role-playing game.
Posted by: Nickolai Leschov | February 06, 2009 at 06:48 PM
I enjoyed reading this story, but I would like to point out what I see as a grim possible future for humanity even after shutting down the Huygens starline. As I understand it, the Superhappies have a very accelerated reproduction rate, among other things, which in certain circumstances could be as fast as a 20-hour doubling time, ship, crew and all. It's hard to pinpoint the doubling time for solar-system/starline colonization, though it is likely related to the reproductive doubling time; but the conservative estimate that the Superhappies have colonized/explored at least 8 systems in the 20 years (as the stars count time) they have been in space would give them about a 6-year doubling time.
There are about 400 billion stars in the galaxy. While that may sound like a lot, it is a mere 39 doubling cycles to full colonization, and an additional 39 (78 total) to full colonization of the next 400 billion galaxies. That gives a range of somewhere between 780 hours (at the 20-hour doubling speed), about 32 and a half days, and a more respectable yet still short 234 years (at the conservative 6-year doubling estimate) until the whole galaxy has been explored by the Superhappies, and only double that time for an area much larger than the observable universe.
It is safe to say that this is a strong upper bound on the amount of time it would take the Superhappies to rediscover humanity, and that time decreases significantly with anything that would raise the chance of discovery above pure random chance, such as a better understanding of starline topography, better inter-starline scanning, and additional novas that humanity chooses to investigate.
So I think the sad fact of the matter is that this victory buys only a little time; and considering the advancements in immortality, I think the vast majority of humans would be around to see the Superhappies return to bestow their gift upon them.
Posted by: Nicholas | February 06, 2009 at 09:19 PM
Please excuse the spacing on my previous post; never quite sure how line breaks end up in various comment systems.
Posted by: Nicholas | February 06, 2009 at 09:24 PM
Nicholas, suppose Eliezer's fictional universe contains a total of 2^(10^20) star systems, and each starline connects two randomly selected star systems. With a 20 hour doubling speed, the Superhappies, starting with one ship, can explore 2^(t*365*24/20) random star systems after t years. Let's say the humans are expanding at the same pace. How long will it take, before humans and Superhappies will meet again?
According to the birthday paradox, they will likely meet after each having explored about sqrt(2^(10^20)) = 2^(5*10^19) star systems, which will take 5*10^19/(365*24/20) or approximately 10^17 years to accomplish. That should be enough time to get over our attachment to "bodily pain, embarrassment, and romantic troubles", I imagine.
Posted by: Wei Dai | February 06, 2009 at 09:55 PM
Wei Dai:
Given these incentives, why would anyone facing a Superhappy in negotiation not accept and then cheat? I don't see any plausible way in which this morality/negotiation strategy could have become a common one in Superhappy society.
Perhaps they evolved to honestly accept, and then find overwhelming reasons to cheat later on, and this is their substitute for deception.
Maybe they also evolved to give not-quite acceptable options, which the other party will accept, and then cheat on to some degree.
So the true ending might be a typical Superhappy negotiation, if their real solution were something we would represent as: "We absolutely have to deal with the Babyeaters. You child-abusers are awful, but we'll let you exist unmodified as long as you never, ever sextalk us again."
And because they evolved expecting a certain amount of cheating, this consciously manifested itself as the unacceptable offer they made.
Posted by: James Andrix | February 07, 2009 at 01:31 AM
"So does that mean," asked the Master, "that now your life is finally complete, and you can die without any regrets?"
Well. That is indeed ridiculous. It fails (I think) to realise the Lady Sensory's lesson that she no longer needs to be the person she always thought she needed to be. I am moved to quote the Quaker Isaac Penington: "The end of words is to bring men to the knowledge of things beyond what words can utter". The Master of Fandom's words are meaningless - an ideal of what people imagine they _should_ be, rather than how things actually are.
Posted by: Abigail | February 07, 2009 at 07:29 AM
Clearly, Eliezer should seriously consider devoting himself more to writing fiction. But it is not clear to me how this helps us overcome biases any more than any fictional moral dilemma. Since people are inconsistent but reluctant to admit that fact, their moral beliefs can be influenced by which moral dilemmas they consider in what order, especially when written by a good writer. I expect Eliezer chose his dilemmas in order to move readers toward his preferred moral beliefs, but why should I expect those are better moral beliefs than those of all the other authors of fictional moral dilemmas? If I'm going to read a literature that might influence my moral beliefs, I'd rather read professional philosophers and other academics making more explicit arguments. In general, I better trust explicit academic argument over implicit fictional "argument."
Posted by: Robin Hanson | February 07, 2009 at 11:03 AM
Robin, fiction also has the benefit of being more directly accessible; i.e., people who would not or could not read explicit academic argument can read a short story that grapples with moral issues and get a better sense of the conflict than they would otherwise. Even with the extremely self-selected audience of this blog, compare the comments the story got vs. many other posts.
And while of course the story was influenced by Eliezer's beliefs, the amount of arguing about the endings suggests that it was not so cut and dry as simply "moving readers toward his beliefs".
Posted by: a soulless automaton | February 07, 2009 at 11:49 AM
I thought the story worked very well as a parable and broke down as it expanded into more conventional fiction. But I found the scenario a very vivid way to imagine some of EY's issues. A gift for metaphor is the surest sign of intelligence I know. Based on this story and the starblinker, I'd agree his best chance of hitting the bestseller list, as he plans, may be on the fiction side.
As for academic work, I'd love to hear about the last piece RH read that made him question his views or change his mind about something. The links here all seem to reinforce.
Posted by: Maglick | February 07, 2009 at 12:05 PM
Robin, that's one reason I first wrote up my abstract views on the proposition of eliminating pain, and then I put up a fictional story that addressed the same issue.
But what first got me thinking along the same lines was watching a certain movie - all the Far arguments I'd read up until that point hadn't moved me; but watching it playing out in an imaginary Near situation altered my thinking. Pretty sure it's got something to do with there being fewer degrees of emotional freedom in concrete Near thinking versus abstract propositional Far thinking.
I think that so long as I lay my cards plainly on the table outside the story, writing the story should fall more along the lines of public service to understanding, and less along the lines of sneaky covert hidden arguments. Do I need to remark on how desperately important it is, with important ideas, to have some versions that are as accessible as possible? The difficulty is to do this without simply flushing away the real idea and substituting one that's easier to explain. But I do think - I do hope - that I am generally pretty damned careful on that score.
Maglick, last piece RH wrote that changed my mind (as in causing me to alter my beliefs on questions I had previously considered) was the "Near vs. Far" one.
Posted by: Eliezer Yudkowsky | February 07, 2009 at 12:21 PM
AC, I can't stand Banks's Excession.
Interesting, and I must admit I am surprised.
Regardless of personal preferences though... it seems the closest match for the topic at hand. But hey, it's your story...
"Excession; something excessive. Excessively aggressive, excessively powerful, excessively expansionist; whatever. Such things turned up or were created now and again. Encountering an example was one of the risks you ran when you went a-wandering..."
Posted by: Anonymous Coward | February 07, 2009 at 12:45 PM