(Part 5 of 8 in "Three Worlds Collide")
Akon strode into the main Conference Room; and though he walked like a physically exhausted man, at least his face was determined. Behind him, the shadowy Confessor followed.
The Command Conference looked up at him, and exchanged glances.
"You look better," the Ship's Master of Fandom ventured.
Akon put a hand on the back of his seat, and paused. Someone was absent. "The Ship's Engineer?"
The Lord Programmer frowned. "He said he had an experiment to run, my lord. He refused to clarify further, but I suppose it must have something to do with the Babyeaters' data -"
"You're joking," Akon said. "Our Ship's Engineer is off Nobel-hunting? Now? With the fate of the human species at stake?"
The Lord Programmer shrugged. "He seemed to think it was important, my lord."
Akon sighed. He pulled his chair back and half-slid, half-fell into it. "I don't suppose that the ship's markets have settled down?"
The Lord Pilot grinned sardonically. "Read for yourself."
Akon twitched, calling up a screen. "Ah, I see. The ship's Interpreter of the Market's Will reports, and I quote, 'Every single one of the underlying assets in my market is going up and down like a fucking yo-yo while the ship's hedgers try to adjust to a Black Swan that's going to wipe out ninety-eight percent of their planetside risk capital. Even the spot prices on this ship are going crazy; either we've got bubble traders coming out of the woodwork, or someone seriously believes that sex is overvalued relative to orange juice. One derivatives trader says she's working on a contract that will have a clearly defined value in the event that aliens wipe out the entire human species, but she says it's going to take a few hours and I say she's on crack. Indeed I believe an actual majority of the people still trying to trade in this environment are higher than the heliopause. Bid-ask spreads are so wide you could kick a fucking football stadium through them, nothing is clearing, and I have unisolated conditional dependencies coming out of my ass. I have no fucking clue what the market believes. Someone get me a drink.' Unquote." Akon looked at the Master of Fandom. "Any suggestions get reddited up from the rest of the crew?"
The Master cleared his throat. "My lord, we took the liberty of filtering out everything that was physically impossible, based on pure wishful thinking, or displayed a clear misunderstanding of naturalistic metaethics. I can show you the raw list, if you'd like."
"And what's left?" Akon said. "Oh, never mind, I get it."
"Well, not quite," said the Master. "To summarize the best ideas -" He gestured a small holo into existence.
Ask the Superhappies if their biotechnology is capable of in vivo cognitive alterations of Babyeater children to ensure that they don't grow up wanting to eat their own children. Sterilize the current adults. If Babyeater adults cannot be sterilized and will not surrender, imprison them. If that's too expensive, kill most of them, but leave enough in prison to preserve their culture for the children. Offer the Superhappies an alliance to invade the Babyeaters, in which we provide the capital and labor and they provide the technology.
"Not too bad," Akon said. His voice grew somewhat dry. "But it doesn't seem to address the question of what the Superhappies are supposed to do with us. The analogous treatment -"
"Yes, my lord," the Master said. "That was extensively pointed out in the comments, my lord. And the other problem is that the Superhappies don't really need our labor or our capital." The Master looked in the direction of the Lord Programmer, the Xenopsychologist, and the Lady Sensory.
The Lord Programmer said, "My lord, I believe the Superhappies think much faster than we do. If their cognitive systems are really based on something more like DNA than like neurons, that shouldn't be surprising. In fact, it's surprising that the speedup is as little as -" The Lord Programmer stopped, and swallowed. "My lord. The Superhappies responded to most of our transmissions extremely quickly. There was, however, a finite delay. And that delay was roughly proportional to the length of the response, plus an additive constant. Going by the proportion, my lord, I believe they think between fifteen and thirty times as fast as we do, to the extent such a comparison can be made. If I try to use Moore's Law type reasoning on some of the observable technological parameters in their ship - Alderson flux, power density, that sort of thing - then I get a reasonably convergent estimate that the aliens are two hundred years ahead of us in human-equivalent subjective time. Which means it would be twelve hundred equivalent years since their Scientific Revolution."
"If," the Xenopsychologist said, "their history went as slowly as ours. It probably didn't." The Xenopsychologist took a breath. "My lord, my suspicion is that the aliens are literally able to run their entire ship using only three kiritsugu as sole crew. My lord, this may represent, not only the superior programming ability that translated their communications to us, but also the highly probable case that Superhappies can trade knowledge and skills among themselves by having sex. Every individual of their species might contain the memory of their Einsteins and Newtons and a thousand other areas of expertise, no more conserved than DNA is conserved among humans. My lord, I suspect their version of Galileo was something like thirty objective years ago, as the stars count time, and that they've been in space for maybe twenty years."
The Lady Sensory said, "Their ship has a plane of symmetry, and it's been getting wider on the axis through that plane, as it sucks up nova dust and energy. It's growing on a smooth exponential at 2% per hour, which means it can split every thirty-five hours in this environment."
"I have no idea," the Xenopsychologist said, "how fast the Superhappies can reproduce themselves - how many children they have per generation, or how fast their children sexually mature. But all things considered, I don't think we can count on their kids taking twenty years to get through high school."
There was silence.
When Akon could speak again, he said, "Are you all quite finished?"
"If they let us live," the Lord Programmer said, "and if we can work out a trade agreement with them under Ricardo's Law of Comparative Advantage, interest rates will -"
"Interest rates can fall into an open sewer and die. Any further transmissions from the Superhappy ship?"
The Lady Sensory shook her head.
"All right," Akon said. "Open a transmission channel to them."
There was a stir around the table. "My lord -" said the Master of Fandom. "My lord, what are you going to say?"
Akon smiled wearily. "I'm going to ask them if they have any options to offer us."
The Lady Sensory looked at the Ship's Confessor. The hood silently nodded: He's still sane.
The Lady Sensory swallowed, and opened a channel. On the holo there first appeared, as a screen:
The Lady 3rd Kiritsugu
temporary co-chair of the Gameplayer
Language Translator version 9
Cultural Translator version 16
The Lady 3rd in this translation was slightly less pale, and looked a bit more concerned and sympathetic. She took in Akon's appearance at a glance, and her eyes widened in alarm. "My lord, you're hurting!"
"Just tired, milady," Akon said. He cleared his throat. "Our ship's decision-making usually relies on markets and our markets are behaving erratically. I'm sorry to inflict that on you as shared pain, and I'll try to get this over with quickly. Anyway -"
Out of the corner of his eye, Akon saw the Ship's Engineer re-enter the room; the Engineer looked as if he had something to say, but froze when he saw the holo.
There was no time for that now.
"Anyway," Akon said, "we've worked out that the key decisions depend heavily on your level of technology. What do you think you can actually do with us or the Babyeaters?"
The Lady 3rd sighed. "I really should get your independent component before giving you ours - you should at least think of it first - but I suppose we're out of luck on that. How about if I just tell you what we're currently planning?"
Akon nodded. "That would be much appreciated, milady." Some of his muscles that had been tense started to relax. Cultural Translator version 16 was a lot easier on his brain. Distantly, he wondered if some transformed avatar of himself was making skillful love to the Lady 3rd -
"All right," the Lady 3rd said. "We consider that the obvious starting point upon which to build further negotiations, is to combine and compromise the utility functions of the three species until we mutually satisfice, providing compensation for all changes demanded. The Babyeaters must compromise their values to eat their children at a stage where they are not sentient - we might accomplish this most effectively by changing the lifecycle of the children themselves. We can even give the unsentient children an instinct to flee and scream, and generate simple spoken objections, but prevent their brain from developing self-awareness until after the hunt."
Akon straightened. That actually sounded - quite compassionate - sort of -
"Our own two species," the Lady 3rd said, "which desire this change of the Babyeaters, will compensate them by adopting Babyeater values, making our own civilization of greater utility in their sight: we will both change to spawn additional infants, and eat most of them at almost the last stage before they become sentient."
The Conference room was frozen. No one moved. Even their faces didn't change expression.
Akon's mind suddenly flashed back to those writhing, interpenetrating, visually painful blobs he had seen before.
A cultural translator could change the image, but not the reality.
"It is nonetheless probable," continued the Lady 3rd, "that the Babyeaters will not accept this change as it stands; it will be necessary to impose these changes by force. As for you, humankind, we hope you will be more reasonable. But both your species, and the Babyeaters, must relinquish bodily pain, embarrassment, and romantic troubles. In exchange, we will change our own values in the direction of yours. We are willing to change to desire pleasure obtained in more complex ways, so long as the total amount of our pleasure does not significantly decrease. We will learn to create art you find pleasing. We will acquire a sense of humor, though we will not lie. From the perspective of humankind and the Babyeaters, our civilization will obtain much utility in your sight, which it did not previously possess. This is the compensation we offer you. We furthermore request that you accept from us the gift of untranslatable 2, which we believe will enhance, on its own terms, the value that you name 'love'. This will also enable our kinds to have sex using mechanical aids, which we greatly desire. At the end of this procedure, all three species will satisfice each other's values and possess great common ground, upon which we may create a civilization together."
Akon slowly nodded. It was all quite unbelievably civilized. It might even be the categorically best general procedure when worlds collided.
The Lady 3rd brightened. "A nod - is that assent, humankind?"
"It's acknowledgment," Akon said. "We'll have to think about this."
"I understand," the Lady 3rd said. "Please think as swiftly as you can. Babyeater children are dying in horrible agony as you think."
"I understand," Akon said in return, and gestured to cut the transmission.
The holo blinked out.
There was a long, terrible silence.
"No."
The Lord Pilot said it. Cold, flat, absolute.
There was another silence.
"My lord," the Xenopsychologist said, very softly, as though afraid the messenger would be torn apart and dismembered, "I do not think they were offering us that option."
"Actually," Akon said, "the Superhappies offered us more than we were going to offer the Babyeaters. We weren't exactly thinking about how to compensate them." It was strange, Akon noticed, his voice was very calm, maybe even deadly calm. "The Superhappies really are a very fair-minded people. You get the impression they would have proposed exactly the same solution whether or not they happened to hold the upper hand. We might have just enforced our own will on the Babyeaters and told the Superhappies to take a hike. If we'd held the upper hand. But we don't. And that's that, I guess."
"No!" shouted the Lord Pilot. "That's not -"
Akon looked at him, still with that deadly calm.
The Lord Pilot was breathing deeply, not as if quieting himself, but as if preparing for battle on some ancient savanna plain that no longer existed. "They want to turn us into something inhuman. It - it cannot - we cannot - we must not allow -"
"Either give us a better option or shut up," the Lord Programmer said flatly. "The Superhappies are smarter than us, have a technological advantage, think faster, and probably reproduce faster. We have no hope of holding them off militarily. If our ships flee, the Superhappies will simply follow in faster ships. There's no way to shut a starline once opened, and no way to conceal the fact that it is open -"
"Um," the Ship's Engineer said.
Every eye turned to him.
"Um," the Ship's Engineer said. "My Lord Administrator, I must report to you in private."
The Ship's Confessor shook his head. "You could have handled that better, Engineer."
Akon nodded to himself. It was true. The Ship's Engineer had already betrayed the fact that a secret existed. Under the circumstances, easy to deduce that it had come from the Babyeater data. That was eighty percent of the secret right there. And if it was relevant to starline physics, that was half of the remainder.
"Engineer," Akon said, "since you have already revealed that a secret exists, I suggest you tell the full Command Conference. We need to stay in sync with each other. Two minds are not a committee. We'll worry later about keeping the secret classified."
The Ship's Engineer hesitated. "Um, my lord, I suggest that I report to you first, before you decide -"
"There's no time," Akon said. He pointed to where the holo had been.
"Yes," the Master of Fandom said, "we can always slit our own throats afterward, if the secret is that awful." The Master of Fandom gave a small laugh -
- then stopped, at the look on the Engineer's face.
"At your will, my lord," the Engineer said.
He drew a deep breath. "I asked the Lord Programmer to compare any identifiable equations and constants in the Babyeaters' scientific archive, to the analogous scientific data of humanity. Most of the identified analogues were equal, of course. In some places we have more precise values, as befits our, um, superior technological level. But one anomaly did turn up: the Babyeater figure for Alderson's Coupling Constant was ten orders of magnitude larger than our own."
The Lord Pilot whistled. "Stars above, how did they manage to make that mistake -"
Then the Lord Pilot stopped abruptly.
"Alderson's Coupling Constant," Akon echoed. "That's the... coupling between Alderson interactions and the..."
"Between Alderson interactions and the nuclear strong force," the Lord Pilot said. He was beginning to smile, rather grimly. "It was a free parameter in the standard model, and so had to be established experimentally. But because the interaction is so incredibly... weak... they had to build an enormous Alderson generator to find the value. The size of a very small moon, just to give us that one number. Definitely not something you could check at home. That's the story in the physics textbooks, my lords, my lady."
The Master of Fandom frowned. "You're saying... the physicists faked the result in order to... fund a huge project...?" He looked puzzled.
"No," the Lord Pilot said. "Not for the love of power. Engineer, the Babyeater value should be testable using our own ship's Alderson drive, if the coupling constant is that strong. This you have done?"
The Ship's Engineer nodded. "The Babyeater value is correct, my lord."
The Ship's Engineer was pale. The Lord Pilot was clenching his jaw into a sardonic grin.
"Please explain," Akon said. "Is the universe going to end in another billion years, or something? Because if so, the issue can wait -"
"My lord," the Ship's Confessor said, "suppose the laws of physics in our universe had been such that the ancient Greeks could invent the equivalent of nuclear weapons from materials just lying around. Imagine the laws of physics had permitted a way to destroy whole countries with no more difficulty than mixing gunpowder. History would have looked quite different, would it not?"
Akon nodded, puzzled. "Well, yes," Akon said. "It would have been shorter."
"Aren't we lucky that physics didn't happen to turn out that way, my lord? That in our own time, the laws of physics don't permit cheap, irresistible superweapons?"
Akon furrowed his brow -
"But my lord," said the Ship's Confessor, "do we really know what we think we know? What different evidence would we see, if things were otherwise? After all - if you happened to be a physicist, and you happened to notice an easy way to wreak enormous destruction using off-the-shelf hardware - would you run out and tell anyone?"
"No," Akon said. A sinking feeling was dawning in the pit of his stomach. "You would try to conceal the discovery, and create a cover story that discouraged anyone else from looking there."
The Lord Pilot emitted a bark that was half laughter, and half something much darker. "It was perfect. I'm a Lord Pilot and I never suspected until now."
"So?" Akon said. "What is it, actually?"
"Um," the Ship's Engineer said. "Well... basically... to skip over the technical details..."
The Ship's Engineer drew a breath.
"Any ship with a medium-sized Alderson drive can make a star go supernova."
Silence.
"Which might seem like bad news in general," the Lord Pilot said, "but from our perspective, right here, right now, it's just what we need. A mere nova wouldn't do it. But blowing up the whole star - " He gave that bitter bark of laughter, again. "No star, no starlines. We can make the main star of this system go supernova - not the white dwarf, the companion. And then the Superhappies won't be able to get to us. That is, they won't be able to get to the human starline network. We will be dead. If you care about tiny irrelevant details like that." The Lord Pilot looked around the Conference Table. "Do you care? The correct answer is no, by the way."
"I care," the Lady Sensory said softly. "I care a whole lot. But..." She folded her hands atop the table and bowed her head.
There were nods from around the Table.
The Lord Pilot looked at the Ship's Engineer. "How long will it take for you to modify the ship's Alderson Drive -"
"It's done," said the Ship's Engineer. "But... we should, um, wait until the Superhappies are gone, so they don't detect us doing it."
The Lord Pilot nodded. "Sounds like a plan. Well, that's a relief. And here I thought the whole human race was doomed, instead of just us." He looked inquiringly at Akon. "My lord?"
Akon rested his head in his hands, suddenly feeling more weary than he had ever felt in his life. From across the table, the Confessor watched him - or so it seemed; the hood was turned in his direction, at any rate.
I told you so, the Confessor did not say.
"There is a certain problem with your plan," Akon said.
"Such as?" the Lord Pilot said.
"You've forgotten something," Akon said. "Something terribly important. Something you once swore you would protect."
Puzzled faces looked at him.
"If you say something bloody ridiculous like 'the safety of the ship' -" said the Lord Pilot.
The Lady Sensory gasped. "Oh, no," she murmured. "Oh, no. The Babyeater children."
The Lord Pilot looked like he had been punched in the stomach. The grim smiles that had begun to spread around the table were replaced with horror.
"Yes," Akon said. He looked away from the Conference Table. He didn't want to see the reactions. "The Superhappies wouldn't be able to get to us. And they couldn't get to the Babyeaters either. Neither could we. So the Babyeaters would go on eating their own children indefinitely. And the children would go on dying over days in their parents' stomachs. Indefinitely. Is the human race worth that?"
Akon looked back at the Table, just once. The Xenopsychologist looked sick, tears were running down the Master's face, and the Lord Pilot looked as if he were being slowly torn in half. The Lord Programmer looked abstracted, the Lady Sensory was covering her face with her hands. (And the Confessor's face still lay in shadow, beneath the silver hood.)
Akon closed his eyes. "The Superhappies will transform us into something not human," Akon said. "No, let's be frank. Something less than human. But not all that much less than human. We'll still have art, and stories, and love. I've gone entire hours without being in pain, and on the whole, it wasn't that bad an experience -" The words were sticking in his throat, along with a terrible fear. "Well. Anyway. If remaining whole is that important to us - we have the option. It's just a question of whether we're willing to pay the price. Sacrifice the Babyeater children -"
They're a lot like human children, really.
"- to save humanity."
Someone in the darkness was screaming, a thin choked wail that sounded like nothing Akon had ever heard or wanted to hear. Akon thought it might be the Lord Pilot, or the Master of Fandom, or maybe the Ship's Engineer. He didn't open his eyes to find out.
There was a chime.
"In-c-c-coming c-call from the Super Happy," the Lady Sensory spit out the words like acid, "ship, my lord."
Akon opened his eyes, and felt, somehow, that he was still in darkness.
"Receive," Akon said.
The Lady 3rd Kiritsugu appeared before him. Her eyes widened once, as she took in his appearance, but she said nothing.
That's right, my lady, I don't look super happy.
"Humankind, we must have your answer," she said simply.
The Lord Administrator pinched the bridge of his nose, and rubbed his eyes. Absurd, that one human being should have to answer a question like that. He wanted to foist off the decision on a committee, a majority vote of the ship, a market - something that wouldn't demand that anyone accept full responsibility. But a ship run that way didn't work well under ordinary circumstances, and there was no reason to think that things would change under extraordinary circumstances. He was an Administrator; he had to accept all the advice, integrate it, and decide. Experiment had shown that no organizational structure of non-Administrators could match what he was trained to do, and motivated to do; anything that worked was simply absorbed into the Administrative weighting of advice.
Sole decision. Sole responsibility if he got it wrong. Absolute power and absolute accountability, and never forget the second half, my lord, or you'll be fired the moment you get home. Screw up indefensibly, my lord, and all your hundred and twenty years of accumulated salary in escrow, producing that lovely steady income, will vanish before you draw another breath.
Oh - and this time the whole human species will pay for it, too.
"I can't speak for all humankind," said the Lord Administrator. "I can decide, but others may decide differently. Do you understand?"
The Lady 3rd made a light gesture, as if it were of no consequence. "Are you an exceptional case of a human decision-maker?"
Akon tilted his head. "Not... particularly..."
"Then your decision is strongly indicative of what other human decision-makers will decide," she said. "I find it hard to imagine that the options exactly balance in your decision mechanism, whatever your inability to admit your own preferences."
Akon slowly nodded. "Then..."
He drew a breath.
Surely, any species that reached the stars would understand the Prisoner's Dilemma. If you couldn't cooperate, you'd just destroy your own stars. A very easy thing to do, as it had turned out. By that standard, humanity might be something of an impostor next to the Babyeaters and the Superhappies. Humanity had kept it a secret from itself. The other two races - just managed not to do the stupid thing. You wouldn't meet anyone out among the stars, otherwise.
The Superhappies had done their very best to press C. Cooperated as fairly as they could.
Humanity could only do the same.
"For myself, I am inclined to accept your offer."
He didn't look around to see how anyone had reacted to that.
"There may be other things," Akon added, "that humanity would like to ask of your kind, when our representatives meet. Your technology is advanced beyond ours."
The Lady 3rd smiled. "We will, of course, be quite positively inclined toward any such requests. As I believe our first message to you said - 'we love you and we want you to be super happy'. Your joy will be shared by us, and we will be pleasured together."
Akon couldn't bring himself to smile. "Is that all?"
"This Babyeater ship," said the Lady 3rd, "the one that did not fire on you, even though they saw you first. Are you therefore allied with them?"
"What?" Akon said without thinking. "No -"
"My lord!" shouted the Ship's Confessor -
Too late.
"My lord," the Lady Sensory said, her voice breaking, "the Superhappy ship has fired on the Babyeater vessel and destroyed it."
Akon stared at the Lady 3rd in horror.
"I'm sorry," the Lady 3rd Kiritsugu said. "But our negotiations with them failed, as predicted. Our own ship owed them nothing and promised them nothing. This will make it considerably easier to sweep through their starline network when we return. Their children would be the ones to suffer from any delay. You understand, my lord?"
"Yes," Akon said, his voice trembling. "I understand, my lady kiritsugu." He wanted to protest, to scream out. But the war was only beginning, and this - would admittedly save -
"Will you warn them?" the Lady 3rd asked.
"No," Akon said. It was the truth.
"Transforming the Babyeaters will take precedence over transforming your own species. We estimate the Babyeater operation may take several weeks of your time to conclude. We hope you do not mind waiting. That is all," the Lady 3rd said.
And the holo faded.
"The Superhappy ship is moving out," the Lady Sensory said. She was crying, silently, as she steadily performed her duty of reporting. "They're heading back toward their starline origin."
"All right," Akon said. "Take us home. We need to report on the negotiations -"
There was an inarticulate scream, as if the throat behind it were trying to burst the walls of the Conference chamber, as the Lord Pilot burst out of his chair, burst all restraints he had placed on himself, and lunged forward.
But standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, the Confessor's arm swept out...
- ... and anesthetized the Lord Pilot.
- ... [This option will become the True Ending only if someone suggests it in the comments before the previous ending is posted tomorrow. Otherwise, the first ending is the True one.]
But standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, the Confessor's arm swept out...
---
... and anaesthetised everyone in the room. He then went downstairs to the engine room, and caused the sun to go supernova, blocking access to earth.
Regardless of his own preferences, he takes the option for humanity to 'painlessly' defect in the interstellar Prisoner's Dilemma, knowing a priori that the Superhappies chose to cooperate.
Posted by: Anonymous Coward | February 03, 2009 at 05:03 AM
Hmm. The three networks are otherwise disconnected from each other? And the Babyeaters are the first target?
Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb.
(Otherwise, yes, I would set off the bomb immediately.)
Posted by: Russell Wallace | February 03, 2009 at 05:10 AM
"[This option will become the True Ending only if someone suggests it in the comments before the previous ending is posted tomorrow. Otherwise, the first ending is the True one.]"
I'm not sure I understand what you mean. If no one chooses (2) does that mean that the (True) story ends with the Confessor stunning the Lord Pilot? ...or does it continue after he's stunned? ...or have I gotten it all wrong?
Are the storylines like these:
1. - - - - (END)
2. - - - - - - - - (END)
or
1. - - - - - - - - (END)
2. - - - - - - - - (END)
Posted by: Carl Jakobsson | February 03, 2009 at 05:12 AM
@Anonymous Coward: Reasonable, except even by defecting you haven't gained the substantially greater payoff that is the whole point of the Prisoner's Dilemma. In other words, like he asks: what about the Babyeater children? I wouldn't know just how to quantify the two options - I believe that's the whole point of this series :) - but I wouldn't call it much better than what the Superhappy aliens offered, at least with the more "inclusive" altruistic concern that the humans in this illustration are supposed to have.
Posted by: Nikhil Punnoose | February 03, 2009 at 05:18 AM
Carl - I'm pretty sure either way we get three more chapters.
Posted by: Mike Blume | February 03, 2009 at 05:18 AM
Given that the number of parts in the story has been explicitly stated all along, I doubt it'd change in length.
No, you've got to suggest someone else to stun, I'm pretty sure.
---
One thing I'm wondering about the superhappys. They're so eager to cooperate, even to the point of changing their own utility function; what would happen if they kept running into one alien race after another, all of which would alter it in the same direction?
I can't figure out a better solution than what they've proposed. I wouldn't particularly want to eat nonsentient babies - it seems so *pointless*, by all three pre-existing utility functions - but so is art, by the happyhappyhappys' function.
Eliezer, if your point is to emotionally drive the point that utility functions are basically arbitrary, you've succeeded.
Posted by: Svein Ove | February 03, 2009 at 05:21 AM
2) ...and anesthetized himself.
Posted by: rw | February 03, 2009 at 05:27 AM
Umm... 'Superstimulus'.
I think Eliezer has written passionately and pointedly about rationality, will to become stronger and need for FAI. Writing this story makes a separate point about those ideas.
After reading this story I feel myself agreeing with Eliezer more on his views, and that seems to be a sign of manipulation rather than of rationality.
Philosophy expressed in the form of fiction seems to have a very strong effect on people - even if the fiction isn't very good (ref. Ayn Rand). I find this story well written and engaging. I'm having other people who lack the background of reading Eliezer's writings read and comment on the story, to get a better idea of whether it actually makes its point instead of merely creating stronger attachment to ideas presented earlier.
Few comments in no particular order (randomized):
Format of the story being released in small bite sized installments creates an artificial scarcity.
The story compactly addresses matters that readers have spent time studying here, which is very rewarding.
Engaging people in the creation of the story creates attachment to it.
Characters use very familiar phrases that help formation of in-group feeling.
No matter which of the three alien species one happens to cheer for in the story that is still cheering for someone.
Posted by: Anonymous | February 03, 2009 at 05:27 AM
Svein: No, you've got to suggest someone else to stun, I'm pretty sure.
I doubt Eliezer's grand challenge to us would be to contribute less than four bits to his story.
Posted by: Mike Blume | February 03, 2009 at 05:31 AM
So... (even taking MST3K into account)
Akon certainly has gone mad. He believes that he is in a unique position of power (even though his decision markets and his Command staff are divided) and that he has to make the decision NOW, with great unlikely secrets revealed essentially just to him. There are too many unlikely events for Akon to believe in. I think he has failed his exercise, or whatever it is he is living in.
Posted by: Kellopyy | February 03, 2009 at 05:47 AM
Anonymous Coward's defection isn't. A real defection would be the Confessor anesthetizing Akon, then commandeering the ship to chase the Super Happies and nova their star.
Posted by: James Blair | February 03, 2009 at 05:47 AM
"But our negotiations with them failed, as predicted."
If the Lady 3rd speaks the truth, and human behaviour is not more difficult to model than Babyeater behaviour, then the crew faces a classic Newcomblike problem. (Eliezer hints through Akon's thoughts that the Super Happies have indeed built reliable models of at least some crewmembers.)
So if you write an alternative ending, take into account that whatever the Confessor, or anyone else, does, will have been already predicted and taken into account by the Super Happy People.
Posted by: Manuel Mörtelmaier | February 03, 2009 at 05:50 AM
Why should we care for some crystalline beasts? We don't desire to modify lions to eat vegetables, and their prey is much more like us. Destroy the star immediately, or better do it at the moment when it can do the greatest damage to the damned self-righteous superhappies (revenge is, after all, also a sort of human value).
Posted by: prase | February 03, 2009 at 05:50 AM
You'll get the same next three installments regardless of whether someone comes up with the Alternative Solution before Ending 1 is posted. But only if someone suggests the Alternative Solution will Ending 2 become the True Ending - the one that, as 'twere, actually happened in that ficton.
This is based on the visual novel format where a given storyline often has two endings, the True Ending and the Good Ending, or the Normal Ending and the True Ending (depending on which of the two is sadder).
To make the second ending the True Ending, someone has to suggest the alternative thing for the Impossible to do in this situation - it's not enough to guess who the Confessor goes after.
Well, I'm glad the story wasn't ruined by the alternative being too obvious. If no one's thought of it yet in the comments, then it's at least plausible that the people on the ship didn't think of it earlier.
Anonymous - yes, I keep wondering myself about the ethics of writing illustrative fiction. So far I'm coming out on the net positive side, especially after Robin's post on Near versus Far thinking. But it does seem to put more of a strain on how much you trust the author - both their honesty and their intelligence.
PS: Anna and Steve, Shulman, Vassar, and Marcello, please don't post the solution if you get it - I want to leave the field at least a little open here...
Posted by: Eliezer Yudkowsky | February 03, 2009 at 06:06 AM
I thought these "events" might be a test for the humans, a mass hallucination. It is strange that three civilisations should encounter each other at the same time like this.
It is difficult to alter one human characteristic without changing the whole person: difficult to change from male to female. Far more difficult to improve a civilization by changing one characteristic of the humans, such as taking away the ability to feel pain - or to take away the whole basis of moral action and cooperation by preventing babyeaters from eating babies. Would the superhappies really desire to make everyone else just like them? Possibly; I think that is a morally poorer choice, but making that choice is very common among humans.
But I cannot think of a *better* ending.
Posted by: Abigail | February 03, 2009 at 06:19 AM
I wonder if the Superhappys could be persuaded that, instead of modifying us to not feel pain at all, we could be modified to have the ability to feel pain switched off by default, but with the potential to be activated if we so chose. That would avoid their concerns about non-consensually inflicting pain on children who hadn't come to the philosophical realisation that it was worth it, but would still allow us to remain fully human, if that was what we actually desired as individuals given the choice.
Posted by: Sebastian Conolly | February 03, 2009 at 06:28 AM
... and stuns Akon (or everyone). He then opens a channel to the Superhappies, and threatens to detonate the star - thus preventing the Superhappies from "fixing" the Babyeaters, their highest priority. He uses this to blackmail them into fixing the Babyeaters while leaving humanity untouched.
Posted by: Allan Crossman | February 03, 2009 at 06:32 AM
I don't really have a good enough grasp on the world to predict what is possible; it all seems too unreal.
One possibility is to jump one star away back towards earth and then blow up that star, if that is the only link to the new star.
Posted by: Will Pearson | February 03, 2009 at 06:50 AM
...and stuns Akon, for failing to be rational and jumping to a decision with insufficient information. Doesn't it seem a little TOO convenient that the first alien race is less powerful, while the second one is massively more powerful? And now that the one is gone, and the other is dust, humans seem to have accepted being modified in ways that would make the babyeaters happy... without even bringing up any other scenarios. That's contrary to the stated mission of the Confessor.
Posted by: nyu2 | February 03, 2009 at 06:50 AM
The Confessor finds Akon's acceptance of part of the terms of capitulation flawed, and stuns him, effectively relieving him of command. The rest of the crew deliberate over their options.
Something about Akon's unwillingness to warn the Babyeaters of the Superhappies' plans set my "Plot Device" warning lights off. Might the rest of the story involve following the Babyeaters' starline to attempt to warn/renegotiate with them, and, upon probably failing, detonating that sun to protect the Babyeaters (who didn't choose to capitulate), consigning humanity to a marathon carnal surplus-infant-eating future?
Not that I don't struggle to come up with a rational case for this course of action, or even to rationalise it. It's just that humanity advocating the universal eating of babies is the sort of perverse outcome I'd expect from following alien first principles to their logical conclusions.
Is there also a Scooby Doo ending, like in Wayne's World?
Posted by: His Own Devices | February 03, 2009 at 07:14 AM
@Anonymous Coward: Reasonable, except that even by defecting you haven't gained the substantially greater payoff that is the whole point of the Prisoner's Dilemma. In other words, like he asks: what about the Babyeater children?
---
I misread the story and thought the superhappys had flown off to deal with them first. But in fact, the superhappys are 'returning to their home planet' before going to deal with the babyeaters. "This will make it considerably easier to sweep through their starline network when we return.". Oops.
In any event, if the ship's crew is immediately anaesthetised and the sun exploded, then earth remains ignorant of the suffering of the babyeaters, and earth is not coerced to have its value system changed by an external superior power. The only human that feels bad about all this is the one remaining conscious human on the ship before it is fried. The babyeaters experience no net change in their position and the superhappys have made a net loss (by discovering unhappiness in the universe and being made unable to fix it). Humanity has met a more powerful force with a very different value system that wishes to impose values on other cultures, but has achieved a draw. Humanity remains ignorant of suffering - again a draw when the only other options are to lose in some way (either by imposing values when we feel we have no right; or by knowingly allowing suffering).
Of course the Confessor might wish to first transmit a message back to earth that neglects to mention any babyeaters and warns of the highly dangerous 'superhappys', perhaps describing them falsely as super-powerful babyeaters (a la the Alderson scientists) to prevent anyone from being tempted to find them, thereby preventing any individual from sacrificing the human race's control of its own values...
I guess it depends on whether he believes 'right to choose your own species values' ranks above 'right to experience endless orgasms'. If he truly has no preference for either, he might as well consider everyone dangerously highly strung and emotional and an unsuitable sample size to make decisions for humanity. In that case, perhaps he should stun everyone in the control room and cause the ship to return to earth, if he is able to do so, to tell humanity what has happened in full detail. This at least allows the decision to be made by a larger fraction of humanity.
A final practical point. So far, the people on the ship only know what they have received in communications or what they can measure with their sensors. In fact, we can't trust either of these things; a sufficiently advanced species can fool sensors and any species can lie. We can observe the superhappys are clearly more technologically advanced from the evidence of the one ship present, and the growth rate suggests they can rapidly overpower humanity. Humanity has no idea what the superhappys will really do when they return. In fact, if they wish, they might simply turn all humans into superhappys and throw away all human values, without honouring the deal. They could torture all humans till the end of time if they wish or turn us into babyeaters. Equally, we know there is a race that is pleased to advertise they eat babies and wishes to encourage other races to do the same; and we know that they have one quite advanced ship that is slightly technologically inferior to us; what else they have, we don't really know. Perhaps the babyeaters have better crews and ships back home. Perhaps the babyeaters have advanced technology that masks the real capabilities of their ship. All we have is a single unreliable sample point of two advanced civilisations with very different value systems. What we have here is a giant knowledge gap.
The only thing we know for certain is that the superhappys are almost certainly technologically superior to humanity and can basically do whatever they want to us; unless the sun is blown up. And we know that the babyeaters have culturally unacceptable values to us; and we don't know if they might really have the ability to impose those values on us or not. Given this knowledge of these two dangerous forces, one of which is vastly superior, and one of which is advanced and might later turn out to be superior, if humanity can achieve a 'zero loss outcome' for itself by blowing up the sun, it is doing rather well in such an incredibly dangerous situation. Humanity should take advantage of the fact the superhappys already placed a 'co-operate' card on the table and allowed us decide what to do next.
Posted by: Anonymous Coward | February 03, 2009 at 07:28 AM
> Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb.
You guys are very trusting of super-advanced species who already showed a strong willingness to manipulate humanity with superstimulus and pornographic advertising.
Posted by: Anonymous Coward | February 03, 2009 at 07:30 AM
Assuming the Lord Pilot was correct in saying that, without the nova star, the Happy Fun People would never be able to reach the human starline network
...and assuming it's literally impossible to travel FTL without a starline
...and assuming the only starline to the nova star was the one they took
...and assuming Huygens, described as a "colony world", is sparsely populated, and either can be evacuated or is considered "expendable" compared to the alternatives
...then blow up Huygens' star. Without the Huygens-Nova starline, the Happy People won't be able to cross into human space, but the Happy-Nova-Babyeater starline will be unaffected. The Happy People can take care of the Babyeaters, and humankind will be safe. For a while.
Still not sure I'd actually take that solution. It depends on how populated Huygens is and how confident I am the Super Happy People can't come up with alternate transportation, and I'm also not *entirely* opposed to the Happy People's proposal. But:
If I had a comm link to the Happy People, I'd also want to hear their answer to the following line of reasoning: one ordinary nova in a single galaxy just attracted three separate civilizations. That means intelligent life is likely to be pretty common across the universe, and our three somewhat-united species are likely to encounter far more of it in the years to come. If the Happy People keep adjusting their (and our) utility functions each time we meet a new intelligent species, then by the millionth species there's not going to be a whole lot remaining of the original Super Happy way of thinking - or the human way of thinking, for that matter. If they're so smart, what's their plan for when that happens?
If they answer "We're fully prepared to compromise our and your utility functions limitlessly many times for the sake of achieving harmonious moralities among all forms of life in the Universe, and we predict each time will involve a change approximately as drastic as making you eat babies," then it will be a bad day to be a colonist on Huygens.
Posted by: Yvain | February 03, 2009 at 07:35 AM
If they're going to play the game of Chicken, then symbolically speaking the Confessor should perhaps stun himself to help commit the ship to sufficient insanity to go through with destroying the solar system.
Posted by: steven | February 03, 2009 at 07:54 AM
Attempting to paraphrase the known facts.
1. You and your family and friends go for a walk. You walk into an old building with 1 entrance/exit. Your friends/family are behind you.
2. You notice the door has an irrevocable self-locking mechanism if it is closed.
3. You have a knife in your pocket.
4. As you walk in you see three people dressed in 'lunatic's asylum' clothes.
5. Two of them are in the corner; one is a guy who is beating up a woman. He appears unarmed but may have a concealed weapon.
6. The guy shouts to you that 'god is making him do it' and suggests that you should join in and attack your family who are still outside the door.
7. The 3rd person in the room has a machine gun pointed at you. He tells you that he is going to give you and your family 1000000 pounds each if you just step inside, and he says he is also going to stop the other inmate from being violent.
8. You can choose to close the door (which will lock). What will happen next inside the room will then be unknown to you.
9. Or you can allow your family and friends into the room with the lunatics at least one of whom is armed with a machine gun.
10. Inside the room, as long as that machine gun exists, you have no control over what actually happens next in the room.
11. Outside the room, once the door is locked, you also have no control over what happens next in the room.
12. But if you invite your family inside, you are risking that they may be killed by the machine gun, or they may be given 1 million pounds. Either way, the matter is in the hands of the machine-gun-toting lunatic.
13. Your family are otherwise presently happy and well adjusted and do not appear to NEED 1 million pounds, though some might benefit from it a great deal.
Personally in this situation I wouldn't need to think twice; I would immediately close the door. I have no control over the unfortunate situation the woman is facing either way, but at least I don't risk a huge negative outcome (the death of myself and my family at the hands of a machine gun armed lunatic).
It is foolish to risk what you have and need for what you do not have, do not entirely know, and do not need.
Posted by: Anonymous Coward | February 03, 2009 at 07:57 AM
Anonymous Coward's defection isn't. A real defection would be the Confessor anesthetizing Akon, then commandeering the ship to chase the Super Happies and nova their star.
--
Your defection isn't. There are no longer any guarantees of anything whenever a vastly superior technology is definitely in the vicinity. There are no guarantees while any staff member of the ship is still conscious besides the Confessor and it is a known fact (from the prediction markets and people in the room) that at least some of humanity is behaving very irrationally.
Your proposal takes unnecessary ultimate risk (the potential freezing, capture or destruction of the human ship upon arrival, leading to the destruction of humanity - since we don't know what the superhappys will REALLY do, after all) in exchange for unnecessary minimal gain (so we can attempt to reduce the suffering of a species whose technological extent we don't truly know and whose value system we know to be in at least one place substantially opposed to our own, and whom we can remain ignorant of, as a species, by anaesthetised self-destruction of the human ship).
It is more rational to take action as soon as possible to guarantee a minimum acceptable level of safety for humankind and its value system, given the unknown but clearly vastly superior technological capabilities of the superhappys if no action is immediately taken.
If you let an AI out of the box and it tells you its value system is opposed to humanity's and that it intends to convert all humanity to a form that it prefers, then it FOOLISHLY trusts you and steps back inside the box for a minute, then what you do NOT do is:
- mess around
- give it any chance to come back out of the box
- allow anyone else the chance to let it out of the box (or the chance to disable you while you're trying to lock the box).
Anonymous
Posted by: Anonymous Coward | February 03, 2009 at 08:09 AM
Probably I have watched too much Star Trek but it is hard to shake the suspicion that both the Superhappies and the Babyeaters are sockpuppets for some kind of weakly godlike entity messing around with us for a laff..
Posted by: Urizen | February 03, 2009 at 08:16 AM
"You can't always get what you want" is one of the few samples of true rationality in this mess, I don't see why Akon seemed to completely forget that notion.
The happies, humans and babyeaters cannot reach a quick conclusion that everyone will be satisfied with. In short, they should all just suck it up. All species have their little hiccups from the point of view of the others.
Becoming a pain-feeling superhappy babyeater - or any other combination imaginable - should be a choice made available to members of all races. This might lead somewhere, or it might not. The happies should feel ashamed of their rash reaction (although the babyeaters will forgive them for this "reasonable mistake"), and after that all three races should continue their personal sufferings until they, inevitably, find a way to come to terms with it, especially given the way they would be exposed to each other culturally in the meanwhile. It's fair, rational, and offers a long-term way out.
The happy solution is essentially to fight wars until something gives.
Trying to force a solution is hardly rational, which should actually be obvious to all sides, especially the happies. The babyeater society and culture will suffer terrible devastation through a war which they will quickly lose, and I fail to see how putting the rest of the babyeater children through war (in which many will die, painfully) can be called a "definite improvement". (Many babyeaters would probably just quietly go on eating babies anyway.) It should be completely obvious to the happies that at least a significant part of humanity will not turn itself into babyeaters willingly, which will result in another war. The same may go for the happies themselves, so they also risk a civil war - and for what? To have three cultures that eat babies for purely symbolic reasons. Makes no sense.
Cultural standards that are forced upon people will be rejected both among the humans and the babyeaters. There will inevitably be continuous rebellions, and the happies would keep enforcing their ideas, which will result in a cycle of wars until the happies leave for one reason or the other, and then there will be civil wars among the humans and babyeaters trying to decide for themselves whether to return to the "old ways" or keep the new alien ways (which would have significant support by then).
And that's just the start of a cycle of wars between three breeds, now on a path of revenge. At this point, the only sensible solution is to go supernova, breaking the connection. There's a better chance of the babyeaters finding a path out of their most significant cultural issue, the babyeating, than there is of the three breeds learning to live together once one starts enforcing itself upon the others on the proposed scale. The amount of fighting would eclipse the suffering of the babyeater children, making the whole thing pointless.
Blowing up the star right now might only be a temporary solution. The happies might find other connections, if they put their minds to it. They obviously travel and develop fast, so it might not even take that long, and they have a lot of data from the humans and babyeaters to work with. With luck the babyeaters might get over their little cultural hiccup before that, but that's not exactly a sound ethical foundation. (It beats the hell out of starting wars, though.)
This war must be stopped before it starts, or at least an attempt must be made (as the humans can't just force the happies to do anything).
They could attack the happies as a show of "We are willing to die for their right to their values, as much as I loathe them". It could also remind them that killing is not so much fun once you need to do it to people who are not doing it for "selfish" reasons, and not to people who are just "wrong". And just as a reminder of what a mess of wars they're about to create. They could kill themselves. And yes, they could go and blow up the superhappy star as a last resort, hoping that the happies couldn't recover.
It's terribly pompous to think that just because all cultures are happy with the way they are now, they are somehow superior to the cultures they had previously, let alone the ones someone else has. We think ourselves superior because our standard of living has improved and we like things the way they are better than the way they were. The only way to compare is to try. We cannot try previous cultures, but the happies should, for the sake of argument, at least try living more like the babyeaters. However, if they change EVERYBODY, there is once again no comparison.
If the happies argue that the babyeaters will learn to be satisfied not being babyeating, then by the same reasoning the happies should be able to learn to be satisfied eating babies, or at least to live with the idea of the babyeaters being babyeating.
So to return to the point: options for trying different aspects of the three cultures should be made available, and it could probably be agreed that members of each species must be found who are willing to go for this experiment. (It shouldn't be too hard, really; all volunteers would be doing it so that hopefully others wouldn't have to.) The happy technology should even make it possible to complete this experiment to satisfactory levels surprisingly quickly.
There's a chance this experiment wouldn't produce satisfactory results, but it should be tried before warfare.
Yeah, I know, terribly boring for this topic, but whatever.
Posted by: Risto | February 03, 2009 at 08:26 AM
Insofar as definitions can be right or wrong, so also counterfactual consequences can be right or wrong, and thus fictional evidence can be right or wrong...
So the rightnesses of the two bodies of fictional evidence in the two endings both depend on the audience's skill at applied metaethics? And you want to increase the expected rightness of the true ending by correlating the true ending with the audience's unknown skill? Or by giving the audience an incentive to increase their skill?
(I don't know the solution. This comment reasoning about your motives is to narrow the search space. Plus it proposes a meaning for your otherwise unexplained term "True".)
Posted by: Steve Rayhawk | February 03, 2009 at 08:39 AM
Go back to earth and detonate.
Obviously these two species are superior, having not destroyed themselves when they had the means.
They should remain uncorrupted.
Posted by: Maglick | February 03, 2009 at 08:46 AM
Since this is fiction (thankfully, seeing how many might allow the superhappys the chance they need to escape the box)... an alternative ending.
The Confessor is bound by oath to allow the young to choose the path of the future no matter how morally distasteful.
The youngest in this encounter are clearly the babyeaters, technologically (and arguably morally).
Consequently the Confessor stuns everyone on board, pilots off to Baby Eater Prime and gives them the choice of how things should proceed from here.
The End
Posted by: Anonymous Coward | February 03, 2009 at 08:52 AM
They should go back to colony system Huygens and detonate.
Posted by: Vladimir Nesov | February 03, 2009 at 09:29 AM
Meanwhile, the Arabs and the Jews, communicating through the exclusive channel of the Great Khalif O. bin Laden negotiating through Internet Sex with Tzipi Livni, arrived at this compromise whereby the Jews would all worship Mohammed on Fridays, at which times they will explode a few of their children in buses, whereas the Arabs would ratiocinate psychotically around their scriptures on Saturdays, and spawn at least one Nobel prize winner in Medicine and Physics every five years.
Posted by: Faré | February 03, 2009 at 09:36 AM
The chance of running into two alien species in one day seems pretty unusual. Perhaps it means something?
Posted by: nine | February 03, 2009 at 09:36 AM
> The chance of running into two alien species in one day seems pretty unusual. Perhaps it means something?
That is precisely what makes me think they are sockpuppets of a single entity (even within the story Universe, not just in the sense that Elizier invented them).
Posted by: Urizen | February 03, 2009 at 09:48 AM
Nova-ing the star isn't IMO a guarantee of no future contact - there may be other starlines that aren't discovered yet. Also, the SuperHappies may improve their tech over time, and may find ways of no longer needing starline tech.
Also, if there are three civilizations, odds are there are a lot more. The SuperHappies have a better structure to compete and grow with whatever other galactic superpowers exist out there.
In essence, the "closed locked door" is an illusion in my mind. Not something to base strategy on. It is the kind of thing that primitive 21st century humans would think of, and not the kind of option that an advanced 26th century human should consider viable. Were I the Confessor (and by implication, that is the role we 21st century readers are supposed to play), I would zap the Engineer, because he's building a house made of straw and taunting the big bad wolf.
But in the context of the story as it stands, this option is pointless, since the commander has already made his decision. Zapping the pilot is equally pointless, unless no one else is able to move the ship. That may be a defect of the story, or it may be deliberate.
Posted by: jb | February 03, 2009 at 10:00 AM
Option 1 is to cooperate, so I guess option 2 is defect. The correct way to defect is to destroy Huygens.
Posted by: Peter de Blanc | February 03, 2009 at 10:02 AM
Of course, meeting two new species on the same day is the crew of the Impossible having its leg pulled by some superior entity, namely Eliezer. But Eliezer is not above and outside *our* world, and we don't have to let ourselves be intimidated by his scripture.
Why and how would communication possibly happen through only one channel? Since when is the unit of decision-making a race, species, nation, etc., rather than an individual? Is this market-driven spaceship under totalitarian control, where no one is allowed to communicate and the whole crew is too brain-damaged to work around the interdiction? I wonder how the Soviet Union made it to the Interstellar Age. Where has your alleged individualism gone?
Why and how is compromise even possible between two species, much less desirable? In the encounter of several species, the most efficient one will soon hoard all resources and leave the least efficient ones as nothing but defanged zoo animals, at which point their opinions and decisions matter little. No compromise. The only question is who's on top. Dear Tigers, will you reform yourselves? Can we negotiate? Let your Great Leader meet ours and discuss, man to animal, over a meal.
And of course, in your fantasy, the rationalist from way back when (EY) effectively wields the ultimate power on the ship, yet is not corrupted by power. What a wonderful saint! Makes you wonder what kind of wimps the rest of mankind has degenerated into, to submit to THAT wimpy overlord. Where has your understanding of Evolutionary Forces gone?
Wanna see incredibly intelligent people wasting time on absurd meaningless questions? Come here to Overcoming Bias! A stupid person will believe in any old junk, but it takes someone very intelligent to specifically believe in such elaborate nonsense.
Posted by: Faré | February 03, 2009 at 10:08 AM
The nova acted as a rendezvous signal, causing all starlines connected to that star to flare up. Otherwise it's too hard to find aliens - opening starlines is expensive. It's the chance of a direct encounter (small) versus chance of at least one mutual neighbor (larger).
Posted by: Eliezer Yudkowsky | February 03, 2009 at 10:12 AM
And while I'm at it -- confusing pleasure and happiness is particularly dumb. Entities that would do that would be wiped from existence in a handful of generations, and not super-powerful. Habituation is how we keep functioning at the margin, where the effort is needed. The whole idea of a moral duty to minimize other people's pain is ridiculous, yet taken for granted in this whole story. Eliezer, you obviously are still under the influence of the judeo-christian superstitions you learned as a child.
If you're looking for an abstract value to maximize, well, it's time to shut up and eat your food. http://sifter.org/~simon/journal/20090103.h.html
Posted by: Faré | February 03, 2009 at 10:17 AM
From the fact that the physicists covered up knowledge that they thought was too dangerous for humanity to possess, the crew should immediately deduce that this could have happened several times in the past regarding several topics. The most obvious topic is AGI, so they should search their Archive for records of AGI projects that seemed promising but were mysteriously discontinued.
Posted by: Daniel Burfoot | February 03, 2009 at 10:17 AM
> The nova acted as a rendezvous signal, causing all starlines connected to that star to flare up. Otherwise it's too hard to find aliens - opening starlines is expensive. It's the chance of a direct encounter (small) versus chance of at least one mutual neighbor (larger).
Even so, for reasons of which you are very well aware, meeting two sets of aliens should be a _lot_ less likely than meeting one set, so we ought to take that into account when we are trying to make sense of what is going on. But I accept that positing a minor god is rather a primitive reaction, especially as we already know that in your impossible possible world no Singularity is reachable by any means currently envisaged.
Posted by: Urizen | February 03, 2009 at 10:22 AM
"Carl - I'm pretty sure either way we get three more chapters."
Yes, but I was more worried that we'd only get three more chapters...;-)
---
Anyway, another reason the Confessor should interfere in this process is that they are awful at bargaining. If they follow through with the deal they will be (initially) seriously depressed about having to kill their own children, there's the risk of war or oppression of those who do not want to be augmented, and what do they get in return from the Superhappies?
"We are willing to change to desire pleasure obtained in more complex ways, so long as the total amount of our pleasure does not significantly decrease. We will learn to create art you find pleasing. We will acquire a sense of humor, though we will not lie. From the perspective of humankind and the Babyeaters, our civilization will obtain much utility in your sight, which it did not previously possess. This is the compensation we offer you."
I don't see the value in this; if one wants more entertaining art and jokes, why not simply accept the augmentation and come up with them yourselves?
Posted by: Carl Jakobsson | February 03, 2009 at 10:24 AM
Given that the first installment mentions that Akon's words would be "inscribed for all time in the annals of history", any internally consistent conclusion would have to feature some subsequent contact with humanity.
Pardon my lapse in fourth-wall etiquette.
Posted by: His Own Devices | February 03, 2009 at 10:32 AM
What about following the SuperHappies to their first hop, then making THAT star go supernova? That way, they're cut off, but the humans still have a small chance to 'save' the babyeaters. Or vice-versa.
Posted by: nyu2 | February 03, 2009 at 10:45 AM
Peter, destroying Huygens isn't obviously the best way to defect, as in that scenario the Superhappies won't create art and humor or give us their tech.
Posted by: steven | February 03, 2009 at 10:49 AM
Pain and pleasure are *signals* that we are on the wrong or right path. There's a point in making it a better signal. But the following propositions are wholly absurd:
* to eliminate pain itself (i.e. no more signal)
* to bias the system to have either more or less pain in the average (i.e. bias the signal so it carries less than 1 bit of information per bit of code).
* to forcefully arrange for others to never possibly have pain in their own name (i.e. disconnecting them from reality, denying their moral agency -- and/or obey their every whims until reality strikes back despite your shielding).
* to feel responsible for other people's pain (i.e. deny the fact that they are their own moral agents).
As for promising a world of equal happiness for all, shameless self-quote:
"Life is the worst of all social inequalities. To suppress inequalities, one must either resurrect all the dead people (and give life to all the potential living people), or exterminate all the actually living. Egalitarians, since they cannot further their goal by the former method, inevitably come to further it by the latter method."
A rational individual has no reason to care for the suffering of alien entities, or even other human entities, except inasmuch as it affects his own survival, enjoyment, control of resources.
Posted by: Faré | February 03, 2009 at 10:52 AM
Reasonable options depend on how much the Superhappies really know. If they really know enough to make this a Newcomb-like problem, any defection against them is going to make them blow up the Impossible Possible World before it can jump.
That situation might leave one possible option: rally the earth fleet now, invade the Babyeater starline network before the Superhappies do, and for each star have all but one vessel follow open starlines, with the straggler detonating the local star.
This relies on humanity having more readily available vessels with >= medium-sized Alderson drives than the Babyeaters have settled systems.
Afterwards, cooperate with the Superhappies; compromising with one alien species will dilute human values less than compromising with two. The Superhappies might judge this as conflicting with their goals in any case; I don't really understand Superhappy morality.
If this situation is in fact not Newcomb-like, then aside from detonating Huygens there is the option of rallying the human fleet, jumping to an uninhabited system, and having all but one vessel jump one system further - opening new starlines in both cases if necessary - with the straggler detonating the first star.
The humans on the ships in question will then stay human, and can rebuild a human civilization somewhere suitable.
Have the rest of humanity compromise with the Babyeaters and Superhappies. This leaves a civilization optimizing for the pure Good Thing, as opposed to the AVG(Good,Babyeating,Superhappy) thing, and doesn't outright kill any existing humans, which may not be possible in the case of blowing up Huygens.
Posted by: 51a1fc26f78b0296a69f53c615ab5a2f64ab1d1e | February 03, 2009 at 10:56 AM
Steven, the Superhappies will still create art as part of a compromise with the Babyeaters. But yes, we would miss out on their technology.
Posted by: Peter de Blanc | February 03, 2009 at 10:56 AM
"Life is the worst of all social inequalities. To suppress inequalities, one must either resurrect all the dead people (and give life to all the potential living people), or exterminate all the actually living. Egalitarians, since they cannot further their goal by the former method, inevitably come to further it by the latter method."
This seems more like a neat epigram than a thesis with its basis in fact. Have you any evidence for its truth? I am by no means an egalitarian, but N.B. that in fact the communists did not kill everyone in Russia. Similarly for China. And it is not as if anti-egalitarians have not killed quite a lot of people too.
Posted by: Urizen | February 03, 2009 at 11:02 AM