(Part 5 of 8 in "Three Worlds Collide")
Akon strode into the main Conference Room; and though he walked like a physically exhausted man, at least his face was determined. Behind him, the shadowy Confessor followed.
The Command Conference looked up at him, and exchanged glances.
"You look better," the Ship's Master of Fandom ventured.
Akon put a hand on the back of his seat, and paused. Someone was absent. "The Ship's Engineer?"
The Lord Programmer frowned. "He said he had an experiment to run, my lord. He refused to clarify further, but I suppose it must have something to do with the Babyeaters' data -"
"You're joking," Akon said. "Our Ship's Engineer is off Nobel-hunting? Now? With the fate of the human species at stake?"
The Lord Programmer shrugged. "He seemed to think it was important, my lord."
Akon sighed. He pulled his chair back and half-slid, half-fell into it. "I don't suppose that the ship's markets have settled down?"
The Lord Pilot grinned sardonically. "Read for yourself."
Akon twitched, calling up a screen. "Ah, I see. The ship's Interpreter of the Market's Will reports, and I quote, 'Every single one of the underlying assets in my market is going up and down like a fucking yo-yo while the ship's hedgers try to adjust to a Black Swan that's going to wipe out ninety-eight percent of their planetside risk capital. Even the spot prices on this ship are going crazy; either we've got bubble traders coming out of the woodwork, or someone seriously believes that sex is overvalued relative to orange juice. One derivatives trader says she's working on a contract that will have a clearly defined value in the event that aliens wipe out the entire human species, but she says it's going to take a few hours and I say she's on crack. Indeed I believe an actual majority of the people still trying to trade in this environment are higher than the heliopause. Bid-ask spreads are so wide you could kick a fucking football stadium through them, nothing is clearing, and I have unisolated conditional dependencies coming out of my ass. I have no fucking clue what the market believes. Someone get me a drink.' Unquote." Akon looked at the Master of Fandom. "Any suggestions get reddited up from the rest of the crew?"
The Master cleared his throat. "My lord, we took the liberty of filtering out everything that was physically impossible, based on pure wishful thinking, or displayed a clear misunderstanding of naturalistic metaethics. I can show you the raw list, if you'd like."
"And what's left?" Akon said. "Oh, never mind, I get it."
"Well, not quite," said the Master. "To summarize the best ideas -" He gestured a small holo into existence.
Ask the Superhappies if their biotechnology is capable of in vivo cognitive alterations of Babyeater children to ensure that they don't grow up wanting to eat their own children. Sterilize the current adults. If Babyeater adults cannot be sterilized and will not surrender, imprison them. If that's too expensive, kill most of them, but leave enough in prison to preserve their culture for the children. Offer the Superhappies an alliance to invade the Babyeaters, in which we provide the capital and labor and they provide the technology.
"Not too bad," Akon said. His voice grew somewhat dry. "But it doesn't seem to address the question of what the Superhappies are supposed to do with us. The analogous treatment -"
"Yes, my lord," the Master said. "That was extensively pointed out in the comments, my lord. And the other problem is that the Superhappies don't really need our labor or our capital." The Master looked in the direction of the Lord Programmer, the Xenopsychologist, and the Lady Sensory.
The Lord Programmer said, "My lord, I believe the Superhappies think much faster than we do. If their cognitive systems are really based on something more like DNA than like neurons, that shouldn't be surprising. In fact, it's surprising that the speedup is as little as -" The Lord Programmer stopped, and swallowed. "My lord. The Superhappies responded to most of our transmissions extremely quickly. There was, however, a finite delay. And that delay was roughly proportional to the length of the response, plus an additive constant. Going by the proportion, my lord, I believe they think between fifteen and thirty times as fast as we do, to the extent such a comparison can be made. If I try to use Moore's Law type reasoning on some of the observable technological parameters in their ship - Alderson flux, power density, that sort of thing - then I get a reasonably convergent estimate that the aliens are two hundred years ahead of us in human-equivalent subjective time. Which means it would be twelve hundred equivalent years since their Scientific Revolution."
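The Lord Programmer's inference - fit the aliens' reply delay as a linear function of reply length, then compare the slope against human composition speed - can be sketched as a toy calculation. Every number below is a hypothetical stand-in, not a figure from the story:

```python
# Toy illustration of the Lord Programmer's method (all numbers hypothetical).
# Model: alien reply delay = slope * reply_length + constant,
# where the constant absorbs light-lag and fixed overhead.
lengths = [100, 200, 400, 800]      # reply lengths (words)
delays  = [14.0, 21.0, 35.0, 63.0]  # observed delays (seconds)

n = len(lengths)
mean_x = sum(lengths) / n
mean_y = sum(delays) / n
# Ordinary least-squares slope and intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(lengths, delays)) \
        / sum((x - mean_x) ** 2 for x in lengths)
constant = mean_y - slope * mean_x

human_rate = 1.5                    # assumed seconds per word for a human
speedup = human_rate / slope
print(f"slope={slope:.3f} s/word, constant={constant:.1f} s, speedup={speedup:.0f}x")
# prints: slope=0.070 s/word, constant=7.0 s, speedup=21x
```

With these invented numbers the inferred speedup lands at about 21x, inside the fifteen-to-thirty range the Lord Programmer quotes; the additive constant is exactly the "finite delay" he mentions.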
"If," the Xenopsychologist said, "their history went as slowly as ours. It probably didn't." The Xenopsychologist took a breath. "My lord, my suspicion is that the aliens are literally able to run their entire ship using only three kiritsugu as sole crew. My lord, this may represent, not only the superior programming ability that translated their communications to us, but also the highly probable case that Superhappies can trade knowledge and skills among themselves by having sex. Every individual of their species might contain the memory of their Einsteins and Newtons and a thousand other areas of expertise, no more conserved than DNA is conserved among humans. My lord, I suspect their version of Galileo was something like thirty objective years ago, as the stars count time, and that they've been in space for maybe twenty years."
The Lady Sensory said, "Their ship has a plane of symmetry, and it's been getting wider along the axis perpendicular to that plane, as it sucks up nova dust and energy. It's growing on a smooth exponential at 2% per hour, which means it can split every thirty-five hours in this environment."
"I have no idea," the Xenopsychologist said, "how fast the Superhappies can reproduce themselves - how many children they have per generation, or how fast their children sexually mature. But all things considered, I don't think we can count on their kids taking twenty years to get through high school."
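The Lady Sensory's figure checks out: at 2% growth per hour, mass doubles - and the ship can split - every ln(2)/ln(1.02) hours. A one-line verification, assuming nothing beyond the stated 2% rate:

```python
import math

# Growth of 2% per hour: mass(t) = mass(0) * 1.02**t.
# The ship can split once its mass doubles, i.e. when 1.02**t == 2.
growth_per_hour = 1.02
doubling_time = math.log(2) / math.log(growth_per_hour)
print(f"doubling time = {doubling_time:.1f} hours")  # prints: doubling time = 35.0 hours
```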
There was silence.
When Akon could speak again, he said, "Are you all quite finished?"
"If they let us live," the Lord Programmer said, "and if we can work out a trade agreement with them under Ricardo's Law of Comparative Advantage, interest rates will -"
"Interest rates can fall into an open sewer and die. Any further transmissions from the Superhappy ship?"
The Lady Sensory shook her head.
"All right," Akon said. "Open a transmission channel to them."
There was a stir around the table. "My lord -" said the Master of Fandom. "My lord, what are you going to say?"
Akon smiled wearily. "I'm going to ask them if they have any options to offer us."
The Lady Sensory looked at the Ship's Confessor. The hood silently nodded: He's still sane.
The Lady Sensory swallowed, and opened a channel. On the holo there first appeared, as a screen:
The Lady 3rd Kiritsugu
temporary co-chair of the Gameplayer
Language Translator version 9
Cultural Translator version 16
The Lady 3rd in this translation was slightly less pale, and looked a bit more concerned and sympathetic. She took in Akon's appearance at a glance, and her eyes widened in alarm. "My lord, you're hurting!"
"Just tired, milady," Akon said. He cleared his throat. "Our ship's decision-making usually relies on markets and our markets are behaving erratically. I'm sorry to inflict that on you as shared pain, and I'll try to get this over with quickly. Anyway -"
Out of the corner of his eye, Akon saw the Ship's Engineer re-enter the room; the Engineer looked as if he had something to say, but froze when he saw the holo.
There was no time for that now.
"Anyway," Akon said, "we've worked out that the key decisions depend heavily on your level of technology. What do you think you can actually do with us or the Babyeaters?"
The Lady 3rd sighed. "I really should get your independent component before giving you ours - you should at least think of it first - but I suppose we're out of luck on that. How about if I just tell you what we're currently planning?"
Akon nodded. "That would be much appreciated, milady." Some of his muscles that had been tense, started to relax. Cultural Translator version 16 was a lot easier on his brain. Distantly, he wondered if some transformed avatar of himself was making skillful love to the Lady 3rd -
"All right," the Lady 3rd said. "We consider that the obvious starting point upon which to build further negotiations, is to combine and compromise the utility functions of the three species until we mutually satisfice, providing compensation for all changes demanded. The Babyeaters must compromise their values to eat their children at a stage where they are not sentient - we might accomplish this most effectively by changing the lifecycle of the children themselves. We can even give the unsentient children an instinct to flee and scream, and generate simple spoken objections, but prevent their brain from developing self-awareness until after the hunt."
Akon straightened. That actually sounded - quite compassionate - sort of -
"Our own two species," the Lady 3rd said, "which desire this change of the Babyeaters, will compensate them by adopting Babyeater values, making our own civilization of greater utility in their sight: we will both change to spawn additional infants, and eat most of them at almost the last stage before they become sentient."
The Conference room was frozen. No one moved. Even their faces didn't change expression.
Akon's mind suddenly flashed back to those writhing, interpenetrating, visually painful blobs he had seen before.
A cultural translator could change the image, but not the reality.
"It is nonetheless probable," continued the Lady 3rd, "that the Babyeaters will not accept this change as it stands; it will be necessary to impose these changes by force. As for you, humankind, we hope you will be more reasonable. But both your species, and the Babyeaters, must relinquish bodily pain, embarrassment, and romantic troubles. In exchange, we will change our own values in the direction of yours. We are willing to change to desire pleasure obtained in more complex ways, so long as the total amount of our pleasure does not significantly decrease. We will learn to create art you find pleasing. We will acquire a sense of humor, though we will not lie. From the perspective of humankind and the Babyeaters, our civilization will obtain much utility in your sight, which it did not previously possess. This is the compensation we offer you. We furthermore request that you accept from us the gift of untranslatable 2, which we believe will enhance, on its own terms, the value that you name 'love'. This will also enable our kinds to have sex using mechanical aids, which we greatly desire. At the end of this procedure, all three species will satisfice each other's values and possess great common ground, upon which we may create a civilization together."
Akon slowly nodded. It was all quite unbelievably civilized. It might even be the categorically best general procedure when worlds collided.
The Lady 3rd brightened. "A nod - is that assent, humankind?"
"It's acknowledgment," Akon said. "We'll have to think about this."
"I understand," the Lady 3rd said. "Please think as swiftly as you can. Babyeater children are dying in horrible agony as you think."
"I understand," Akon said in return, and gestured to cut the transmission.
The holo blinked out.
There was a long, terrible silence.
"No."
The Lord Pilot said it. Cold, flat, absolute.
There was another silence.
"My lord," the Xenopsychologist said, very softly, as though afraid the messenger would be torn apart and dismembered, "I do not think they were offering us that option."
"Actually," Akon said, "the Superhappies offered us more than we were going to offer the Babyeaters. We weren't exactly thinking about how to compensate them." It was strange, Akon noticed, his voice was very calm, maybe even deadly calm. "The Superhappies really are a very fair-minded people. You get the impression they would have proposed exactly the same solution whether or not they happened to hold the upper hand. We might have just enforced our own will on the Babyeaters and told the Superhappies to take a hike. If we'd held the upper hand. But we don't. And that's that, I guess."
"No!" shouted the Lord Pilot. "That's not -"
Akon looked at him, still with that deadly calm.
The Lord Pilot was breathing deeply, not as if quieting himself, but as if preparing for battle on some ancient savanna plain that no longer existed. "They want to turn us into something inhuman. It - it cannot - we cannot - we must not allow -"
"Either give us a better option or shut up," the Lord Programmer said flatly. "The Superhappies are smarter than us, have a technological advantage, think faster, and probably reproduce faster. We have no hope of holding them off militarily. If our ships flee, the Superhappies will simply follow in faster ships. There's no way to shut a starline once opened, and no way to conceal the fact that it is open -"
"Um," the Ship's Engineer said.
Every eye turned to him.
"Um," the Ship's Engineer said. "My Lord Administrator, I must report to you in private."
The Ship's Confessor shook his head. "You could have handled that better, Engineer."
Akon nodded to himself. It was true. The Ship's Engineer had already betrayed the fact that a secret existed. Under the circumstances, easy to deduce that it had come from the Babyeater data. That was eighty percent of the secret right there. And if it was relevant to starline physics, that was half of the remainder.
"Engineer," Akon said, "since you have already revealed that a secret exists, I suggest you tell the full Command Conference. We need to stay in sync with each other. Two minds are not a committee. We'll worry later about keeping the secret classified."
The Ship's Engineer hesitated. "Um, my lord, I suggest that I report to you first, before you decide -"
"There's no time," Akon said. He pointed to where the holo had been.
"Yes," the Master of Fandom said, "we can always slit our own throats afterward, if the secret is that awful." The Master of Fandom gave a small laugh -
- then stopped, at the look on the Engineer's face.
"At your will, my lord," the Engineer said.
He drew a deep breath. "I asked the Lord Programmer to compare any identifiable equations and constants in the Babyeaters' scientific archive, to the analogous scientific data of humanity. Most of the identified analogues were equal, of course. In some places we have more precise values, as befits our, um, superior technological level. But one anomaly did turn up: the Babyeater figure for Alderson's Coupling Constant was ten orders of magnitude larger than our own."
The Lord Pilot whistled. "Stars above, how did they manage to make that mistake -"
Then the Lord Pilot stopped abruptly.
"Alderson's Coupling Constant," Akon echoed. "That's the... coupling between Alderson interactions and the..."
"Between Alderson interactions and the nuclear strong force," the Lord Pilot said. He was beginning to smile, rather grimly. "It was a free parameter in the standard model, and so had to be established experimentally. But because the interaction is so incredibly... weak... they had to build an enormous Alderson generator to find the value. The size of a very small moon, just to give us that one number. Definitely not something you could check at home. That's the story in the physics textbooks, my lords, my lady."
The Master of Fandom frowned. "You're saying... the physicists faked the result in order to... fund a huge project...?" He looked puzzled.
"No," the Lord Pilot said. "Not for the love of power. Engineer, the Babyeater value should be testable using our own ship's Alderson drive, if the coupling constant is that strong. This you have done?"
The Ship's Engineer nodded. "The Babyeater value is correct, my lord."
The Ship's Engineer was pale. The Lord Pilot was clenching his jaw into a sardonic grin.
"Please explain," Akon said. "Is the universe going to end in another billion years, or something? Because if so, the issue can wait -"
"My lord," the Ship's Confessor said, "suppose the laws of physics in our universe had been such that the ancient Greeks could invent the equivalent of nuclear weapons from materials just lying around. Imagine the laws of physics had permitted a way to destroy whole countries with no more difficulty than mixing gunpowder. History would have looked quite different, would it not?"
Akon nodded, puzzled. "Well, yes," he said. "It would have been shorter."
"Aren't we lucky that physics didn't happen to turn out that way, my lord? That in our own time, the laws of physics don't permit cheap, irresistible superweapons?"
Akon furrowed his brow -
"But my lord," said the Ship's Confessor, "do we really know what we think we know? What different evidence would we see, if things were otherwise? After all - if you happened to be a physicist, and you happened to notice an easy way to wreak enormous destruction using off-the-shelf hardware - would you run out and tell everyone?"
"No," Akon said. A sinking feeling was dawning in the pit of his stomach. "You would try to conceal the discovery, and create a cover story that discouraged anyone else from looking there."
The Lord Pilot emitted a bark that was half laughter, and half something much darker. "It was perfect. I'm a Lord Pilot and I never suspected until now."
"So?" Akon said. "What is it, actually?"
"Um," the Ship's Engineer said. "Well... basically... to skip over the technical details..."
The Ship's Engineer drew a breath.
"Any ship with a medium-sized Alderson drive can make a star go supernova."
Silence.
"Which might seem like bad news in general," the Lord Pilot said, "but from our perspective, right here, right now, it's just what we need. A mere nova wouldn't do it. But blowing up the whole star - " He gave that bitter bark of laughter, again. "No star, no starlines. We can make the main star of this system go supernova - not the white dwarf, the companion. And then the Superhappies won't be able to get to us. That is, they won't be able to get to the human starline network. We will be dead. If you care about tiny irrelevant details like that." The Lord Pilot looked around the Conference Table. "Do you care? The correct answer is no, by the way."
"I care," the Lady Sensory said softly. "I care a whole lot. But..." She folded her hands atop the table and bowed her head.
There were nods from around the Table.
The Lord Pilot looked at the Ship's Engineer. "How long will it take for you to modify the ship's Alderson Drive -"
"It's done," said the Ship's Engineer. "But... we should, um, wait until the Superhappies are gone, so they don't detect us doing it."
The Lord Pilot nodded. "Sounds like a plan. Well, that's a relief. And here I thought the whole human race was doomed, instead of just us." He looked inquiringly at Akon. "My lord?"
Akon rested his head in his hands, suddenly feeling more weary than he had ever felt in his life. From across the table, the Confessor watched him - or so it seemed; the hood was turned in his direction, at any rate.
I told you so, the Confessor did not say.
"There is a certain problem with your plan," Akon said.
"Such as?" the Lord Pilot said.
"You've forgotten something," Akon said. "Something terribly important. Something you once swore you would protect."
Puzzled faces looked at him.
"If you say something bloody ridiculous like 'the safety of the ship' -" said the Lord Pilot.
The Lady Sensory gasped. "Oh, no," she murmured. "Oh, no. The Babyeater children."
The Lord Pilot looked like he had been punched in the stomach. The grim smiles that had begun to spread around the table were replaced with horror.
"Yes," Akon said. He looked away from the Conference Table. He didn't want to see the reactions. "The Superhappies wouldn't be able to get to us. And they couldn't get to the Babyeaters either. Neither could we. So the Babyeaters would go on eating their own children indefinitely. And the children would go on dying over days in their parents' stomachs. Indefinitely. Is the human race worth that?"
Akon looked back at the Table, just once. The Xenopsychologist looked sick, tears were running down the Master's face, and the Lord Pilot looked as if he were being slowly torn in half. The Lord Programmer looked abstracted, the Lady Sensory was covering her face with her hands. (And the Confessor's face still lay in shadow, beneath the silver hood.)
Akon closed his eyes. "The Superhappies will transform us into something not human," Akon said. "No, let's be frank. Something less than human. But not all that much less than human. We'll still have art, and stories, and love. I've gone entire hours without being in pain, and on the whole, it wasn't that bad an experience -" The words were sticking in his throat, along with a terrible fear. "Well. Anyway. If remaining whole is that important to us - we have the option. It's just a question of whether we're willing to pay the price. Sacrifice the Babyeater children -"
They're a lot like human children, really.
"- to save humanity."
Someone in the darkness was screaming, a thin choked wail that sounded like nothing Akon had ever heard or wanted to hear. Akon thought it might be the Lord Pilot, or the Master of Fandom, or maybe the Ship's Engineer. He didn't open his eyes to find out.
There was a chime.
"In-c-c-coming c-call from the Super Happy," the Lady Sensory spit out the words like acid, "ship, my lord."
Akon opened his eyes, and felt, somehow, that he was still in darkness.
"Receive," Akon said.
The Lady 3rd Kiritsugu appeared before him. Her eyes widened once, as she took in his appearance, but she said nothing.
That's right, my lady, I don't look super happy.
"Humankind, we must have your answer," she said simply.
The Lord Administrator pinched the bridge of his nose, and rubbed his eyes. Absurd, that one human being should have to answer a question like that. He wanted to foist off the decision on a committee, a majority vote of the ship, a market - something that wouldn't demand that anyone accept full responsibility. But a ship run that way didn't work well under ordinary circumstances, and there was no reason to think that things would change under extraordinary circumstances. He was an Administrator; he had to accept all the advice, integrate it, and decide. Experiment had shown that no organizational structure of non-Administrators could match what he was trained to do, and motivated to do; anything that worked was simply absorbed into the Administrative weighting of advice.
Sole decision. Sole responsibility if he got it wrong. Absolute power and absolute accountability, and never forget the second half, my lord, or you'll be fired the moment you get home. Screw up indefensibly, my lord, and all your hundred and twenty years of accumulated salary in escrow, producing that lovely steady income, will vanish before you draw another breath.
Oh - and this time the whole human species will pay for it, too.
"I can't speak for all humankind," said the Lord Administrator. "I can decide, but others may decide differently. Do you understand?"
The Lady 3rd made a light gesture, as if it were of no consequence. "Are you an exceptional case of a human decision-maker?"
Akon tilted his head. "Not... particularly..."
"Then your decision is strongly indicative of what other human decision-makers will decide," she said. "I find it hard to imagine that the options exactly balance in your decision mechanism, whatever your inability to admit your own preferences."
Akon slowly nodded. "Then..."
He drew a breath.
Surely, any species that reached the stars would understand the Prisoner's Dilemma. If you couldn't cooperate, you'd just destroy your own stars. A very easy thing to do, as it had turned out. By that standard, humanity might be something of an impostor next to the Babyeaters and the Superhappies. Humanity had kept it a secret from itself. The other two races - just managed not to do the stupid thing. You wouldn't meet anyone out among the stars, otherwise.
The Superhappies had done their very best to press C. Cooperated as fairly as they could.
Humanity could only do the same.
"For myself, I am inclined to accept your offer."
He didn't look around to see how anyone had reacted to that.
"There may be other things," Akon added, "that humanity would like to ask of your kind, when our representatives meet. Your technology is advanced beyond ours."
The Lady 3rd smiled. "We will, of course, be quite positively inclined toward any such requests. As I believe our first message to you said - 'we love you and we want you to be super happy'. Your joy will be shared by us, and we will be pleasured together."
Akon couldn't bring himself to smile. "Is that all?"
"This Babyeater ship," said the Lady 3rd, "the one that did not fire on you, even though they saw you first. Are you therefore allied with them?"
"What?" Akon said without thinking. "No -"
"My lord!" shouted the Ship's Confessor -
Too late.
"My lord," the Lady Sensory said, her voice breaking, "the Superhappy ship has fired on the Babyeater vessel and destroyed it."
Akon stared at the Lady 3rd in horror.
"I'm sorry," the Lady 3rd Kiritsugu said. "But our negotiations with them failed, as predicted. Our own ship owed them nothing and promised them nothing. This will make it considerably easier to sweep through their starline network when we return. Their children would be the ones to suffer from any delay. You understand, my lord?"
"Yes," Akon said, his voice trembling. "I understand, my lady kiritsugu." He wanted to protest, to scream out. But the war was only beginning, and this - would admittedly save -
"Will you warn them?" the Lady 3rd asked.
"No," Akon said. It was the truth.
"Transforming the Babyeaters will take precedence over transforming your own species. We estimate the Babyeater operation may take several weeks of your time to conclude. We hope you do not mind waiting. That is all," the Lady 3rd said.
And the holo faded.
"The Superhappy ship is moving out," the Lady Sensory said. She was crying, silently, as she steadily performed her duty of reporting. "They're heading back toward their starline origin."
"All right," Akon said. "Take us home. We need to report on the negotiations -"
There was an inarticulate scream, as if a throat were trying to burst the walls of the Conference chamber, as the Lord Pilot burst out of his chair, burst all restraints he had placed on himself, and lunged forward.
But standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, the Confessor's arm swept out...
- ... and anesthetized the Lord Pilot.
- ... [This option will become the True Ending only if someone suggests it in the comments before the previous ending is posted tomorrow. Otherwise, the first ending is the True one.]
Within the confines of the story:
* No star that has been visited by starline has ever been seen from another, which implies a vastly larger universe than can be seen from a given lightcone. Basically, granting the slightly cryptographic assumption that travel between stars is impossible.
* The weapon is truly effective: works as advertised.
Any disagreement with that would have to say why the "'Assume there is no god, then...' 'But there _is_ a god!'" fallacy doesn't apply here.
The threat of a nova feels like a more interesting avenue than the mere detonation.
Posted by: cwillu | February 03, 2009 at 11:09 AM
I'm not planning to trust anyone. My suggestion was based on the assumption that it is possible to watch what the Superhappies actually do and detonate the star if they start heading for the wrong portal. If that is not the case (which depends on the mechanics of the Alderson drive) then either detonate the local star immediately, or the star one hop back.
Posted by: Russell Wallace | February 03, 2009 at 12:02 PM
Wanna see incredibly intelligent people wasting time on absurd meaningless questions? Come here to Overcoming Bias! A stupid person will believe in any old junk, but it takes someone very intelligent to specifically believe in such elaborate nonsense.
One can only wonder what that might imply about those wise folk who have recognised all of this as nonsense, yet continue to read and even respond to it.
Anonymous
Posted by: Anonymous Coward | February 03, 2009 at 12:13 PM
I'm not planning to trust anyone. My suggestion was based on the assumption that it is possible to watch what the Superhappies actually do and detonate the star if they start heading for the wrong portal.
---
Once you know someone has technologies vastly ahead of your own, you might as well assume they can realize your worst nightmares - because your imagination and assumptions are unlikely to present limits to their capabilities.
Imagine a group of humans circa 800 A.D. making assumptions about how they will be tracked down by a team of modern day soldiers with advanced communications, GPS, satellite imagery, airborne drones, camouflaged clothing, accurate weapons, poison gas, ... and those soldiers aren't even biologically or intellectually more advanced.
Posted by: Anonymous Coward | February 03, 2009 at 12:18 PM
If I were the humans, I'd report back to earth (they have valuable information), then send out a robotic probe through the Alderson drive and blow up the star.
The humans in this story know that there are at least two alien cultures, and the culture shock from them is too much to deal with. If there are more cultures, it will be worse.
Posted by: Cabalamat | February 03, 2009 at 12:24 PM
Another possibility would be to blow up Earth's sun. This fragments the human species, but increases the probability that some branches of humanity will survive.
Posted by: Cabalamat | February 03, 2009 at 12:33 PM
Oh boy,
I do not care if anyone creates art, but I do care if sentient beings are hurt.
The Babyeater way of living is basically a socially accepted Gulag, only worse.
And evidently the Superhappies see humanity the same way.
Now what I also don't like is collectivism. Even the Superhappies seem rather single-minded, and pretty willing to make decisions for their whole species.
Now, despite not fully understanding Superhappy ethics, and not trying to break the story, my proposal would be:
The Superhappies offer the living Babyeaters the chance to be changed, and nevertheless rescue each and every baby from being eaten. These kids then get the choice to return home at any later time (no idea if they would be accepted) or live with the happies, while also being offered treatment/change for their condition.
[Readers should be aware that with some searching it would be possible to find human cultures with similar ethics in the past. Think samurai, or holy warriors.]
The same solution also works perfectly for the humans. Offer treatment, protect the kids.
The happies might be able to accept pain that lasts only seconds, but would prevent any form of child abuse.
Now that sounds like an awful lot of work, but I think the happies might be able to pull it off, and of course it's the only ethical thing to do that I can think of.
The alternative of killing sentient beings is cruel, no matter what.
Martin
Posted by: Martin | February 03, 2009 at 12:45 PM
The deal Akon reached with the super happies is so preposterously one-sided it is no surprise at all the babyeaters did not agree to it - and that could have been foreseen. For either humans or the babyeaters to even consider destroying their identity so the super happies will make art and jokes is absurd. For people, at least, self-identity is vastly more important than overall utility. Super happy art and jokes are worth basically nothing to the babyeaters and humans. If the super happies want humans to switch off physical pain, embarrassment etc., they should agree to 1. unconditional sharing of every technological advancement they make, 2. allowing individual adult humans the option of turning pain etc. back on, 3. doing our baby eating for us. But that's just a suggestion - the main problem is that the chance of the super happies nailing the fairest possible deal on their first guess is astoundingly small. Even with complete knowledge of human and babyeater culture, their knowledge is phenomenologically inadequate for coming up with a deal that is actually fair to all. Not negotiating was irrational, as was failing to contact the babyeaters to get their thoughts on the deal before agreeing to it - three-party deals require three-party negotiations.
That the Confessor didn't step in sooner... is kind of ruining the story for me. I'm not sure if these issues were brushed aside to make your point or if you really don't understand how absurd this deal is.
Posted by: Jack | February 03, 2009 at 12:51 PM
Stop the superhappies' ship before it jumps out! They must not learn of humanity's existence. Use the Alderson drive if necessary.
Posted by: CannibalSmith | February 03, 2009 at 12:59 PM
First, with regards to the solution proposed by the superhappies, my thought would have been, right at the start, this:
Accept _IF_ they can ensure the following: For us, the change away from pain doesn't end up having indirect effects that, well, more or less screw up other aspects of our development - i.e., one of the primary reasons why humanity might have been very cautious in the first place with regards to such changes.
With regards to the business of us changing to more resemble babyeaters, can they simultaneously ensure that the eaten children will not have, at any point, been conscious? And can they ensure that the delayed consciousness (not merely self awareness, but consciousness, period) doesn't negatively impact, in other ways, human development?
Further, can they ensure that making us, well, babyeater like does _NOT_ otherwise screw with our sympathy and compassion?
_IF_ all of the above can be truly answered "yes", then (in my view) the price that humanity would pay would not really be all that bad.
Of course, we have to then ask about the changes to the babyeaters? Presumably, the ideal would be something like "delay onset of consciousness until after the culling (and not at all, of course, for those that are eaten)", but in such a way that intelligence and learning is still there, and when the babyeater becomes conscious, it can integrate data and experience acquired while it was not conscious.
But, a question arises, a possibly very important one: Should the Superhappies firing on the Babyeater ship be considered evidence that Superhappies are Prisoner's Dilemma _defectors_?
If yes, then how much can we trust the Superhappies to actually implement the solution they proposed, rather than do something entirely different? And _THAT_ consideration would be perhaps the only consideration (I can think of so far) for really considering the "blow up a star to close down the paths leading to humanity's worlds" option (post Babyeater fix, perhaps).
Posted by: Psy-Kosh | February 03, 2009 at 01:05 PM
If the humans know how to find the babyeaters' star,
and if the babyeater civilization can be destroyed by blowing up one star,
then I would like to suggest that they kill off the babyeaters.
Not for the sake of the babyeaters (I consider the proposed modifications to them better than annihilation from humanity's perspective)
but to prevent the super-happies from making even watered down modifications adding baby-eater values -
not so much to humans, since this can also be (at least temporarily) prevented by destroying Huygens -
but to themselves, as they are going to be the dominant life form in the universe over time, being the fastest growing and advancing species.
Of course, relative to destroying Huygens the price to pay in terms of modifications to human values is high, so I would not make this decision lightly.
Posted by: simon | February 03, 2009 at 01:14 PM
Is this story self-consistent? Consider that:
(i) it's easy to make stars go nova.
(ii) when a star goes nova, its Alderson lines disappear, disconnecting parts of the network from each other, and stopping a war if the different sides are now on different parts of it (the fact that the network is sparse is important here)
(iii) both Babyeaters and the Superhappies know this
(iv) nevertheless the Superhappies still plan to prosecute a war against the babyeaters
Posted by: Cabalamat | February 03, 2009 at 01:36 PM
Well. I guess that stunning the Pilot is a reasonable thing to do, since he is obviously starting to act anti-socially. That is not the point though. Two things strike me as a bit silly, if not outright irrational.
First is about the babyeaters. Pain is relative. In the case of higher creatures on earth, we define pain as a stimulus signaling the brain of some damage to the body. Biologically, pain is not all that different from other stimuli, such as cold or heat or just tactile feedback. The main difference seems to be that we humans, most of the time, experience pain in a highly negative way. And that is the only point of reference we know, so when humans say that babyeater babies are dying in agony, they are making some unwarranted assumptions about the way the babies perceive the world. After all, they are structurally VERY different from humans.
Second is about the "help" humans are considering for babyeaters and superhappies are considering for both humans and babyeaters. Basically by changing the babyeaters to not eat babies or to eat unconscious babies, their culture, as it is, is being destroyed. Whatever the result, the resulting species are not babyeaters and babyeaters are therefore dead. So, however you want to put it, it is a genocide. Same goes for humans modified to never feel pain and eat hundreds of dumb children. Whatever those resulting creatures are, they are no longer human either biologically, psychologically or culturally and humans, as a race, are effectively dead.
The problem seems to be that humans are not willing to accept any solution that doesn't lead to the most efficient and speedy stoppage of baby eating. That is, any solution where babyeaters continue to eat babies for any period of time is considered inferior to any solution where they stop right away. And the only reason for this is that humans feel discomfort at the thought of what they perceive as the suffering of babies. In that aspect humans are no better than the superhappies: they would rather genocide a whole race than allow themselves to feel bad about that race's behavior. If humans (and hopefully superhappies) stopped being such prudes and allowed other races the right to make their own mistakes, a sample solution might lie in making the best possible effort to teach the babyeaters human language and human moral philosophy, so they might understand the human view on the value of individual consciousness and individual suffering, and make their own decision to stop eating babies by whatever means they deem appropriate. Or argue that their way is superior for their race, but this time with full information.
Posted by: Dmitriy Kropivnitskiy | February 03, 2009 at 01:40 PM
... but relative to simply cooperating, it seems a clear win. Unless the superhappies have thought of it and planned a response.
Of course, the corollary for the real world would seem to be: those people who think that most people would not converge if "extrapolated" by Eliezer's CEV ought to exterminate other people who they disagree with on moral questions before the AI is strong enough to stop them, if Eliezer has not programmed the AI to do something to punish that sort of thing.
Hmm. That doesn't seem so intuitively nice. I wonder if it's just a quantitative difference between the scenarios (eg quantity of moral divergence), or a qualitative one (eg. the babykillers are bad enough to justifiably be killed in the first place).
Posted by: simon | February 03, 2009 at 01:40 PM
Let's make a bit of summary.
Similarities: Each species considers suffering, in general, negative utility. Each species considers survival very high in utility. (Though at least some humans consider the possibility of sacrificing their species for the others' benefit, so this is not necessarily highest in value.) Each species has a kind of “fun” that's compatible with the others', and that's high in utility. They are all made of individuals, reproduce sexually, can communicate among themselves and at least somewhat compatibly with the others.
Differences:
* crystal pogo-sticks:
- this appears to indicate that they have some equivalent of empathy for other species
- have other "compatible pleasures" with humans, e.g. living & eating, reproduction, and art;
- but consider suffering of winnowed children acceptable (indeed, good) because it is useful for the existence and evolution of their species (the main selective pressure); so the existence and evolution of their species is considered to have massive positive utility. The relationship appears hard-wired in their thinking processes due to natural evolution (because that's _how_ evolution worked for them).
- avoid their suffering, and that of other species'
- this is not conditioned on the other species' eating of their children: they tried to “help” humans adopt children-eating although humans don't already do it; therefore, they assign positive utility to other species' utility _independently_ of whether or not they eat their children. Also, they didn't instantly kill the humans, even though they could have had at the start.
- appear to be very good team players as a species, even hard-wired for that. In fact, this appears to be the _top_ of their value pyramid.
* noisy bipeds:
- enjoy various pleasures, like living & eating, reproduction, and art and humor;
- avoid their own suffering, and that of others (empathy); this is hard-wired into their brains, as a survival mechanism. But they consider low-level suffering (of children and adults) acceptable (indeed, good) because: it is useful for the existence of their species (learning to avoid things with unpleasant consequences); natural evolution hard-wired the bipeds' brains to _like_ the results of suffering (this goes as far as valuing something obtained effortfully more than the same thing obtained effortlessly); in the ancestral environment, many useful things could not be obtained without some suffering, so a complex system of trade-offs evolved in the brain.
- much of their team-playing is rational: they have instincts to cheat, and those are rationally countered if an unpleasant outcome is anticipated (though anticipation is also influenced by cooperative instincts; the rational part has at least some part in balancing them).
* happy tentacly lumps:
- avoid suffering; no explicit indication why, presumably evolved as in the other two species.
- have empathy; this might be evolved or engineered, not clear; but it's not an absolute value, if we trust their statement that they're willing to alter it if it causes them unavoidable suffering.
- don't seem to assign any value to suffering, however.
- like happiness a lot, but this doesn't seem to be the absolute core value: they've not short-circuited their pleasure centers. So there must be something higher: experiencing the Universe? Liking happiness was probably originally evolved (it's a mechanism of evolution), but might have been tampered with then.
- they seem rational team-players, too: it promises more future happiness rather than less future suffering.
* * *
I'm a bit less versed in the Prisoner's Dilemma than I suspect most here are, so I'll summarize what I understand. There's supposed to be, for each “player”, the best personal outcome (everyone else cooperates, you cheat), the worst personal outcome (you cooperate, everyone else cheats) and the global compromise (everyone cooperates, nobody gets the bad outcome). I suppose with more than two players there are all sorts of combinations (two ally and cooperate, but collectively cheat against the other); I'm not sure how relevant that is here, we'll see. In real situations there are also more than two options, even with just two players (like the ultimatum game, you may "cheat" more or less). There's also another difference between the game and reality: in real life you may not really know the utility of each outcome (either because you mis-anticipated the consequences of each option, or because you don't know what you really want; I'm not sure if these two mean the same thing or not).
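The two-player structure described above can be sketched in a few lines of Python. The payoff numbers are the conventional textbook values (not anything from the story); the point is just the ordering: cheating is the best response to either move, yet mutual cooperation beats mutual defection.

```python
# Classic Prisoner's Dilemma payoffs, satisfying T > R > P > S
# (Temptation > Reward > Punishment > Sucker's payoff).
# Tuples are (my payoff, their payoff).
PAYOFF = {
    ("C", "C"): (3, 3),  # everyone cooperates: the global compromise
    ("C", "D"): (0, 5),  # you cooperate, they cheat: worst personal outcome
    ("D", "C"): (5, 0),  # you cheat, they cooperate: best personal outcome
    ("D", "D"): (1, 1),  # everyone cheats: bad for all
}

def best_response(their_move):
    """Pick the move that maximizes my payoff given their move."""
    return max("CD", key=lambda my: PAYOFF[(my, their_move)][0])

# Defection dominates: it is the best response to either move...
assert best_response("C") == "D"
assert best_response("D") == "D"
# ...and yet mutual cooperation pays both players more than mutual defection.
assert PAYOFF[("C", "C")][0] > PAYOFF[("D", "D")][0]
```

With three players (as in the story) the same dominance logic applies move-by-move, but, as noted, coalitions become possible and the payoffs are far less certain.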
Let's see the extreme options. “+” means what each species considers the best outcome and “-” means what it considers worst if each species defects (as far as I can tell).
* crystal pogo-sticks:
+ everyone starts having a hundred children and eating them just before puberty.
- they are forced to keep living and multiplying, but prevented from eating their children; they don't even _want_ to eat them, the horror!
- same as above, but they're also happy about it and everything else.
* noisy bipeds:
+ they keep living and evolving as they do now; the crystal pogo-sticks stop eating self-aware children and are happy about it; and the happy tentacly lumps keep being happy and help everyone else being as happy as they want; either they start liking “useful” suffering or they stop empathizing with suffering of people who do want it.
- everyone starts having a hundred children and eating them just before puberty.
- everyone stops suffering and tries to be as happy as possible, having sex all the time. The current definition of “humanity” no longer applies to anything in the observable Universe.
* happy tentacly lumps:
+ everyone stops suffering and tries to be as happy as possible, having sex all the time. Horrible things like the current “humanity” and “baby-eaters” no longer exist in the observable Universe :))
- everyone starts having a hundred children and eating them just before puberty.
- humans keep suffering as much as they want, and keep living and evolving as they do now; the crystal pogo-sticks stop eating self-aware children and are happy about it, but may keep as much suffering as the humans believe acceptable; and they themselves keep being happy, help everyone else being as happy as they want, and start liking “useful” suffering.
This doesn't mean necessarily that each outcome is actually possible. As far as I can tell from the story, only the happy tentacles can actually cheat that way. The worst that humans _can_ do from the tentacle's POV is start a Dispersion: run back and start jumping randomly between stars, destroying the first few stars after jumping. Depending on who they want to screw most, they may also destroy the meeting point, and/or send warning and/or advice to the crystal pogo-sticks. I think the pogo-sticks can do the same (it appears from the story that the star-destroying option is obvious, so they could start a Dispersion, too). This wouldn't prevent problems forever, but it would at least give time to the Dispersed to find other options.
The “compromise” proposed by the happy tentacly lumps doesn't seem much worse than their best option, though: the only difference I can see is that everyone starts eating unconscious children. (I don't see why they wouldn't try humor and more complex pleasures anyway: they haven't turned themselves into orgasmium, so they presumably want to experience pleasurable _things_, not pleasure itself.) I don't understand crystalline psychology well enough, but it seems pretty close to the worst-case scenario for them. And it's actually a bit worse than the worst-case tentacle-defecting scenario for the humans.
The tentacly lumps may think fast, but it seems to me that either they don't think much better, or they're conning everyone else. They're in quite a hurry to act, which is a bit suspicious:
OK, it's reasonable that they're concerned about the crystalline children. But they also know that the other species have trouble thinking as fast as them, and there's another option that I'm surprised nobody mentioned:
As long as everyone cooperates, everyone can just agree to _temporarily_ stop doing whatever the others find unacceptable, and use the time to find more solutions, or at any case understand the solution others propose. They may find each other “varelse” and start a war, but I see no reason for any species to do it _right_now_, even if they know they'd win it. (This assumes they all cooperate in the Prisoner's Dilemma as a matter of principle, of course.)
* * *
While the crystal pogo-sticks and the noisy bipeds won't much enjoy putting a temporary stop to having children (say, a year or a decade, even a century), I don't see why having the “happy tentacly compromise” _right_now_ would be higher in their preference ordering, since apparently nobody ages significantly. Even a _temporary_ stop to disliking not having children doesn't seem a problem (none of the three species seem inclined to reproduce unlimitedly, so they must have some sort of reproductive controls already beyond the natural ones). The happy tentacly lumps are carefully designed in the story to not have any unwanted attributes themselves except that they want and can transform the other species without their will. The humans (and myself) seem to consider their private habits, as far as they shared them, merely a bit boring relative to others, and the crystalline pogo-sticks seem to consider not eating children dis-likable but acceptable in other species, at least temporarily (since they didn't attack anyone). So the only compromise they'd have to do is _temporarily_ stop empathizing with small amounts of suffering (i.e., that of the other species not having children during the debate) and not forcibly convert them until afterwards.
As far as I can tell after a day of thinking, the result of the debate would include the crystalline pogo-sticks understanding that not eating children and cooperating are compatible in other species (they do have the concept of “mistaken hypothesis”, and they just got a lot more data than they had before; also they didn't instantly attack a species that never eats its children), and also accept some way of continuing their way of life without eating _conscious_ children. Depending on the reproduction (& death, if applicable) rates of each species, and their flexibility, it might even be technically possible to let them reproduce normally, but modify their children such that they don't suffer during the winnowing, and the eaten ones become a separate non-reproducing species voluntarily.
As for the humans, from my reading of the story I understand that the happy tentacly lumps mostly object to _involuntary_ human suffering, i.e. the children's. They don't like the voluntary suffering, but it doesn't seem to me they'd force the issue on adults. So they should at least accept letting the existing adults decide if they want to keep their suffering, such as it is. I don't find unacceptable a compromise where children get to grow up without any suffering they don't want, especially (but not necessarily) if the growing is engineered so that the final effect is essentially the same (i.e., they become like “normal” humans and accept suffering in “usual” circumstances, even if they didn't grow up with it). Of course, we're psychologically closer to the Confessor than to the rest of the humans in the story, so what we consider acceptable is as irrelevant as his to what decision they'd take.
The happy tentacly lumps might have simply anticipated all this, and decided on the best outcome they want. (In case they're really _really_ smart and practically managed to simulate the others species.) This would explain why they didn't propose the above, but would make the story moot. In that case the situation is somewhat analogue to an AI in the box, except that you can't destroy the box nor the AI inside, you can only decide to keep it there. My decision there would be to put as big a pile of locks as I can on the box, and hope the AI can't eventually get out by itself. The analogue of which would be Dispersion. (But the analogy is not an isomorphism: the AI is in an open box right now, and it doesn't seem to try to jump out, i.e. it didn't blow up the human ship yet, which is why the story is still interesting.)
Posted by: bogdanb | February 03, 2009 at 01:53 PM
Go back to Earth and detonate. It will mean the end of the civilization they know, but the Superhappys will still hunt the survivors down with 2^2^2^2 ships, and will force an equitable compromise for each surviving pocket of humanity, each of which will keep the whole more human than they would be with just one compromise with humanity.
I just can't figure out who the Confessor will shoot, or if he will just threaten, to make it happen. And I want to read both endings.
Posted by: spriteless | February 03, 2009 at 02:02 PM
If the superhappies are more advanced than us, then shouldn't they know the true value of the strong nuclear force, and thus know that blowing up the star is an option?
Posted by: bbot | February 03, 2009 at 02:05 PM
The Superhappies' decision seems reasonable. I am not sure what alternative solution might be. Hrm.
Posted by: Cassandra | February 03, 2009 at 02:14 PM
Dmitry, concerning genocide, I believe you are anthropomorphizing a culture. "Babyeater culture" is not a person. Eliminating the culture is not a crime if performed by non-murderous means; consider an alternative "final solution" of using rational arguments and financial incentives to convince Jews to discard Judaism.
Perhaps the act of forcible biological modification to prevent criminal behavior is wrong (e.g. chemical castration for child molesters), but it isn't the same as a murder.
Posted by: Chris | February 03, 2009 at 02:16 PM
What is giving some people the impression that saying "no" was an option? I mean, they could have turned down the compromise, but unless they had something to offer right then, that would have meant instant death (and then the compromise would be implemented anyway). "Yes" means the humans are not defecting right now, when defecting would be (pointlessly) suicidal.
Posted by: Zubon | February 03, 2009 at 02:17 PM
Chris, I don't think I am wrong in this. To give an analogy (and yes, I might be anthropomorphizing, but I still think I am right), if someone gives me a lobotomy, I, Dmitriy Kropivnitskiy, will no longer exist, so effectively it would be murder. If Jews are forced to give up Judaism and move out of Israel, there will no longer be Jews as we know them or as they perceive themselves, so effectively this would be genocide.
Posted by: Dmitriy Kropivnitskiy | February 03, 2009 at 02:27 PM
I am not certain I understand the terms of the puzzle. Should the audience come up with a better ending, a more plausible ending, or an ending which works better as story? And if we fail at this task, will we still get to know the other ending you had in mind?
Posted by: Rolf Andreassen | February 03, 2009 at 02:32 PM
Humanity could always offer to sacrifice itself. Compare the world where humanity compromises with both the Babyeaters and the Super Happy, versus one where we convince them to not compromise and instead make everybody Super Happy.
Of course, I'm just guessing, since I'm not a Utilitarian.
Posted by: Thom Blake | February 03, 2009 at 02:56 PM
The Super Happies hate pain, and seeing others in pain causes them to experience pain. Humans tolerate pain better than the Super Happies do. This gives the humans a weapon to use against them, or at least negotiating leverage. They can threaten to hurt themselves unless the Super Happies give them a better deal.
(So, in order to unlock the True Ending, do we have to come up with a way for the humans to "win" and get what they want, alien utility functions be damned, or should we take the aliens' preferences into account too?)
Posted by: Doug S. | February 03, 2009 at 03:16 PM
(Long time lurker - first post)
The course I would suggest, if on the IPW, would be to rally the Human fleet to set up a redundant and tamper-resistant self-destruct system on the newly-discovered star - with a similar system set up at the Human colony one jump further back.
When the Super-Happys return, we would give them the option:
1. Altering their preferences to align with Human values, at least enough so that they would no longer consider changing Humans without their full consent.
2. Immediately detonating the star - so they would no longer be able to rescue the Baby-Eater's Babies.
Any other course of action, or attempting to tamper with the self-destruct would trigger the self-destruct (and perhaps that on the next Human Colony in case they prevented the first nova).
We would offer volunteers to join the Super-Happys, in order to explore the feasibility and desirability of further harmonization. (and also monitor their compliance with the agreement... and steal as much technology as possible).
I say this as an unabashed defender of the superiority of Human values, who is willing to use our native endowment of vicious craftiness to defend and promote those ideals.
Posted by: Eric Gilbertson | February 03, 2009 at 03:17 PM
Akon clearly lost his mind, so the Confessor should anesthetize him. He does not need to break his oath and take command of the ship. Instead he can just point out some obvious things to the rest. Such as that it would be crazy to blackmail the Superhappies using a single ship with no communication to the rest of humanity. Or that interest rates need not fall through the floor the way Akon was trying to convince them, but instead would rise by a similar amount. Or what Cabalamat pointed out. I am only not sure what ending this leads to.
Posted by: Giedrius | February 03, 2009 at 03:31 PM
This was a failed negotiation. The fact that the babyeaters rejected the superhappy proposal means it is not symmetric. It is not a compromise that fair babyeaters would propose if they were in the superior position.
That the superhappies proposed it and then ignored evidence that it was unacceptable, is evidence that the superhappies are not being as fair as Akon seemed to think they were. It is obvious that they are not sacrificing their value system as much as they are requiring the babyeaters to. They are pushing their own values on the babyeaters because they CAN, not because they are offering a balanced utility exchange. They are likely doing the same to us.
They view the babyeater situation as dire enough that they are willing to enact modifications without acceptance. They gave humankind a general proposal that they predicted humankind would accept. They COULD just make modifications, but part of their value system includes getting human acceptance.
I'm not sure, but I think the humans should threaten / go to war with them, so they make no more modifications except those that they think they MUST make. That'll be my guess. Stun the captain, go to war.
I'm not sure what the babyeater's current stance says about how much they've considered the possibility that they will encounter superpowered babyeaters in the future.
Posted by: James Andrix | February 03, 2009 at 03:44 PM
Dmitry, if someone destroys your brain or alters it enough so that it is effectively the brain of a different person, that is indeed murder. Your future utility is lost, and this is bad. Forcing you to behave differently is not murder. It may be a crime (slavery) or it may not be (forcing you to not eat your children), but it is not murder.
Genocide (as I understand the term) is murder with the goal of eliminating an identifiable group. It is horrific because of the murder, not because the identifiable characteristics of the group disappear.
My understanding is that preventing babyeating will be done in such a way as to minimize harm done to adult babyeaters, and only if such harm is outweighed by the utility of saving babyeater children. It is vastly different than genocide; the goal is to prevent as much killing as possible, not eliminate the babyeating aliens.
Incidentally, my hypothetical "final solution" is actually a Pareto improvement: every Jew who converts does so because it increases his/her utility.
Posted by: Chris | February 03, 2009 at 03:55 PM
I would guess that the True Ending involves the Confessor stunning Akon. The aliens used every trick in the book to influence the humans. They communicated using real-time video instead of text transmissions. They gave speeches perfectly suited to tug on people's emotional levers. Since the Superhappies run at an accelerated rate, this also forced Akon to respond before he could fully process information.
I would almost say Akon's mind has been hacked. Akon had very little time to think before accepting the Superhappy terms and he currently seems resigned to the destruction of humanity. He uses "negotiations" to describe the Superhappy ultimatum. Anyway, he's probably not fit to lead the ship. The Pilot hasn't had a mental breakdown, he's just (understandably) outraged at what's going on. If the stunner is only used in the case of mental breakdown, the Pilot will have to be stopped by other means. Once a new leader is elected/promoted/whatever, the Confessor should require all real-time communication from the Superhappies to be text-only.
The Superhappies may be technologically superior, but their weakness is the fact that they don't separate genes from memories. They also don't withhold information from each other. This could allow a specially-crafted memory to disrupt or destroy the entire race. Even the kiritsugu are shocked by the slightest display of suffering, so it's not much of a stretch to say some images exist that would permanently traumatize all Superhappies.
Of course, destruction isn't the goal, modifying is. Before the Superhappies leave, the humans should ask to stay in contact with one Superhappy ship during Operation Babyeater. By studying them more, the humans could find a way to insert a memory that changes Superhappies to be less of a threat. If the humans have the upper hand, they can actually decide whether or not to adopt superhappiness instead of having the choice forced on them.
If it doesn't work, at least the humans will know how big the Superhappy armada is. They could wait for the Superhappies to return from Babyeater territory and blow up the system. The babies would be saved and humanity would be safe until the next nova.
Full cooperation is not one of the scenarios I outlined, since most humans would not want to become Superhappy. As the Confessor said, "You have judged. What else is there?"
Posted by: Geoff | February 03, 2009 at 04:05 PM
Does anyone else have suspicions about the "several weeks" timeframe that the Lady 3rd has given for the transforming of the Babyeaters?
What can the Superhappies do in several weeks, regardless of their hyper-advanced and hyper-advancing technology? I suspect not much other than kill off most of the species. A quick genocide will decrease suffering more in the long run than an arduous peaceful solution.
Genocide seems even more likely since the Lady 3rd told Akon that his decision would be identical to that of other human decision makers.
The Babyeaters of the ship decided not to cooperate and they were destroyed. The rest of the decision makers of the Babyeaters will not cooperate and will have to be destroyed (in the mind of the Lady 3rd).
So at this point, the Confessor shocks the Administrator and they allow the Superhappies to go on with their genocide of the Babyeaters. Unavoidable and humanity would have done a very similar thing anyway. Then destroy the star and go back to Earth to prepare to meet the Superhappies again in a few decades or so (since their progress is a few orders of magnitude faster, humans can easily expect to see them again uncomfortably soon). Preparations would include eliminating suffering and such so that a new war would be avoided after the next meeting. Why on earth haven't they eliminated pain anyway? :)
Posted by: Hikikomori | February 03, 2009 at 04:24 PM
I'm beginning to suspect this is a trick question. Well, sort of.
If the situation were reversed, how would you answer? If the technologically advanced Babyeaters had offered a one-sided "compromise" and then destroyed the primitive Superhappy ship when they refused?
The strong aliens have demonstrated their willingness to defect in a prisoner's dilemma type situation while the weak ones cooperated. That suggests we should cooperate with the weak ones and defect against the strong ones. I don't think the particulars of their moral systems should override that.
Prisoner's Dilemma has been prominent enough in the story that Akon's failure to appreciate the implications of the defection seems like a severe lapse of judgement. The Confessor stuns him and the remaining crew reconsiders the situation.
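The reciprocity argument above can be sketched in a few lines of Python. The payoff numbers are hypothetical (any values with temptation > reward > punishment > sucker give a Prisoner's Dilemma), and the response rule is just a tit-for-tat-style illustration of "cooperate with demonstrated cooperators, defect against demonstrated defectors":

```python
# Hypothetical Prisoner's Dilemma payoffs, ordered T > R > P > S.
# Tuples are (row player's payoff, column player's payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation (R)
    ("C", "D"): (0, 5),  # sucker's payoff vs. temptation (S, T)
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # mutual defection (P)
}

def respond(observed_move):
    """Reciprocate the move the other party has already demonstrated."""
    return "C" if observed_move == "C" else "D"

# The strong aliens (Superhappies) defected by destroying the Babyeater
# ship; the weak aliens (Babyeaters) cooperated by sharing their data.
print(respond("D"))  # defect against the demonstrated defector
print(respond("C"))  # cooperate with the demonstrated cooperator
```

Note that reciprocating defection yields the (1, 1) outcome rather than the (0, 5) sucker outcome that unilateral cooperation with a known defector would produce.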
Posted by: Aleksi Liimatainen | February 03, 2009 at 04:30 PM
The Informations told/implied to the Humans that they don't lie or withhold information. That is not the same as the Humans knowing that the Informations don't lie.
Posted by: GreedyAlgorithm | February 03, 2009 at 04:30 PM
Eliezer's novella provides a vivid illustration of the danger of promoting what should have stayed an instrumental value to the status of a terminal value. Eliezer likes to refer to this all-too-common mistake as a lost purpose. I like to refer to it as adding a false terminal value.
For example, eating babies was a valid instrumental goal when the Babyeaters were at an early stage of technological development. It is not IMHO evil to eat babies when the only alternative is chronic severe population pressure, which will eventually lead either to your extinction or to the disintegration of your agricultural civilization, with a reversion to a more primitive existence in which technological advancement is slow, uncertain, and easily reversed by things like natural disasters.
But then babyeating became an end in itself.
By clinging to the false terminal value of babyeating, the Babyeaters caused their own extinction even though, at the time of their extinction, they had an alternative means of preventing an explosion of their population (namely, editing their own genome so that fewer babies are born; if they did not have the tech to do that, they could have asked the humans or the Superhappies for it).
In the same way, the humans in the novella and the Superhappies are the victims of a false terminal value, which we might call "hedonic altruism": the goal of extinguishing suffering wherever it exists in the universe. Eliezer explains some of the reasons for the great instrumental value of becoming motivated by the suffering of others in Sympathetic Minds in the passage that starts with "Who is the most formidable, among the human kind?" Again, just because something has great instrumental value is no reason to promote it to a terminal value; when circumstances change, it may lose its instrumental value; and a terminal value once created tends to persist indefinitely because by definition there is no criterion by which to judge a system of terminal values.
I hope that human civilization will abandon the false terminal value of hedonic altruism before it spreads to the stars. I.e., I hope that the human dystopian future portrayed in the novella can be averted.
Posted by: Richard Hollerith | February 03, 2009 at 04:46 PM
Geoff: "They also don't withhold information from each other. This could allow a specially-crafted memory to disrupt or destroy the entire race."
This is not Star Trek, my Lord.
Posted by: Z. M. Davis | February 03, 2009 at 04:55 PM
Note that the kiritsugu as depicted through Cultural Translator versions 2 and 3 doesn't show any shock at humans being stressed; that depiction only appears in version 16. As such, it seems likely that this depiction is not based on the kiritsugu's actual emotional state, but rather was added to better allow humans to communicate with ver.
This is not going to work. The kiritsugu learned about the Babyeater culture without being impaired. Nothing humanity can reasonably come up with in the relevant timeframe will come close to that knowledge in shock-value.
Posted by: 51a1fc26f78b0296a69f53c615ab5a2f64ab1d1e | February 03, 2009 at 05:34 PM
I do. Pain is painful in "beasts" too. What does it matter if they are made of crystals, are hairy or whatever?
Posted by: db | February 03, 2009 at 05:36 PM
Chris, continuing with my analogy: if, instead of a lobotomy, I were forced to undergo a procedure that would make me a completely different person without any debilitating mental or physical side effects, I would still consider it murder. In the case of Eliezer's story, we are not talking about the enforcement of a rule or a bunch of rules; we are talking about a permanent change to the whole species on the biological, psychological, and cultural levels. And that, I think, can safely be considered genocide.
Posted by: Dmitriy Kropivnitskiy | February 03, 2009 at 05:42 PM
The humans, Babyeaters, and Superhappies were attracted by the nova. They were all eager to meet aliens. The Babyeaters and the Superhappies have the means to create supernovae artificially. They should be able to create ordinary novae too. This would be a good way to meet aliens. Why haven't they tried that?
Posted by: Peter de Blanc | February 03, 2009 at 05:45 PM
Peter - I am, sadly, not an astrophysicist, but it seems reasonable that such an act would substantially decrease the negentropy available from that matter, which is important if you're a species of immortals thinking of the long haul.
Posted by: Mike Blume | February 03, 2009 at 06:19 PM
Peter, being able to blow up a whole star (a process that is obviously going to involve some kind of positive feedback cycle) is not the same as being able to start novas. A nova is not a detonation of a star. A nova is the detonation of a shell of hydrogen that has accumulated from a companion and compressed on the surface of a degenerate star (white dwarf).
Posted by: Eliezer Yudkowsky | February 03, 2009 at 06:27 PM
I had asked why the Babyeaters and Superhappies have not intentionally created novae. But now I think it's pretty likely that the Babyeaters actually caused the nova. The Babyeaters were in the system first, despite being the least technologically advanced race, and despite having made special preparations for the hostile environment (the mirror shielding). If they had come in response to the nova, they probably would have been the last to arrive.
Posted by: Peter de Blanc | February 03, 2009 at 06:41 PM
We know an Alderson drive can cause a supernova. We should consider the possibility that the original nova wasn't just a coincidental rendezvous signal, but was intentionally created by the Superhappies. Of course, this assumes that Alderson drives are just as good for creating a nova as a supernova.
Posted by: Doug | February 03, 2009 at 06:42 PM
I missed Eli's reply before my most recent post. Although he hasn't said that the Babyeaters can't induce a nova, I'm lowering my probability that they did.
Posted by: Peter de Blanc | February 03, 2009 at 06:45 PM
What if the Superhappies created the Babyeaters and the supernova? The Babyeaters wouldn't really eat babies; they wouldn't even really exist. And seeing the Babyeaters would make humans more apt to compromise when they shouldn't. http://en.wikipedia.org/wiki/Argument_to_moderation
So shoot the hypnotized Captain.
Posted by: spriteless | February 03, 2009 at 07:04 PM
2. ... and anesthetized the entire crew, at which point he proceeded to have nonconsensual sex with every person aboard the ship. When in Rome...!
Posted by: Greg | February 03, 2009 at 07:08 PM
Z. M. Davis: OK, well it may not be self-replicating but it was worth a shot. Extreme empathy is basically the only weakness the Superhappies have. I'm not a big Star Trek fan, so I haven't seen the first two episodes you linked to and I only vaguely remember the last one.
51a1fc26f78b0296a69f53c615ab5a2f64ab1d1e: Or early versions of the translator failed to convey the humans' stress to the Superhappies. The kiritsugu are rather isolated from the rest of the crew, so while they have knowledge of the Babyeaters, maybe they haven't seen the videos. It would be analogous to reading about the Holocaust versus stepping into a holodeck depicting a concentration camp. Yes, I'm assuming aliens have a bias similar to humans. If that's not the case, then all non-kiritsugu Superhappies will be grief-stricken for quite some time after hearing about the Babyeaters. There would also have to be a very good reason why kiritsugu lack an emotion/reaction found in the rest of their kind. Humans without empathy are autistic or psychopaths. Again I'm arguing from a human analogy, but removing an emotion can completely change a being (http://www.overcomingbias.com/2009/01/boredom.html).
Anyway, most of my speculation is probably wrong, but the main point I tried to make in my previous post is that Akon's leadership is seriously compromised. The Superhappies are very manipulative and the Confessor needs to get a handle on things before saving humanity gets any tougher.
Did I mention a holodeck? Ugh, curse you Star Trek.
Posted by: Geoff | February 03, 2009 at 07:23 PM
Another question:
Do the Super Happies already know where the human worlds are (from the Net dump), or are they planning on following the human ship back home?
Posted by: Doug S. | February 03, 2009 at 07:23 PM
As noted earlier, the Superhappies don't appear to be concerned about the presumed ability of the Babyeaters to make supernovas. Perhaps they have a way of countering the effect, and have already injected anti-supernova magicons through the starline network back to Earth and Babyeater Prime. In that case trying to detonate either immediately or at Huygens would fail, while eliminating any trust the Superhappies had in us. Maybe that's not much worse; they wouldn't punish us for the attempt, it might just make them more aggressive about fixing us.
Also, is the cosmology such that the general lack of visible supernovas is significant? It would seem that the normal development for "human-like" technological civilizations is that shortly after discovering the Alderson drive, a mad scientist or misguided experiment blows up the home star. Babyeaters and Superhappies apparently avoided this by having some form of a singleton, and humans got lucky because the scientists were able to suppress the information. Humans may be the most individualistic technological civilization in the universe.
Posted by: Brian 2 | February 03, 2009 at 08:24 PM
I'm surprised the Super Happy People are willing to allow pre-sentient Baby Eaters to be eaten. Since they do not distinguish between DNA and synaptic activity, they might regard the process of growing a brain as a type of thought and that beings with growing brains are thus sentient.
Posted by: Joseph Hertzlinger | February 03, 2009 at 08:40 PM
It seems we are at a disadvantage relative to Eliezer in thinking of alternative endings, since he has a background notion of what things are possible and what aren't, and we have to guess from the story.
Things like:
How quickly can you go from star to star?
Does the greater advancement of the superhappies translate into higher travel speed, or is this constrained by physics?
Can information be sent from star to star without couriering it with a ship, and arrive in a reasonable time?
How long will the lines connected to the novaing star remain open?
Can information be left in the system in a way that it would likely be found by a human ship coming later?
Is it likely that there are multiple stars that connect the nova to one, two or all three alderson networks?
And also about behaviour:
Will the superhappies have the system they use to connect with the nova under guard?
How long will it be before the babyeaters send in another ship? Before the humans do, if no information is received?
How soon will the superhappies send in their ships to begin modifying the babyeaters?
Here's another option with different ways to implement it depending on the situation (possibly already mentioned by others, if so, sorry):
Cut off the superhappy connection, leaving or sending info for other humans to discover, so that they can deal with the babyeaters at their leisure.
Go back to give info to humans at Huygens, then cut off the superhappy connection.
Go back to get reinforcements, then quickly destroy the babyeater civilization (suicidally if necessary) and the novaing star (immediately after the fleet goes from it to the babyeater star(s), if necessary).
In all cases, I assume the superhappies will be able to guess what happened in retrospect. If not, send them an explicit message if possible.
Posted by: simon | February 03, 2009 at 08:42 PM