
February 03, 2009

Comments

But standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, the Confessor's arm swept out...

---

... and anaesthetised everyone in the room. He then went downstairs to the engine room, and caused the sun to go supernova, blocking access to earth.

Regardless of his own preferences, he takes the option for humanity to 'painlessly' defect in the interstellar Prisoner's Dilemma, knowing a priori that the Superhappies chose to cooperate.

Hmm. The three networks are otherwise disconnected from each other? And the Babyeaters are the first target?

Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb.

(Otherwise, yes, I would set off the bomb immediately.)

"[This option will become the True Ending only if someone suggests it in the comments before the previous ending is posted tomorrow. Otherwise, the first ending is the True one.]"

I'm not sure I understand what you mean. If no one chooses (2) does that mean that the (True) story ends with the Confessor stunning the Lord Pilot? ...or does it continue after he's stunned? ...or have I gotten it all wrong?

Are the storylines like these:

1. - - - - (END)
2. - - - - - - - - (END)

or

1. - - - - - - - - (END)
2. - - - - - - - - (END)

@Anonymous Coward: Reasonable, except that even by defecting you haven't gained the substantially greater payoff that is the whole point of the Prisoner's Dilemma. In other words, as he asks: what about the Babyeater children? I wouldn't know just how to quantify the two options - I believe that's the whole point of this series :) - but I wouldn't call it much better than what the Superhappy aliens offered, at least with the more "inclusive" altruistic concern that the humans in this illustration are supposed to have.
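To make the payoff structure concrete, here is a minimal sketch in Python; the payoff numbers are the textbook Prisoner's Dilemma ordering (temptation > reward > punishment > sucker), invented purely for illustration and not taken from the story.

# Minimal one-shot Prisoner's Dilemma. Payoffs are illustrative only.
PAYOFFS = {
    # (my_move, their_move): (my_payoff, their_payoff)
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # I cooperate, they defect: sucker's payoff
    ("D", "C"): (5, 0),  # I defect against a cooperator: temptation payoff
    ("D", "D"): (1, 1),  # mutual defection
}

def my_payoff(me, them):
    return PAYOFFS[(me, them)][0]

print(my_payoff("D", "C"))  # 5 - the payoff defection is supposed to capture
print(my_payoff("C", "C"))  # 3 - what mutual cooperation offers

The point stands out in the numbers: blowing up the star forfeits the temptation payoff along with the cooperative one, so it is less a defection than a refusal to play.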

Carl - I'm pretty sure either way we get three more chapters.

Given that the number of parts in the story has been explicitly stated all along, I doubt it'd change in length.

No, you've got to suggest someone else to stun, I'm pretty sure.

---

One thing I'm wondering about the Superhappies: they're so eager to cooperate, even to the point of changing their own utility function; what would happen if they kept running into one alien race after another, all of which would alter it in the same direction?

I can't figure out a better solution than what they've proposed. I wouldn't particularly want to eat nonsentient babies - it seems so *pointless*, by all three pre-existing utility functions - but so is art, by the happyhappyhappys' function.

Eliezer, if your point is to emotionally drive the point that utility functions are basically arbitrary, you've succeeded.

2) ...and anesthetized himself.

Umm... 'Superstimulus'.

I think Eliezer has written passionately and pointedly about rationality, the will to become stronger, and the need for FAI. Writing this story makes a separate point about those ideas.

After reading this story I find myself agreeing with Eliezer more on his views, and that seems to be a sign of manipulation, not of rationality.

Philosophy expressed in the form of fiction seems to have a very strong effect on people - even if the fiction isn't very good (ref. Ayn Rand). I find this story well written and engaging. I'm having other people, without the background of reading Eliezer's writings, read and comment on the story, to get a better idea of whether it actually makes a point rather than just creating stronger attachment to ideas presented earlier.

A few comments, in no particular order (randomized):

The format of releasing the story in small, bite-sized installments creates an artificial scarcity.

The story compactly addresses matters that readers have spent time studying here, which is very rewarding.

Engaging people in the creation of the story creates attachment to it.

Characters use very familiar phrases that help the formation of an in-group feeling.

No matter which of the three alien species one happens to cheer for in the story, that is still cheering for someone.

Svein: No, you've got to suggest someone else to stun, I'm pretty sure.

I doubt Eliezer's grand challenge to us would be to contribute less than four bits to his story.

So.. (even taking MST3K into account)

Akon has certainly gone mad. He believes that he is in a unique position of power (even though his decision markets and his Command staff are divided) and that he has to make the decision NOW, with great unlikely secrets revealed essentially just to him. There are too many unlikely events for Akon to believe in. I think he has failed his exercise, or whatever it is he is living in.

Anonymous Coward's defection isn't. A real defection would be the Confessor anesthetizing Akon, then commandeering the ship to chase the Super Happies and nova their star.

"But our negotiations with them failed, as predicted."

If the Lady 3rd speaks the truth, and human behaviour is not more difficult to model than Babyeater behaviour, then the crew faces a classical Newcomb-like problem. (Eliezer hints through Akon's thoughts that the Super Happies have indeed built reliable models of at least some crewmembers.)

So if you write an alternative ending, take into account that whatever the Confessor, or anyone else, does, will have been already predicted and taken into account by the Super Happy People.
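To spell out why that matters, here is a toy expected-value calculation; the predictor accuracy and the utilities are made-up numbers chosen only to show the shape of a Newcomb-like problem, not anything from the story.

# Toy Newcomb-style calculation. Accuracy and utilities are invented.
accuracy = 0.99  # assumed probability the Super Happies model you correctly

def expected_utility(u_if_predicted, u_if_not_predicted):
    return accuracy * u_if_predicted + (1 - accuracy) * u_if_not_predicted

# If the Confessor defects and that was predicted, assume the Super
# Happies have already acted (utility 0); the defection only pays off
# (utility 10) in the unpredicted case.
ev_defect = expected_utility(u_if_predicted=0, u_if_not_predicted=10)
# Cooperation, predicted or not, yields the negotiated outcome (say 5).
ev_cooperate = expected_utility(u_if_predicted=5, u_if_not_predicted=5)

print(ev_defect, ev_cooperate)  # 0.1 vs 5.0

With a reliable predictor, the cleverness of the unpredicted plan contributes almost nothing to the expectation.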

Why should we care for some crystalline beasts? We don't desire to modify lions to eat vegetables, and their prey is much more like us. Destroy the star immediately, or, better, do it at the moment when it can do the greatest damage to the damned self-righteous Superhappies (revenge is, after all, also a sort of human value).

You'll get the same next three installments regardless of whether someone comes up with the Alternative Solution before Ending 1 is posted. But only if someone suggests the Alternative Solution will Ending 2 become the True Ending - the one that, as 'twere, actually happened in that ficton.

This is based on the visual novel format where a given storyline often has two endings, the True Ending and the Good Ending, or the Normal Ending and the True Ending (depending on which of the two is sadder).

To make the second ending the True Ending, someone has to suggest the alternative thing for the Impossible to do in this situation - it's not enough to guess who the Confessor goes after.

Well, I'm glad the story wasn't ruined by the alternative being too obvious. If no one's thought of it yet in the comments, then it's at least plausible that the people on the ship didn't think of it earlier.

Anonymous - yes, I keep wondering myself about the ethics of writing illustrative fiction. So far I'm coming out on the net positive side, especially after Robin's post on Near versus Far thinking. But it does seem to put more of a strain on how much you trust the author - both their honesty and their intelligence.

PS: Anna and Steve, Shulman, Vassar, and Marcello, please don't post the solution if you get it - I want to leave the field at least a little open here...

I thought these "events" might be a test for the humans, a mass hallucination. It is strange that three civilisations should encounter each other at the same time like this.

It is difficult to alter one human characteristic without changing the whole person: difficult to change from male to female. Far more difficult to improve a civilization by changing one characteristic of its humans, such as taking away the ability to feel pain. Or to take away the whole basis of moral action and cooperation, by preventing the Babyeaters from eating babies. Would the Superhappies really desire to make everyone else just like them? Possibly; I think that is a morally poorer choice, but making that choice is very common among humans.

But I cannot think of a *better* ending.

I wonder if the Superhappies could be persuaded that, instead of modifying us not to feel pain at all, we could be modified to have the ability to feel pain switched off by default, but with the potential for it to be activated if we so chose. That would avoid their concerns about non-consensually inflicting pain on children who hadn't come to the philosophical realisation that it was worth it, but would still allow us to remain fully human, if that was what we actually desired as individuals given the choice.

... and stuns Akon (or everyone). He then opens a channel to the Superhappies, and threatens to detonate the star - thus preventing the Superhappies from "fixing" the Babyeaters, their highest priority. He uses this to blackmail them into fixing the Babyeaters while leaving humanity untouched.

I don't really have a good enough grasp on the world to predict what is possible; it all seems too unreal.

One possibility is to jump one star away back towards earth and then blow up that star, if that is the only link to the new star.

...and stuns Akon, for failing to be rational and jumping to a decision with insufficient information. Doesn't it seem a little TOO convenient that the first alien race is less powerful, while the second one is massively more powerful? And now that the one is gone, and the other is dust, humans seem to have accepted being modified in ways that would make the babyeaters happy... without even bringing up any other scenarios. That's contrary to the stated mission of the Confessor.

The Confessor finds Akon's acceptance of part of the terms of capitulation flawed, and stuns him, effectively relieving him of command. The rest of the crew deliberate over their options.

Something about Akon's unwillingness to warn the Babyeaters of the Superhappies' plans set my "Plot Device" warning lights off. Might the rest of the story involve following the Babyeaters' starline to attempt to warn/renegotiate with them, and, upon probably failing, detonating that sun to protect the Babyeaters (who didn't choose to capitulate) and consigning humanity to a marathon carnal surplus-infant-eating future?

Not that I don't struggle to come up with a rational case for this course of action, or even to rationalise it. It's just that humanity advocating the universal eating of babies is the sort of perverse outcome I'd expect from following alien first principles to their logical conclusions.

Is there also a Scooby Doo ending, like in Wayne's World?

@Anonymous Coward: Reasonable, except that even by defecting you haven't gained the substantially greater payoff that is the whole point of the Prisoner's Dilemma. In other words, as he asks: what about the Babyeater children?
---

I misread the story and thought the Superhappies had flown off to deal with the Babyeaters first. But in fact, the Superhappies are 'returning to their home planet' before going to deal with the Babyeaters. "This will make it considerably easier to sweep through their starline network when we return." Oops.

In any event, if the ship's crew is immediately anaesthetised and the sun exploded, then Earth remains ignorant of the suffering of the Babyeaters, and Earth is not coerced into having its value system changed by an external superior power. The only human who feels bad about all this is the one remaining conscious human on the ship before it is fried. The Babyeaters experience no net change in their position, and the Superhappies have made a net loss (by discovering unhappiness in the universe and being made unable to fix it). Humanity has met a more powerful force with a very different value system that wishes to impose values on other cultures, but has achieved a draw. Humanity remains ignorant of suffering - again a draw, when the only other options are to lose in some way (either by imposing values when we feel we have no right, or by knowingly allowing suffering).

Of course the Confessor might wish to first transmit a message back to Earth that neglects to mention any Babyeaters, warns of the highly dangerous 'Superhappies', and perhaps describes them falsely as super-powerful babyeaters (à la the Alderson physicists) to prevent anyone from being tempted to find them, thereby preventing any individual from sacrificing the human race's control of its own values...

I guess it depends on whether he believes the 'right to choose your own species' values' ranks above the 'right to experience endless orgasms'. If he truly has no preference for either, he might as well consider everyone in the room dangerously highly strung and emotional, and an unsuitable sample size to make decisions for humanity. In that case, perhaps he should stun everyone in the control room and cause the ship to return to Earth, if he is able to do so, to tell humanity what has happened in full detail. This at least allows the decision to be made by a larger fraction of humanity.

A final practical point. So far, the people on the ship only know what they have received in communications or what they can measure with their sensors. In fact, we can't trust either of these things; a sufficiently advanced species can fool sensors, and any species can lie. We can observe that the Superhappies are clearly more technologically advanced from the evidence of the one ship present, and their growth rate suggests they can rapidly overpower humanity. Humanity has no idea what the Superhappies will really do when they return. In fact, if they wish, they might simply turn all humans into Superhappies and throw away all human values, without honouring the deal. They could torture all humans till the end of time if they wish, or turn us into babyeaters. Equally, we know there is a race that is pleased to advertise that it eats babies and wishes to encourage other races to do the same; and we know that they have one quite advanced ship that is slightly technologically inferior to ours; what else they have, we don't really know. Perhaps the Babyeaters have better crews and ships back home. Perhaps the Babyeaters have advanced technology that masks the real capabilities of their ship. All we have is a single unreliable sample point for each of two advanced civilisations with very different value systems. What we have here is a giant knowledge gap.

The only thing we know for certain is that the Superhappies are almost certainly technologically superior to humanity and can basically do whatever they want to us, unless the sun is blown up. And we know that the Babyeaters have values that are culturally unacceptable to us, and we don't know whether they might really have the ability to impose those values on us or not. Given this knowledge of these two dangerous forces, one of which is vastly superior and one of which is advanced and might later turn out to be superior, if humanity can achieve a 'zero loss outcome' for itself by blowing up the sun, it is doing rather well in such an incredibly dangerous situation. Humanity should take advantage of the fact that the Superhappies already placed a 'cooperate' card on the table and allowed us to decide what to do next.

> Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb.

You guys are very trusting of a super-advanced species that has already shown a strong willingness to manipulate humanity with superstimuli and pornographic advertising.

Assuming the Lord Pilot was correct in saying that, without the nova star, the Happy Fun People would never be able to reach the human starline network
...and assuming it's literally impossible to travel FTL without a starline
...and assuming the only starline to the nova star was the one they took
...and assuming Huygens, described as a "colony world", is sparsely populated, and either can be evacuated or is considered "expendable" compared to the alternatives

...then blow up Huygens' star. Without the Huygens-Nova starline, the Happy People won't be able to cross into human space, but the Happy-Nova-Babyeater starline will be unaffected. The Happy People can take care of the Babyeaters, and humankind will be safe. For a while.

Still not sure I'd actually take that solution. It depends on how populated Huygens is and how confident I am the Super Happy People can't come up with alternate transportation, and I'm also not *entirely* opposed to the Happy People's proposal. But:

If I had a comm link to the Happy People, I'd also want to hear their answer to the following line of reasoning: one ordinary nova in a single galaxy just attracted three separate civilizations. That means intelligent life is likely to be pretty common across the universe, and our three somewhat-united species are likely to encounter far more of it in the years to come. If the Happy People keep adjusting their (and our) utility functions each time we meet a new intelligent species, then by the millionth species there's not going to be a whole lot remaining of the original Super Happy way of thinking - or the human way of thinking, for that matter. If they're so smart, what's their plan for when that happens?

If they answer "We're fully prepared to compromise our and your utility functions limitlessly many times for the sake of achieving harmonious moralities among all forms of life in the Universe, and we predict each time will involve a change approximately as drastic as making you eat babies," then it will be a bad day to be a colonist on Huygens.
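The dilution worry is easy to put in numbers. Assuming, purely for illustration, that each meeting averages the current value system 50/50 with the newcomer's (the story never specifies the weights):

# Weight remaining on the original value system after n pairwise
# compromises, assuming each meeting is a 50/50 average (an assumption
# for illustration only).
def original_weight(n_meetings, keep=0.5):
    return keep ** n_meetings

for n in (1, 10, 20):
    print(n, original_weight(n))
# 1   0.5
# 10  0.0009765625
# 20  0.00000095...

By the millionth species, essentially nothing of either the Super Happy or the human starting point survives.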

If they're going to play the game of Chicken, then symbolically speaking the Confessor should perhaps stun himself to help commit the ship to sufficient insanity to go through with destroying the solar system.

Attempting to paraphrase the known facts.

1. You and your family and friends go for a walk. You walk into an old building with 1 entrance/exit. Your friends/family are behind you.

2. You notice the door has an irrevocable self-locking mechanism if it is closed.

3. You have a knife in your pocket.

4. As you walk in you see three people dressed in 'lunatic's asylum' clothes.

5. Two of them are in the corner; one is a guy who is beating up a woman. He appears unarmed but may have a concealed weapon.

6. The guy shouts to you that 'god is making him do it' and suggests that you should join in and attack your family who are still outside the door.

7. The 3rd person in the room has a machine gun pointed at you. He tells you that he is going to give you and your family 1000000 pounds each if you just step inside, and he says he is also going to stop the other inmate from being violent.

8. You can choose to close the door (which will lock). What will happen next inside the room will then be unknown to you.

9. Or you can allow your family and friends into the room with the lunatics at least one of whom is armed with a machine gun.

10. Inside the room, as long as that machine gun exists, you have no control over what actually happens next in the room.

11. Outside the room, once the door is locked, you also have no control over what happens next in the room.

12. But if you invite your family inside, you are risking that they may be killed by a machine gun or may be given 1 million pounds each. But the matter is in the hands of the machine-gun-toting lunatic.

13. Your family are otherwise presently happy and well adjusted and do not appear to NEED 1 million pounds, though some might benefit from it a great deal.

Personally in this situation I wouldn't need to think twice; I would immediately close the door. I have no control over the unfortunate situation the woman is facing either way, but at least I don't risk a huge negative outcome (the death of myself and my family at the hands of a machine gun armed lunatic).

It is foolish to risk what you have and need for what you do not have, do not entirely know, and do not need.

Anonymous Coward's defection isn't. A real defection would be the Confessor anesthetizing Akon, then commandeering the ship to chase the Super Happies and nova their star.
--

Your defection isn't. There are no longer any guarantees of anything whenever a vastly superior technology is definitely in the vicinity. There are no guarantees while any staff member of the ship besides the Confessor is still conscious, and it is a known fact (from the prediction markets and the people in the room) that at least some of humanity is behaving very irrationally.

Your proposal takes an unnecessary ultimate risk (the potential freezing, capture, or destruction of the human ship upon arrival, leading to the destruction of humanity - since we don't know what the Superhappies will REALLY do, after all) in exchange for an unnecessary minimal gain (the attempt to reduce the suffering of a species whose technological extent we don't truly know, whose value system we know to be in at least one place substantially opposed to our own, and of whom we can remain ignorant, as a species, by the anaesthetised self-destruction of the human ship).

It is more rational to take action as soon as possible to guarantee a minimum acceptable level of safety for humankind and its value system, given the unknown but clearly vastly superior technological capabilities of the Superhappies if no action is immediately taken.

If you let an AI out of the box and it tells you its value system is opposed to humanity's and that it intends to convert all humanity to a form that it prefers, then it FOOLISHLY trusts you and steps back inside the box for a minute, then what you do NOT do is:

- mess around
- give it any chance to come back out of the box
- allow anyone else the chance to let it out of the box (or the chance to disable you while you're trying to lock the box).

Anonymous

Probably I have watched too much Star Trek, but it is hard to shake the suspicion that both the Superhappies and the Babyeaters are sockpuppets for some kind of weakly godlike entity messing around with us for a laff...

"You can't always get what you want" is one of the few samples of true rationality in this mess, I don't see why Akon seemed to completely forget that notion.

The happies, humans, and babyeaters cannot reach a quick conclusion that everyone will be satisfied with. In short, they should all just suck it up. All species have their little hiccups from the point of view of the others.

Becoming a pain-feeling superhappy babyeater - or any other combination imaginable - should be a choice made available to members of all races. This might lead somewhere, or it might not. The happies should feel ashamed of their rash reaction (although the babyeaters will forgive them for this "reasonable mistake"), and after that all three races should continue their personal sufferings until they inevitably find a way to come to terms with it, especially given the way they would be exposed to each other culturally in the meanwhile. It's fair, rational, and offers a long-term way out.

The happy solution is essentially to fight wars until something gives.

Trying to force a solution is hardly rational, which should actually be obvious to all sides, especially the happies. Babyeater society and culture will suffer terrible devastation through a war which they will quickly lose, and I fail to see how putting the rest of the babyeater children through war (in which many will die, painfully) can be called a "definite improvement". (Many babyeaters would probably just quietly go on eating babies anyway.) It should be completely obvious to the happies that at least a significant part of humanity will not turn itself into babyeaters willingly, which will result in another war. The same may actually go for the happies too, so they also risk a civil war - and for what? To have three cultures that eat babies for purely symbolic reasons. Makes no sense.

Cultural standards that are forced upon people will be rejected both among the humans and the babyeaters. There will inevitably be continuous rebellions, and the happies would keep enforcing their ideas, which will result in a cycle of wars until the happies leave for one reason or the other, and then there will be civil wars among the humans and babyeaters trying to decide for themselves whether to return to the "old ways" or keep the new alien ways (which would have significant support by then).

And that's just the start of a cycle of wars between three breeds, now on a path of revenge. At this point, the only sensible solution is to go supernova, breaking the connection. There's a better chance of the babyeaters finding a path out of their most significant cultural issue, the babyeating, than there is of the three breeds learning to live together once one starts enforcing itself upon the others in the proposed magnitude. The amount of fighting would eclipse the suffering of the babyeater children, thus being pointless.

Blowing the star right now might only be a temporary solution. The happies might find other connections, if they put their minds to it. They obviously travel and develop fast, so it might not even take that long, and they have a lot of data from the humans and babyeaters to work with. With luck the babyeaters might get over their little cultural hiccup before that, but that's not exactly a sound ethical foundation. (It beats the hell out of starting wars though.)

This war must be stopped before it starts, or at least an attempt must be made (as the humans can't just force the happies to do anything).

They could attack the happies as a show of "we are willing to die for their right to their values, as much as we loathe them". It could also remind the happies that killing is not so much fun once you have to do it to people who are not acting for "selfish" reasons and are not simply "wrong" - and serve as a reminder of what a mess of wars they're about to create. They could kill themselves. And yes, they could go and blow up the Superhappy star as a last resort, hoping that the happies couldn't recover.

It's terribly pompous to think that just because all cultures are happy with the way they are now, they are somehow superior to the cultures that preceded them, let alone to the ones someone else has. We think ourselves superior because our standard of living has improved and we like things the way they are better than what we know of the way they were. The only way to compare is to try. We cannot try previous cultures, but the happies should, for the sake of argument, at least try living more like the babyeaters. However, if they change EVERYBODY, there is once again no comparison.

If the happies argue that the babyeaters will learn to be satisfied without babyeating, then by the same reasoning the happies should be able to learn to be satisfied eating babies, or at least to live with the idea of the babyeaters being babyeating.

So to return to the point: options for trying different aspects of the three cultures should be made available, and it could probably be agreed that members of each species must be found who are willing to go for this experiment. (It shouldn't be too hard, really; all volunteers would be doing it so that, hopefully, others wouldn't have to.) The happy technology should even make it possible to complete this experiment to satisfactory levels surprisingly quickly.

There's a chance this experiment wouldn't produce satisfactory results, but it should be tried before warfare.

Yeah, I know, terribly boring for this topic, but whatever.

After I wrote the first ending, I realized that there was another possibility, so I wrote a second ending as well. The Overcoming Bias readership will be given a chance to determine which one becomes the True Ending and which the alternative - though it won't be as easy as a vote.

  1. ... and anesthetized the Lord Pilot.
  2. ... [This option will become the True Ending only if someone suggests it in the comments before the previous ending is posted tomorrow.  Otherwise, the first ending is the True one.]

Insofar as definitions can be right or wrong, so also counterfactual consequences can be right or wrong, and thus fictional evidence can be right or wrong...

Well, I'm glad the story wasn't ruined by the alternative being too obvious. If no one's thought of it yet in the comments, then it's at least plausible that the people on the ship didn't think of it earlier.

So the rightnesses of the two bodies of fictional evidence in the two endings both depend on the audience's skill at applied metaethics? And you want to increase the expected rightness of the true ending by correlating the true ending with the audience's unknown skill? Or by giving the audience an incentive to increase their skill?

(I don't know the solution. This comment reasoning about your motives is to narrow the search space. Plus it proposes a meaning for your otherwise unexplained term "True".)

Go back to earth and detonate.
Obviously these two species are superior, having not destroyed themselves when they had the means.
They should remain uncorrupted.

Since this is fiction (thankfully, seeing how many might allow the Superhappies the chance they need to escape the box)... an alternative ending.

The Confessor is bound by oath to allow the young to choose the path of the future no matter how morally distasteful.

The youngest in this encounter are clearly the babyeaters, technologically (and arguably morally).

Consequently the Confessor stuns everyone on board, pilots off to Baby Eater Prime and gives them the choice of how things should proceed from here.

The End

They should go back to colony system Huygens and detonate.

Meanwhile, the Arabs and the Jews, communicating through the exclusive channel of the Great Khalif O. bin Laden negotiating through Internet Sex with Tzipi Livni, arrived at this compromise whereby the Jews would all worship Mohammed on Fridays, at which times they will explode a few of their children in buses, whereas the Arabs would ratiocinate psychotically around their scriptures on Saturdays, and spawn at least one Nobel prize winner in Medicine and Physics every five years.

The chance of running into two alien species in one day seems pretty unusual. Perhaps it means something?

The chance of running into two alien species in one day seems pretty unusual. Perhaps it means something?

That is precisely what makes me think they are sockpuppets of a single entity (even within the story universe, not just in the sense that Eliezer invented them).

Nova-ing the star isn't IMO a guarantee of no future contact - there may be other starlines that aren't discovered yet. Also, the SuperHappies may improve their tech over time, and may find ways of no longer needing starline tech.

Also, if there are three civilizations, odds are there are a lot more. The SuperHappies have a better structure to compete and grow with whatever other galactic superpowers exist out there.

In essence, the "closed locked door" is an illusion in my mind. Not something to base strategy on. It is the kind of thing that primitive 21st century humans would think of, and not the kind of option that an advanced 26th century human should consider viable. Were I the Confessor (and by implication, that is the role we 21st century readers are supposed to play), I would zap the Engineer, because he's building a house made of straw and taunting the big bad wolf.

But in the context of the story as it stands, this option is pointless, since the commander has already made his decision. Zapping the pilot is equally pointless, unless no one else is able to move the ship. That may be a defect of the story, or it may be deliberate.

Option 1 is to cooperate, so I guess option 2 is defect. The correct way to defect is to destroy Huygens.

Of course, meeting two new species on the same day is the crew of the Impossible having its leg pulled by some superior entity, namely Eliezer. But Eliezer is not above and outside *our* world, and we don't have to let ourselves be intimidated by his scripture.

Why and how would communication possibly happen through only one channel? Since when is the unit of decision-making a race, species, nation, etc., rather than an individual? Is this market-driven spaceship under totalitarian control, where no one is allowed to communicate, and the whole crew too brain-damaged to work around the interdiction? I wonder how the Soviet Union made it to the Interstellar Age. Where has your alleged individualism gone?

Why and how is compromise even possible between two species, much less desirable? In an encounter of several species, the most efficient one will soon hoard all resources and leave the least efficient ones as nothing but defanged zoo animals, at which point little do their opinions and decisions matter. No compromise. The only question is who's on top. Dear Tigers, will you reform yourselves? Can we negotiate? Let your Great Leader meet ours and discuss, man to animal, around a meal.

And of course, in your fantasy, the rationalist from way back when (EY) effectively wields the ultimate power on the ship, yet is not corrupted by power. What a wonderful saint! Makes you wonder what kind of wimps the rest of mankind has degenerated into, to submit to THAT wimpy overlord. Where has your understanding of Evolutionary Forces gone?

Wanna see incredibly intelligent people wasting time on absurd meaningless questions? Come here to Overcoming Bias! A stupid person will believe in any old junk, but it takes someone very intelligent to specifically believe in such elaborate nonsense.

The nova acted as a rendezvous signal, causing all starlines connected to that star to flare up. Otherwise it's too hard to find aliens - opening starlines is expensive. It's the chance of a direct encounter (small) versus the chance of at least one mutual neighbor (larger).
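That direct-versus-mutual-neighbor comparison can be sanity-checked with a rough Monte Carlo sketch; the counts of stars and opened starlines below are invented for illustration, not derived from the story.

import random

# Chance that two civilizations have opened a starline directly to each
# other, versus the chance that they share at least one neighbor star.
N, k, trials = 100_000, 50, 10_000  # stars, starlines per civilization, runs
direct = shared = 0
for _ in range(trials):
    a = set(random.sample(range(N), k))  # stars A has opened lines to
    b = set(random.sample(range(N), k))  # stars B has opened lines to
    home_b = random.randrange(N)         # B's home star
    if home_b in a:
        direct += 1   # A happened to open a line straight to B
    if a & b:
        shared += 1   # both opened a line to some mutual star
print(direct / trials, shared / trials)
# roughly k/N = 0.0005 versus 1-(1-k/N)^k = about 0.025: fifty times larger

So a flare-up at a mutual neighbor is the plausible first-contact channel even when direct encounters are vanishingly rare.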

And while I'm at it -- confusing pleasure and happiness is particularly dumb. Entities that did that would be wiped from existence in a handful of generations, not become super-powerful. Habituation is how we keep functioning at the margin, where the effort is needed. The whole idea of a moral duty to minimize other people's pain is ridiculous, yet it is taken for granted in this whole story. Eliezer, you obviously are still under the influence of the Judeo-Christian superstitions you learned as a child.

If you're looking for an abstract value to maximize, well, it's time to shut up and eat your food. http://sifter.org/~simon/journal/20090103.h.html

From the fact that the physicists covered up knowledge that they thought was too dangerous for humanity to possess, the crew should immediately deduce that this could have happened several times in the past regarding several topics. The most obvious topic is AGI, so they should search their Archive for records of AGI projects that seemed promising but were mysteriously discontinued.


The nova acted as a rendezvous signal, causing all starlines connected to that star to flare up. Otherwise it's too hard to find aliens - opening starlines is expensive. It's the chance of a direct encounter (small) versus the chance of at least one mutual neighbor (larger).

Even so, for reasons of which you are very well aware, meeting two sets of aliens should be a _lot_ less likely than meeting one set, so we ought to take that into account when we are trying to make sense of what is going on. But I accept that positing a minor god is rather a primitive reaction, especially as we already know that in your impossible possible world no Singularity is reachable by any means currently envisaged.

"Carl - I'm pretty sure either way we get three more chapters."

Yes, but I was more worried that we'd only get three more chapters...;-)

---

Anyway, another reason the Confessor should interfere in this process is that they are awful at bargaining. If they follow through with the deal they will (initially) be seriously depressed about having to kill their own children, there's the risk of war or oppression of those who do not want to be augmented, and what do they get in return from the happies?

"We are willing to change to desire pleasure obtained in more complex ways, so long as the total amount of our pleasure does not significantly decrease. We will learn to create art you find pleasing. We will acquire a sense of humor, though we will not lie. From the perspective of humankind and the Babyeaters, our civilization will obtain much utility in your sight, which it did not previously possess. This is the compensation we offer you."

I don't see the value in this; if one wants more entertaining art and jokes, why not simply accept the augmentation and come up with them yourselves?

Given that the first installment mentions that Akon's words would be "inscribed for all time in the annals of history", any internally consistent conclusion would have to feature some subsequent contact with humanity.

Pardon my lapse in fourth-wall etiquette.

What about following the SuperHappies to their first hop, then making THAT star go supernova? That way, they're cut off, but the humans still have a small chance to 'save' the babyeaters. Or vice-versa.

Peter, destroying Huygens isn't obviously the best way to defect, as in that scenario the Superhappies won't create art and humor or give us their tech.

Pain and pleasure are *signals* that we are on the wrong or right path. There's a point in making it a better signal. But the following propositions are wholly absurd:
* to eliminate pain itself (i.e. no more signal)
* to bias the system toward either more or less pain on average (i.e. bias the signal so it carries less than 1 bit of information per bit of code; see the entropy sketch after this list).
* to forcefully arrange for others to never possibly have pain in their own name (i.e. disconnecting them from reality, denying their moral agency -- and/or obeying their every whim until reality strikes back despite your shielding).
* to feel responsible for other people's pain (i.e. deny the fact that they are their own moral agents).
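On the 'less than 1 bit per bit of code' aside - that is just the Shannon entropy of a biased binary signal, standard information theory rather than anything specific to this thread:

from math import log2

def binary_entropy(p):
    # Bits of information per symbol of a binary signal with bias p.
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# An unbiased pain/pleasure signal carries a full bit per symbol;
# bias it toward "everything is fine" and the information content drops.
for p in (0.5, 0.9, 0.99, 1.0):
    print(p, round(binary_entropy(p), 4))
# 0.5   1.0
# 0.9   0.469
# 0.99  0.0808
# 1.0   0.0   <- no pain at all: the signal tells you nothing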

As for promising a world of equal happiness for all, shameless self-quote:
"Life is the worst of all social inequalities. To suppress inequalities, one must either resurrect all the dead people (and give life to all the potential living people), or exterminate all the actually living. Egalitarians, since they cannot further their goal by the former method, inevitably come to further it by the latter method."

A rational individual has no reason to care for the suffering of alien entities, or even other human entities, except inasmuch as it affects his own survival, enjoyment, control of resources.

Reasonable options depend on how much the Superhappies really know. If they really know enough to make this a Newcomb-like problem, any defection against them is going to make them blow up the Impossible Possible World before it can jump.
That situation might leave one possible option: rally the earth fleet now, invade the Babyeater starline network before the Superhappies do, and for each star have all but one vessel follow open starlines, with the straggler detonating the local star.
This relies on humanity having more readily available vessels with >= medium-sized Alderson drives than the Babyeaters have settled systems.
Afterwards, cooperate with the Superhappies; compromising with one alien species will dilute human values less than compromising with two. The Superhappies might judge this as conflicting with their goals in any case; I don't really understand Superhappy morality.

If this situation is in fact not Newcomb-like, aside from detonating Huygens there is the option of rallying the human fleet, jumping to an uninhabited system, having all but one vessel jump one system further - opening new starlines in both cases if necessary - and detonating the first star.
The humans on the ships in question will then stay human, and can rebuild a human civilization somewhere suitable.
Have the rest of humanity compromise with the Babyeaters and Superhappies. This leaves a civilization optimizing for the pure Good Thing, as opposed to the AVG(Good,Babyeating,Superhappy) thing, and doesn't outright kill any existing humans, which may not be possible in the case of blowing up Huygens.

Steven, the Superhappies will still create art as part of a compromise with the Babyeaters. But yes, we would miss out on their technology.

Life is the worst of all social inequalities. To suppress inequalities, one must either resurrect all the dead people (and give life to all the potential living people), or exterminate all the actually living. Egalitarians, since they cannot further their goal by the former method, inevitably come to further it by the latter method

This seems more like a neat epigram than a thesis with a basis in fact. Have you any evidence for its truth? I am by no means an egalitarian, but N.B. that in fact the communists did not kill everyone in Russia. Similarly for China. And it is not as if anti-egalitarians have not killed quite a lot of people too...
