
February 04, 2009

Comments

This is the original ending I had planned for Three Worlds Collide.

After writing it, it seemed even more awful than I had expected; and I began thinking that it would be better to detonate Sol and fragment the human starline network, guaranteeing that, whatever happened in the future, true humans would continue somewhere.

Then I realized I didn't have to destroy the Earth - that, like so many other stories I'd read, my very own plot had a loophole. (I might have realized earlier, if I'd written part 5 before part 6, but the pieces were not written in order.)

Tomorrow the True Ending will appear, since it was indeed guessed in the comments yesterday.

If anyone wonders why the Normal Ending didn't go the way of the True Ending - it could be because the Superhappy ambassador ship got there too quickly and would have been powerful enough to prevent it. Or it could be because the highest decision-makers of humankind, like Akon himself, decided that the Superhappy procedure was the categorically best way to resolve such conflicts between species. The story does not say.

If those are the two endings, then that definition procedure for the term "True Ending" was not very meta-ethically instructive.

Awww... I was so looking forward to a heartwarming description of a family feast some years later where Akon delivers a toast to the SHs for "finally restoring the natural order of things."

HAPPY ENDING.

How long will the superhappy-human-babyeater conglomerate last? How many other species will they meet in the universe? How arbitrary can and will the aesthetics, morality, and utilities of those new species be? If they are arbitrary enough, and enough of them are met, what will the resulting compromises look like?

Depending on how many goals, values, etc. are more or less universal - and some would perhaps be, since most if not all of those species will have come into being through evolution in the same universe - those are the only things that'll remain, the only values and particularities. As the rest is arbitrary, the averaging will probably cancel any subtlety out.

The longer this goes on, the more monomaniacal and bland the resulting compromise will become. In the end, you'll have something like orgasmium, optimized for maybe a handful of values that were shared by a majority of those species. The rest, noise. Would that be OK?

Up next: Three Worlds/Unlimited Blade Works!

I hope the Confessor gets more face time. He's so badass.

I have my own doubts, but I don't think it would have exactly that effect.

Remember, the Superhappies actually adopted some of the values of the humans and Babyeaters; it seems to be a volume-conserving operation, not a set intersection. Not, I think, that that makes it very much better.

I have a very strong personal motivation for making the moral assertion, "Diversity is good". I am transsexual, often meet people who have never met a TS before and am rarely in a group which is majority TS. Yet, I do believe in it as a moral requirement. If we are all the same, we all have the same blind spots. If we are all different, we see different things, and this is good, and interesting, and in our interests.

I rather hope that any more powerful alien race we meet will also value diversity as a moral good. I believe it is a moral good even when, as during the Scramble for Africa, almost no one, or no one at all, believes it.

I don't expect humanity to ever encounter any aliens - I would guess that the explanation for the Fermi Paradox is that life is rare, but I can easily see how a civilization built out of colliding values could continue to occupy the fun border between complexity and chaos. If one of the contributing species valued that sort of thing, and the others didn't object.

Or, wait... Finding the plot hole that permits the other ending takes searching. If no commenter had preferred the other ending strongly enough, none would have searched deeply enough to find it. Was the meta-ethics test only that?

Steve, there's no incredibly deep metaethical lesson in the fact that I, as author, decided that the second ending would only be the True one "that actually happened" if a reader thought of it. I just wanted to take advantage of the blog format to offer a choice a bit more interactive than picking "1" or "2".

The most important advice you can offer to a rationalist is to avoid motivated skepticism; the second most important advice is not to overcomplicate things. Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

On the other hand, no one has decoded the names of the ships yet, so I guess there's also something to be said for looking deeper.

I am the core of my mind.
Belief is my body and choice is my blood.
I have revised over a thousand judgments.
Unaware of fear
Nor aware of hope.
Have withstood pain to update many times
Waiting for truth's arrival.
This is the one uncertain path.
My whole life has been...
Unlimited Bayes Works!
My whole life has been...

Should read:

So, as I strive...

(The original is idiomatic and hard to filk cleanly.)

It's rather funny to see this ending described as awful by Eliezer, who, at the same time, endorses things such as:
In my head I have an image of the parliament of volitional shadows of the human species, negotiating a la Nick Bostrom. The male shadows and the female shadows are pretty much agreed that (real) men need to be able to better read female minds; but since this is a satisfaction of a relatively more "female" desire - making men more what women wish they were - the male shadows ask in return that the sex-drive mismatch be handled more by increasing the female sex drive, and less by decreasing male desire...

So, intraspecies convergence of values is somehow ok, but interspecies isn't?

The trouble is that some years later Akon is not a superhappy baby-eating human but rather a hodge-podge of zillions of values. The Superhappy population or resources can double every 35 hours at current tech. Their tech advances much faster than human tech does at current population. This is their first encounter at current tech and population, but in a year they will probably encounter and mix with over 2^240 new species!

More practically, severing the human starline system, in addition to being a cliche, seems very positive values utilitarian and very anti-CEV in that it imposes a decision to maintain disunion and thus the continued existence of true humans upon all future human generations. I see the appeal, but it doesn't seem to minimize the ratio of bad human worlds to good human worlds in a big universe. Really I can't seem to look away from the instant doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.
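The doubling figure in the comment above is easy to sanity-check (a rough sketch; the 35-hour doubling time is taken from the comment, and the one-year horizon is assumed to be 365 days):

```python
# Sanity check of the exponential-growth arithmetic: one doubling
# every 35 hours, sustained over a year.
HOURS_PER_DOUBLING = 35
HOURS_PER_YEAR = 365 * 24  # 8760

doublings_per_year = HOURS_PER_YEAR / HOURS_PER_DOUBLING
print(round(doublings_per_year))  # 250 doublings per year, i.e. ~2^250

# The comment's 2^240 figure corresponds to about 350 days at that rate:
print(240 * HOURS_PER_DOUBLING / 24)  # 350.0 days
```

So the comment's "over 2^240 in a year" is, if anything, a slight underestimate at the stated rate.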

I don't see this ending as awful at all, except of course for the suicides. But a quarter of the ship's crew, with even higher rates among the general population? That strikes me as unrealistically high. For most people, it takes a lot to be pushed over the edge.

I also note that this is part 6. That means either that the true ending is in two parts, or that there'll be Something Completely Different as part eight, maybe an "author's comments" or some such.

Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

Sometimes, a cigar is just a cigar.

How can the superhappies not see THAT happening?

Martin

Mike said: Really I can't seem to look away from the instant doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.

Maybe the universe itself grows exponentially faster than populations of life.

25% suicide rate? Over something completely abstract that they haven't felt yet?

You didn't tell us about humans having been overcome by some weird Death Cult.

But, now it makes sense why they would give power to the Confessor.

Obviously, in this fantasy of a would-be-immortal 21st-century abstract thinker, your immortal 21st-century abstract thinkers are worshipped as gods. And unhappily, they were told too much about Masada and other Kool-Aid when they were young.

There comes your Judeo-Christian upbringing again, in addition to the intellectual masturbation.

Eliezer -- get a life! The worst thing that ever happened to your intelligence was to be disconnected from reality by too early success.

"Just as they would regret not eating the tiny bodies of the infants." is one of the more moving passages I've read in a long time. Well done Eliezer.

Why did the Superhappies adopt the Babyeaters' ethics? I thought that they exterminated them.
Or is 6/8 an alternative to 5/8 instead of its sequel?

It might be better to number the sections 1, 2, 3, 4, 5A, 6A, 5B, 6B.

Has anyone else noticed that in this particular 'compromise', the superhappies don't seem to be actually sacrificing anything?

I mean, their highest values are being ultra super happy and having sex all the time, and they still get to do that. It's not as if they wanted not to create literature or eat hundreds of pseudochildren. Whereas humans will no longer get to feel frustrated or exhausted, and babyeaters will no longer get to eat real children.

I don't think the superhappies are quite as fair-minded as Akon thought. They agreed to take on traits of humanity and babyeating in an attempt to placate everyone, not because it was a fair trade.

Sure. To the Superhappies, letting a sentient being experience pain or discomfort is evil. Since they're the strongest, why would they willingly do something they consider to be evil?

Akon isn't entirely wrong, though. The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them. But they didn't. That does show a certain degree of fair-mindedness that humans probably wouldn't have shown had they been in the same position.

The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them.

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

If we sufficiently value episodes of aesthetic appreciation (in general, not only when done by us), etc., then the "compromise" could be a net positive, even from the perspective of our current values.

(But perhaps the point is that our values are in fact not so agent-neutral.)

Regarding ship names in the koan...

Babyeaters: http://en.wikipedia.org/wiki/Midshipman's_Hope. Haven't read, just decoded from the name in the story.

But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring. It should be known to many people on this thread, but it's been about 10 years since I last read it. Asimov, The Gods Themselves.

Anonymous.

There seems to be a fairly large contingent of humanity who regard self-determination as the most significant terminal value, to roughly the same single-minded extent that the Babyeaters regard baby eating; including a willingness to sacrifice every other moral value to very large degrees in its favor. I assume many of the suicides fell into this group.

While not universal among humanity as baby eating is among the Babyeaters, the concept should have been fairly explicit in at least some of the cultural material transmitted. I wonder, were the Superhappies being willfully oblivious to this value, considering the extent to which they willingly violate it?

I'm with Kaj Sotala in not finding this ending to be awful.

The prospect of never feeling pain again would not in the least disturb me. Oh, I may 'enjoy' the pain of a good workout, but only because I believe it will help to reduce or postpone more pain later on.

The babyeating is weird, but we are talking about being transformed to want to do that, not being forced to do something we would actually find disgusting.

What's the trouble there? I don't regret that my past self was unable to forever prevent my current self from enjoying brussels sprouts.

John Maxwell:

No, they are simply implementing the original plan by force.

When I originally read part 5, I jumped to the same conclusion you did, based presumably on my prior expectations of what a reasonable being would do. But then I read nyu2's comment which assumed the opposite and went back to look at what the text actually said, and it seemed to support that interpretation.

Actually, I'm not sure if that's what I thought about their intentions towards the babyeaters, but I at least didn't originally expect them to still intend to modify themselves and humanity.

...with babyeater values.

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

Wouldn't that be The Player of Games, from Banks? Would kinda make sense, no?

"Why did the Superhappies adopt the Babyeaters' ethics? I thought that they exterminated them."

They only exterminated the one ship so that it wouldn't blow up the star.

Regarding the ships' names:
"Impossible Possible Worlds" would point to Heinlein's The Number of the Beast.

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

Hmm, I just noticed that there's a slight contradiction here:

"I know. Believe me, I know. Only youth can Administrate. That is the pact of immortality."

Then how is it possible for there to be such a person as a Lord Administrator, if the title takes 100 years to obtain? While a civilization of immortals would obviously redefine its concept of youth, it seems like a stretch to call a centenarian young if 500 is still considered mind-bogglingly old.

Daniel, is it a stretch to call a 20-year-old young if you would be impressed to meet a 100-year-old? Though the actual relation would be more like "Akon is 30 years old, the Confessor is a 90-year-old survivor of a famous catastrophe."

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

Yes, I believe I already identified the story in the final sentence of my post. But thanks anyway for clarifying it for those that didn't keep reading till the end :-)

Anonymous.

"Normal" End? I don't know what sort of visual novels you've been reading, but it's rare to see a Bad End worse than the death of humanity.

"Ion" Banks was cute. I'm finally catching up on this series days late, so it's astonishing that nobody else got that one. (But that's the only one I got.)

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

The bargain the Superhappies are offering is to change you less than if they had simply changed you by force. I'm guessing that if the humans hadn't agreed to the deal, the Superhappies would have either exterminated them completely or converted them completely to Superhappy values.

The benefit of Superhappies changing themselves to appreciate literature and beauty is that when they convert you, you get to keep the part of you that appreciated literature and beauty.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Actually, it won't be against your will, because by then you will have the same values as they do (you're all merged now, remember?).

