
February 01, 2009

Comments

I'm waiting for the super happy people to bust out Unlimited Blade Works, any time now.

The distinction between kiritsugu and Confessor is quite interesting. It hadn't quite occurred to me that there could be distinct branches of rationalism. That is to say, I knew that two rational agents could disagree about something -- that's a consequence of Bayes -- but it hadn't quite occurred to me that the space of rational agents was quite that large.

I imagine you've covered some of this in your work on metaethics; I should probably dig that up.

Apparently the Super Happy race has adopted Knuth arrow notation more broadly than we have.
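
(For anyone who hasn't run into it, Knuth's up-arrow notation is just iterated exponentiation; a quick sketch of how fast it grows - and note that 2↑↑4 = 2^(2^(2^2)) = 2^16 = 65,536, which may be where the suspiciously tidy fleet figure later in the thread comes from:)

3↑3 = 3^3 = 27
3↑↑3 = 3^(3^3) = 3^27 = 7,625,597,484,987
3↑↑↑3 = 3↑↑(3↑↑3), a power tower of 3s more than seven trillion levels tall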

I'd be surprised at humans not making childbirth non-painful once we have the technology; we already do the best we can at the moment (epidurals, etc.). I mainly suspect we will build artificial wombs as well, although that may require mucking about with female biochemistry to convince mothers they are pregnant when they are not, so that they bond properly with the newborns. Pregnancy is not fun, from what I can tell.

Pain is a signal that something might be going wrong in the evolutionary game, that you might be causing yourself permanent damage or might need other people to help you. If you get rid of the evolutionary game, then you can get rid of pain.

Wait. Aren't they *right*? I don't like that they don't terminally value sympathy (though they're pretty close), but that's beside the point. Why keep the children suffering? If there is a good reason - that humans need a painful childhood to explore, learn, and develop properly, for example - shouldn't the Super Happy be convinced by that? They value other things than a big orgasm - they grow and learn - they even tried to forsake some happiness for more accurate beliefs - if, despite this, they end up preferring stupid happy superbabies to painful growth, it's likely we would agree. I don't want to just tile the galaxy with happiness counters - but if collapsing into orgasmium means the Super Happy, sign me up.

This was brilliant.

Brilliant, brilliant, brilliant! I've been a pretty avid follower of this blog for a long time now, but I think this is the first time I've commented. Hesitated because I always felt like I didn't have anything sufficiently wonderful or thought-provoking to say. But I can't help it now: brilliant! Waiting for the rest!

"We found unacceptable the alternative of leaving the Babyeaters be. We found unacceptable the alternative of exterminating them. We wish to respect their choices and their nature as a species, but their children, who do not share that choice, are unwilling victims; this is unacceptable to us. We desire to keep the children alive but we do not know what to do with them once they become adult and start wanting to eat their own babies. Those were all the alternatives we had gotten as far as generating, at the very moment your ship arrived."

Akon forgot to mention the possibility of trying to genetically modify the Babyeaters.

"Humankind, we possess a generalized faculty to feel what others feel. That is the simple, compact relation. We did not think to complicate that faculty to exclude pain. We did not then assign dense probability that other sentient species would traverse the stars, and be encountered by us, and yet fail to have repaired themselves. Should we encounter some future species in circumstances that do not permit its repair, we will modify our empathic faculty to exclude sympathy with pain, and substitute an urge to meliorate pain."

So the Lady 3rd says, basically, "modifying ourselves to exclude this pain would be hard, and we don't want to do it until all other options are proved harder."

Akon, then, should be able to say exactly the same thing, with equal truth, regarding the human pain that the Lady 3rd wants eliminated. She is too quick in dismissing this symmetry.

Or was Akon really trying to argue that the Lady 3rd's kind keep their sympathy-pain as a terminal value? He is described as thinking hastily on his feet, so I suppose it's plausible for him to make such a silly argument. And in that case it's plausible for the Lady 3rd to dismiss this argument as she did. But it's not plausible that Akon really thinks that humans want to hold on to their pain as a terminal value. Star Trek V notwithstanding, it's hard to imagine a human society of the kind depicted here feeling that kind of attachment to pain.

So, for this scene to be believable, Akon should very quickly realize that humans are reluctant to eliminate their pain because they don't know how to do so without interfering with other values. And that should be a reason that is immediately understandable to the Lady 3rd, because she has offered essentially the same justification for not proceeding immediately to eliminate her own sympathy-pain.

Of course, she should then offer immediately to figure out how to eliminate our pain for us if we can't do it ourselves. But it shouldn't be hard for her to see why we would be reluctant to trust her ability to do that without interfering with our other values, given what we've seen of their abilities to understand us so far. Her evident bafflement at our reluctance to modify ourselves is prima facie evidence that they do not understand us well enough that we would be willing to let them muck around with our source code.

Of course, she might still conclude that they do understand us well enough to modify us, even against our wishes. But it shouldn't be surprising to her that it would be against our wishes, at least at this stage of the encounter.

Tyrrell: The Babyeaters don't seem to exactly have genes, in the sense that we think about them. I didn't entirely understand how the information transfer/birth of the next generation works with regard to the crystal growth thing. Either way, though, there would seem to be a prisoner's dilemma of sorts here. I'm not sure about this, but let's say we could do unto the Babyeaters without them being able to do unto us - altering them (even against their will) for the sake of our values. Wouldn't that be a form of Prisoner's Dilemma with regard to, say, other species with different values than us, and more powerful than us, that could do the same to us? Wouldn't the same metarationality results hold? I'm not entirely sure about this, but...
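
(For concreteness, a minimal sketch of the payoff structure being gestured at here, with made-up numbers - "cooperate" meaning "leave the other species' values alone" and "defect" meaning "rewrite them to suit your own":)

                 They cooperate    They defect
We cooperate        (3, 3)            (0, 5)
We defect           (5, 0)            (1, 1)

Each side does better by defecting no matter what the other does, yet mutual cooperation beats mutual defection - which is exactly where the metarationality/superrationality arguments get their grip.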

Eliezer: "Humankind, we possess a generalized faculty to feel what others feel." Huh? Very good story, but the quoted bit is perhaps the single most puzzling thing I've seen so far in this entire sequence. If that line is meant to be interpreted as "all possible feelings", ie, really general... Then how would that work? Aren't specific types of feelings associated with specific types of cognitive hardware? How the fluff would they be able to feel all possible types of feelings that all possible types of feeling beings (that is, beings that do the sort of thing that we'd call "feeling") feel? I'm assuming stuff like "brain scan/emulate sections theirof" is not the type of tech that you're allowing in this setting, right?

(And yes, I know the MST3K mantra. As I said, this way is more fun, though! If we're instead just being annoying, and you'd rather we wait with these nitpicks until you finish the story, well... say so.)

> Either way, though, there would seem to be a prisoner's dilemma of sorts here. I'm not sure about this, but let's say we could do unto the Babyeaters without them being able to do unto us - altering them (even against their will) for the sake of our values. Wouldn't that be a form of Prisoner's Dilemma with regard to, say, other species with different values than us, and more powerful than us, that could do the same to us? Wouldn't the same metarationality results hold? I'm not entirely sure about this, but...

I'm inclined to think so, which is one reason I wasn't in favor of going to war on the Babyeaters: what if the next species that doesn't share our values is stronger than us - how would I have them deal with us? What sort of universe do we want to live in?

(Another reason being that I'm highly skeptical of victory in anything other than a bloody war of total extermination. Consider analogous situations in real life where atrocities are being committed in other countries, e.g. female circumcision in Africa; we typically don't go to war over them, and for good reason.)

Good story! It's not often you see aliens who aren't just humans in silly makeup. I particularly liked the exchange between the Confessor and the Kiritsugu.

This is truly fascinating. I am grateful to receive for free what I would have paid good money for.

> How the fluff would they be able to feel all possible types of feelings

They seem to be limited to mapping others' experiences to their own feelings for analogous experiences. For instance, they first mapped giving birth to pleasure. Hardly epic angelic universal empathy powers.

What else is there, though? How do you define the feelings "pleasure" and "pain", distinctly from "goals sought" and "things avoided"? How do you empathize with a really alien intelligence without mapping its behavior and experiences to your own?

All right, so we are headed for some variety of golden rule / mutual defense treaty, imposed to respect each other's values, simply because there is reason to believe, even if it is not provable, that there exists some OTHER force in the universe more powerful than the ones currently signing the treaty. This of course does not preclude "friendly" attempts to modify unwanted behaviors, which, added together with a "will to power", would likely have civilizations drifting toward a common position ultimately.

The aliens should be allowed to extend to human infants the invitation to leave their parents to join the alien civilization. As I understand it, this would result in every human infant leaving their parents, but that would indeed be the correct rational choice when one has the preferences that human infants have.

So every human infant would leave their parents, and it would be ethically correct to allow this. If humans want to reproduce in a way where their children don't immediately leave them, they should modify their reproduction techniques to produce children with preferences that allow such an outcome. (Or just start constructing new additions to the human race as adults to begin with, instead of producing children.)

I wonder why the Confessor is "too old" to lead. Maybe very old humans do not share the CEV of younger humans?

Aleksei, children are rarely enthusiastic about the idea of leaving their parents. Why would they trust the Super Happy People?

"And you should understand, humankind, that when a child anywhere suffers pain and calls for it to stop, then we will answer that call if it requires sixty-five thousand five hundred and thirty-six ships."

How would they hear it? They did not even know about humanity until just now, much less hear the calls for help of any human child. All they have to do is not go looking for miserable children, and they will not find any, or feel their suffering.

On a related note:
> For whatever reason, you currently permit the existence of suffering which our species has eliminated. Bodily pain, embarrassment, and romantic troubles are still known among you. Your existence, therefore, is shared by us as pain.

What did the Super Happy People do with every previous species they have encountered?

> Aleksei, children are rarely enthusiastic about the idea of leaving their parents.

I don't think young infants think in terms of parents and non-parents, especially if the Super Happy People make their offer before the infants have gotten accustomed to their parents.


> Why would they trust the Super Happy People?

Why do they trust their parents? How are the Super Happy People at a disadvantage?

If there is distrust to overcome, the Super Happy People would probably arrange for the infants to directly feel their intentions and feelings.

Aleksei, do you mean they would have sex with the children once and then ask them if they'd like to leave their parents and have sex every day for the rest of their lives? :-)

Anyway, it takes too long for unmodified human children to develop proper minds in order to consent to anything like this. What do you do about pain incurred at the age of a few months? A year?

I'm also bothered that nobody has mentioned non-human animals. Why should cats and chimps and dolphins have to suffer pain and romantic disappointment? The Super Happy People should modify all the higher life forms and completely reshape the ecology.

"All right. Open a channel, transmitting my voice only." [...] Out of sight of the visual frame, Akon gestured [...] [emphasis added]

Erratum?

His voice, as opposed to other people's voices, I assume; i.e., the Confessor's warning was not transmitted.

Arguably unclear wording, though.

You know, they aren't the "Trade Federation", but I come out of this post with a distinctly East Asian impression of the Super Happy Fun People, which I think probably shouldn't happen for a truly alien race, since I would expect its variance from humanity to be orthogonal to ethnic and cultural differences. It may just be the names and superlatives, but I think that the shadows of Buddhism are having some of the effect. OTOH, that really might be a fairly strong universal attractor, in which case I'm being unfair.

Also, it seems to me that part of the intention of the story is to put us in the middle of a situation where motivations are symmetric in both directions, but that doesn't really happen. The SHFP values, and their existence generally, call out to humans as plausibly a more proper expression of our values than our own existence is, though we are told that physical ugliness tends to drive us away. The human values do not have the same effect on the Babyeaters; thus the humans don't face a threat to their values analogous to that faced by the Babyeaters.

Also, a very important question concerns the nature of Babyeater children. I'm not sure in what sense they can be a lot like human children but not value "good"; yet if they do value "good", where in their evolution does that value come from?

Michael Vassar: The situation is more symmetrical than that, I think.

The babyeaters, I imagine, don't like suffering either. That is, I doubt they would inflict suffering on their children outside of the winnowing, and would likely act to prevent suffering where possible. But, while suffering is certainly bad, it would be far worse to violate the much higher moral value of eating the young--that imperative is far greater than some suffering, no matter how great, isn't it?

Humans, of course, don't like suffering. They certainly wouldn't inflict it needlessly. But eliminating a little bit of suffering isn't necessarily worth what it would take--altering ourselves, perhaps, and losing our humanity in the process. And besides, who are they to decide for us? Humanity's moral right to self-determination is far more important than some minor suffering... right?

Fun bit: the last line of the initial message is not a "Pioneer engraving"; it is a translation of how they would say, "Can we talk?" The two questions would be the same to them.

I do not recall having seen anyone comment yet on having a "Master of Fandom" on the ship. It is a more fun, if somewhat less dignified, title than "literature professor." Is his position normally part of the support staff, somewhat like a librarian or holodeck operator who helps with staff education and entertainment while off-duty? Or do research vessels typically expect to stumble upon bodies of fiction when engaged in astrophysics?

I have arguments that may convince the Super Happy Fun People not to drug our babies happy. 1. Humans have evolved to make choices, and receive the deepest pleasure from making meaningful decisions. If everything is pleasurable, then that robs decisions of their complexity, and thus meaning. 2. Our brains can handle only so much stimulation; with too much awareness given over to happiness, there's no room for anything more. This is why our descendants are shown to be reliant on computers for data storage and translation, and still argue over choices like angry monkeys once they grok the situation.

I'm with Manon; I'd take the deal. I'm assuming that there would be some bailout money to help with the freedom-from-pain effort. I wouldn't think that this would make Akon (my only MST3K'ing will be to publicly wince at that name) a traitor to the human race.

If it were up to me, when Akon is saying "We found unacceptable the alternative of leaving the Babyeaters be. We found unacceptable the alternative of exterminating them" I'd also have included something like "we aren't positive that our translation mechanism is free of defects."
