
August 08, 2008

Comments

I learned that a sequel, called Ark, comes out next year.

Christian SF? As if Narnia wasn't bad enough.

Tried Anna Kavan's Ice? Possibly the best SF book ever - and it involves a man-made disaster causing the end of the world. It may be up your street.

Is this post helping our hero with mere exposure bias? Tune in next time: same biased time, same biased channel.

Baxter's Titan was along the same lines as well. A mission is sent to see if life can survive on Titan, Saturn's moon. Meanwhile, the USA is rent asunder by fundamentalists and eventually falls into civil war, if I remember correctly. The mission does not go terribly well either.
Pardon the not-so-great summary; it's been years since I read it. It certainly did not paint a rosy view of our future.

The sequel idea reminds me of one of my favorite B movies, When Worlds Collide. I thought the tone was about as realistic as I've seen for a Hollywood movie speculating on how humanity would react to a near-term catastrophic existential threat. It's a very anti-Spielbergian film.

The issue is that I think, and the authors probably do as well, that going back to a primitive form of civilization is actually MORE REPELLENT than allowing humanity to die out. I'm not saying this as a moral judgment; it's more visceral than that, and it's a reason I hate post-apocalyptic fiction.

Why would those people want to save humanity? Some tech could make their remaining lifetimes less occupied with primitive survival needs and therefore more worth living. But to save humanity... Why would anyone want that at all?
Well, at least for me, only existing sentient individuals and their qualia are really important and worth caring about. Not genes, humanity, life as a whole, etc.

XOR, you could reduce your scope even further and focus on saving yourself, the people you actually know (or know about and care about), and me (perhaps a Dunbar's number of 150 people).

Robin, how would one persuade the powerful to act in the interest of the rest of humanity? It seems to me they don't see it as a worthwhile endeavor. What practical steps could one take to induce them to act differently?

HA, such a reduction of scope would be unfair if we agree that all living sentient beings have equal rights (at least initially). It's not about selfishness; it's about caring for real people and their happiness rather than for abstract entities like humanity.

"HA, such a reduction of scope would be unfair if we agree that all living sentient beings have equal rights". We don't agree about that, XOR. If I don't persist, the concept of the rights or non-rights of others is absurd, from my perspective. Although I do encourage other people to put a concept of "HA's right to persistence" over maximizing their own persistence odds. For example, I encourage people to donate their brains to brain banks, whereas I plan to be cryopreserved. Hopefully research we do on their brains will maximize my reanimation odds at a future date, or even better, may result in breakthroughs making it unecessary for me to be cryopreserved. How do you handle the brain bank vs. cryopreserved question for other living sentient beings? What should they put in their will?

XOR: "But to save humanity... Why would anyone want that at all?"

Well, if you save the species, then there might be a chance of building up to civilization (and posthumanity?) in the long, long, long, long run. We can care about possible future people without putting intrinsic value on humanity qua humanity.

HA,
I think everyone should choose cryopreservation. In fact, I'm no great altruist. The equal-rights concept just feels good at some "higher values" level and has a kind of aesthetic appeal to me. Advocating cryopreservation doesn't even begin to feel like a sacrifice. Really hard moral dilemmas are another story, of course. But I certainly do not intend to persist at all costs; some costs would make life just not worth living.

Z. M. Davis,
We cannot say these possible future people want to exist. At the very least, they are surely not going to suffer from my failure to bring them into existence. Since a lot of already existing people do suffer, they seem literally infinitely more important to think about.

"Since a lot of already existing people do suffer, they seem literally infinitely more important to think about."

I would think twice before using phrases like "literally infinitely." Choice criteria don't always imply what we think they do. Forget the flood scenario, and simply suppose that having children in today's world is more of a burden than a joy to parents. Would that really make it right to let the population dwindle to zero?

Of course I agree that presently nonexistent people don't have rights and can't be harmed, but I value more things than simply the minimization of suffering in the here-and-now. I am willing to suffer a little today, that I might live and flourish tomorrow. I suggest we take a similar attitude towards potential future people as we do towards our potential future selves.
