September 30, 2008

Comments

"Alas, for those who turn their eyes from dragons and dream of zebras! If we cannot take joy in the merely fantasy, our lives shall be empty indeed." - Eliezer S. Yudkowsky, in a parallel universe.

I never expected a post from Overcoming Bias as informal as a picture with commentary from flickr.com. (But I suppose that's a fact about my own poor calibration.)

Ohhhh... oh so many things I could substitute for the word 'Zebra'....

Well, a picture of a zebra is real.

And you'll probably agree that the merely real is, in some ways, in need of improvement, which is the whole point of transhumanism.

I didn't know Eliezer had a girlfriend, how can you justify spending resources on that sort of thing?

Not an attack though, you probably have a good reason, I just can't figure out what it is.

Was this written in jest? It's hilarious.

No time for love, we've got a world to save!


...or so the theory runs.

Now that I think about it I seem to recall seeing a clever excuse for indulging in the pleasures of the flesh that Eliezer had written. Can't remember where off the top of my head, though...

[missing the point]
I like cats better.
[/missing the point]

*gasp* Hasn't Eli been working only on his mind-children?
Can we expect another permutation of the superior genes that brought us so much awesomeness in the form of Eli?

Aspiring Vulcan: I didn't think it was written in jest; it seems like a legitimate question to me. It definitely seems plausible that having a girlfriend would have some benefits that help Eli save the world, but how to justify spending time and resources on a girlfriend that could be spent on other things is a good question nonetheless.

Alas for those who turn their eyes from ladies and google themselves.

Can one make scientific breakthroughs without dedicating all of one's waking hours to it?
Newton: Science: Best Sans Booty.

In medicine, a "zebra" is a strange, unlikely condition or diagnosis, usually to be avoided or considered on a lower tier. The maxim goes: when you hear hoofbeats, think of horses, not zebras. Spending too much time chasing zebras detracts from making the diagnosis of "horse". Coincidence? Or just another example of the medical field's poor thought process?

Science: Best Sans Booty.

Schrödinger disagreed. (So did Einstein... and Feynman... I could mention Kinsey, but that would be cheating, I suppose.)

It's not just about spending resources. In my experience, having a girlfriend makes you dangerously comfortable with being a mere human, whereas bitter loneliness makes you see the necessity of achieving incorporeal modes of existence much more clearly.

(this comment will be remembered as a significant milestone in singularitarian demographics)

Against Cyan, I refer to James Watson and some nifty graphs.

I suggest Eliezer (or any aspiring world-saving scientist) deal with the signal from downstairs the way he deals with hunger: get a snack and forget about it. No whipping yourself into an eating frenzy by watching cooking shows...

...and no, you don't need your own personal chef. Sure, they cook a tastier meal, but they also make you eat more often than you need to; having a chef just combines the maximum of temptation with the maximum of opportunity...

I need physics more than friends. I place more importance on my studies than myself. I often go long periods without social contact outside of my professional colleagues, and at times even long periods without food or rest.

Her: "Pass me the laptop, I want to see if there were any comments on that post."

Me: "No, you don't. They're pretty awful."

(Girlfriend looks.)

Her: "That's awful."

Me: "Yep."

My fears:

1. Paperclip AI
2. People I know IRL catching me reading something embarrassing on the Internet
3. Nuclear war
4. The zombie under my bed

Awful, perhaps, but are these arguments true?

Are you willing to take the risk that you'll have to admit on your deathbed:
If only I hadn't succumbed to the dictates of my genes...
If only I hadn't lived like the majority; if only I hadn't done what had been done so many times before by so many billions of organisms...
If only I had been different enough. But I wasn't. I let my genes pilfer my time for simple, two-dimensional pleasure signals. I placed greater value on companionship than on progress. I exchanged eternity for the now.
I could have saved the world... but now I lie here dying, instead of living forever.

After the Singularity, there's plenty of time for girlfriends. And boyfriends. And robofriends. Trillions of them. You can super-saturate your re-designed gender-specific feedback modules again and again.

But not now. Lead the way. Go wirehead later. Delay gratification. Be a transhuman. Ditch the...


If you know you can save the world, or do anything significant with eternal repercussions that no other human is likely to do in the mid-to-long term - if you are truly a once-in-a-civilization event - how irrational is it to do anything else?

If you can't reprogram your mind from "Having a companion is a good idea." to "Having a companion is neglecting my duties.", having no companion may feel like sacrifice. But if you can, or you start with the latter conviction, having a companion feels like the height of folly. Torment. Frustration. Time away from your true love - The Singularity.

With great power comes great responsibility - non-transferable, non-delegatable responsibility.

They're still pretty awful, IMHO.

As someone who is pooling all my resources (mind and money) into expediting the development of technologies leading to the Singularity, having a companion never entered my consciousness. (And I've had my share of applicants - heck, I've been asked to marry on the spot several times by super hotties - so I'm making a real sacrifice here... if I thought like a baseline human, which I don't.)

Strong AI doesn't have to be the only thing that's really frikkin' hard.

Not doing what your genes tell you to do - not having a companion, not even wanting one - is an artificial goal, the result of reprogramming: of overcoming your genes through self-improvement.

Through his cerebrations Eliezer appears to have attracted the Cream of the Singularitarian Crop here, who are now Collectively Disappointed.

l, Aww, Ben_Wraith:

While I appreciate the effort toward optimal decision-making, surely there is some way to contribute without invading Eliezer's personal life?

Can you picture anyone doing peak creative work while trying to justify every ounce of their resource use? To others or to themselves? Eliezer presumably knows he doesn't need to do that, but... threads like this can't help his or others' morale. And morale is a precious resource.

Group efforts in general, and philanthropic efforts in particular, devolve too easily into shows of self-sacrifice. After all, sacrifice takes less effort in many ways, and it looks like trying. If we want to create a positive singularity, we'll need to make our project fun, we'll need to make the actual useful work attractive, we'll need to get people to aim for achievement (not for an appearance of "using all their resources"), and we'll need to make it something that real people want to join and don't burn out on.

It isn't only Eliezer who can help, by the way. If nothing else, you can help the effort get money; some of those willing and able to do FAI research are spending their time raising money, right now, for lack of other ways to get it. If you find a way to gather money for the effort, more research will be done and the chances of a positive singularity will improve. There are other possibilities for helping, too. If you're concerned about the future, perhaps take a look at what you have, who you know, and what you might do? Creating a positive singularity can be a lot of fun.

And with that, I'll close the thread. I may be mistaken, but I don't think this is what most of our readership comes here to read.

Note that the authors "SL5", "Reprogrammed goal system", "Baseline singularitarian", "Singularity sooner, fun later", "Awww, a girlfriend > Singularity", "Oppenheimer", and "Eli's other project" all seem to be the same person based on IP.

The comments to this entry are closed.
