
June 28, 2008


Surely it's plausible at this point that I'm trying to present certain concepts in an ordered sequence? Imagine saying to me, one year ago, "Hey, it's time to discuss free will."

Ohh! We don't want to wait that long! *fidgets like impatient child*

Robin: "I can also find a tech that looks pretty likely to appear within the predicted time-frame, and an economic analysis suggests it could plausibly deliver the forecasted speedup. And this tech is a kind of AI! "

- and by this you presumably mean uploads?

Eliezer, I doubt you and I would have had much problem discussing free will a year ago. But perhaps many of our readers would have found it harder to follow our discussion then.

Roko, yes.

If the singularity is arriving so soon, how do you have time to spend whole years laying the groundwork for discussing these complicated concepts? I'm not complaining, but...

I don't think there's going to be much time for future growth spurts to happen "later". Our current exponential growth seems unlikely to continue for very long, since it seems likely to propel us into the realm of physical limits before too many more doublings.

Tim, if previous trends continued to the next two singularities, the second one would happen within about two years after the first one.

So maybe the first singularity could be caused by uploads, the second by AGI. This would be consistent with Eliezer's claim that AGI wouldn't have such a limited doubling time.

Right. So: that's two years of your doubling every two weeks? I make that 52 doublings, almost as many doublings as there are squares on a chessboard. Growth by a factor of 4,503,599,627,370,496. Do you really think that the laws of physics will stomach that?
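A quick sanity check of those figures (a sketch; the two-weeks-per-doubling rate and two-year window are taken from the thread, not from any analysis of mine):

```python
# Two years of doubling every two weeks, as claimed above.
weeks_per_doubling = 2
total_weeks = 2 * 52                      # two years
doublings = total_weeks // weeks_per_doubling
growth_factor = 2 ** doublings

print(doublings)       # 52
print(growth_factor)   # 4503599627370496 (about 4.5 quadrillion)
```

The chessboard comparison holds up: 52 doublings falls just short of the 64 squares in the classic rice-on-a-chessboard story.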

Since you are extrapolating from such a small number of data points, figures like those could be way off base - but it seems as though your model would suggest that the next such development might well be the last one of its kind.

Tim, "within" means "less than" here.

I can certainly see both sides of this issue (regarding the timing of discussion).

Eliezer, perhaps it makes more sense to keep Robin informed ahead of time about what you want to discuss, at least as a rough timeline, so that he can discuss similar topics and you will both be ready at the same time.


4 quadrillion is not a difficult amount of expansion for our Solar System to absorb, but I think it does require nanotech. Not to mention a choice.

I can't criticise your analysis if you don't present it. Anyhow, physical limits can be made out on the horizon - and it is far from obvious that there's ever going to be anything in the future with remotely the same impact as the ongoing technological revolution looks like it is going to have - as it overtakes the "natural-technology" of existing evolved lifeforms.

Maybe if we get a signal from another galaxy - with a dramatic influx of knowledge - but you can't easily predict things like that.

As I understand it, the idea of economic doubling is not really to do with resources. Resources are not yet much of a factor - since we have plenty of them. Growth appears to have more to do with our ability to manipulate signals. Transistor densities double annually, while our gold supplies grow only modestly. So the idea is that - if exponential growth is to continue - it should be supported on a small scale - by the "room at the bottom" - otherwise we don't have a good reason to expect such growth to happen in the first place.
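The contrast drawn above can be made concrete with two hypothetical growth rates (the annual doubling for transistor density is the commenter's figure; the 2% rate for a physical resource like gold is an assumed number, chosen only for illustration):

```python
# Comparing exponential growth in signal-manipulation capacity with
# modest growth in a physical resource, over 20 years.
years = 20
transistor_like = 2 ** years        # doubling every year (commenter's claim)
gold_like = 1.02 ** years           # 2% annual growth (assumed figure)

print(transistor_like)              # 1048576
print(round(gold_like, 2))          # 1.49
```

Over twenty years the doubling process multiplies capacity a million-fold while the 2% process grows by roughly half, which is the gap the "room at the bottom" argument is pointing at.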

For myself, I like Eliezer's drawn-out foundation building. If you don't cross inferential distances in baby steps, it seems you wind up with a lot more noise in the comments (in the form of people unwittingly criticizing strawmen, arguing semantics, and so on).

Zings abound! Best OCB post in years.

I found Robin's analysis of timing and speedup interesting, if pretty speculative - due to extrapolation from a tiny number of data points. But I thought the analysis of transition inequalities sucked. It wasn't even clear if Robin was counting AIs as people or not.

Re: So maybe the first singularity could be caused by uploads, the second by AGI.

Ah, the hypothetical uploads-before-AI scenario again. Has anyone ever articulated any coherent reasons for taking this idea more seriously than angels dancing on pinheads?

I could not help but paraphrase a little more.
Let's suppose Robin and Eliezer are talking about explosions sometime in the past.

Robin: I have analyzed most of the known ways to make an explosion. If you heat up a boiler, it explodes. If you put gunpowder into a closed metallic container and heat it up, it explodes even more powerfully. Water is a single substance; gunpowder is a mixture of three substances. I have a theory that if you make a mixture of FIVE substances and use it instead of gunpowder, you might get an even more powerful explosion. I have even proposed a recipe that might work. It involves a new substance, produced using nitric acid and some organic materials…

Eliezer: I don't have time to talk, but I have a better idea. If you bring together two big pieces of radioactive material, you will get a really big explosion. If you get the details right, that is.

Robin: Historical evidence does not support that. In all the known cases where two pieces of anything were brought together, no explosion ever occurred. But since you do not specify how hard it is to figure out the details of your explosion, why can't we think mine is more feasible?

So we go to four quadrillion times in two years, and then things really start to pick up? Unless someone has a picture handy of what a quadrillion looks like, I think we are past the point where your meat brain substitutes "gazillion" when trying to think about what the numbers really mean, like 3^^^3. Whether or not we are uploaded, most value will be in the form of information by that point, or at least the information that is incorporated into the matter we are enjoying.

But the real point of uploading has already been mentioned: to speed up Overcoming Bias posts. When we can directly connect brains and transfer information the way our computers do, we can get through the introductory explanations much more quickly. Perhaps non-biological post-humans will put great value on this discourse, and that will be the source of the octillion-dollar economy.

(I suspect this is an audience that would get it if I made an obscure reference here to Oracle's digital telepathy line in Rock of Ages. Very obscure.)

Zubon: at http://www.kokogiak.com/megapenny/eighteen.asp there is an image of one quintillion pennies, with a comparison with one quadrillion pennies, each drawn to scale with certain other familiar objects.

Everyone, it is Tim, not I who predicted 52 doublings (or quadrillions)! I would only suggest that the next mode might plausibly see a similar number of doublings to the last few modes, i.e., six to sixteen.

Unknown, cool pics!

What I actually said was that 52 more doublings after the next substantial increase in the growth rate seemed unlikely to happen - due to physical limitations.

Even 16 doublings at Robin's 2.3 weeks per doubling is only 36.8 weeks - less than your average pregnancy.
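The arithmetic behind the pregnancy comparison, as a sketch (the six-to-sixteen doubling range and the 2.3 weeks per doubling are both Robin's figures from the thread):

```python
# Duration and growth factor for the low and high ends of Robin's
# suggested range of doublings for the next growth mode.
weeks_per_doubling = 2.3
for doublings in (6, 16):
    total_weeks = round(doublings * weeks_per_doubling, 1)
    growth = 2 ** doublings
    print(doublings, total_weeks, growth)
# The high end: 16 doublings finish in 36.8 weeks,
# for a growth factor of 65,536.
```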

Thinking about what chain of events could plausibly produce a two-stage increase in the acceleration rate, my guess would be:

1. Synthetic minds
2. Synthetic bodies

We'll probably get advanced AI first, which will then help to produce advanced nanotechnology. In both cases, any shift is likely to occur when the engineered technology becomes broadly competitive with the existing natural one it is supplanting.

The new replicators will first build themselves minds, and then use those minds to build themselves bodies.

