
November 27, 2008

Comments

"The key common assumption is that of a very powerful but autonomous area of technology. Overall progress in that area must depend only on advances in this area, advances that a small group of researchers can continue to produce at will. And great progress in this area alone must be sufficient to let a small group essentially take over the world. ..."
A Manhattan Project backed by Japan or the U.S. or China is a 'small group?' What if an improvement in em efficiency gives it a supermajority of the world's top-notch researchers until that advance is duplicated?

Slightly off topic, but I just reviewed a book for Amazon that makes the point of the interconnections, and argues that it greatly reduces the risk of war.

Producing Security: Multinational Corporations, Globalization, and the Changing Calculus of Conflict (Princeton Studies in International History and Politics) by Stephen G Brooks

My review
The main thesis of the book is that, since almost anything manufactured today that is even moderately complicated has its manufacture integrated across multiple locations around the world, one of the main causes of aggressive war, the seizure of valuable property, is averted. It would seem that Iraq's invasion of Kuwait for oil is a counter-argument, but natural resources, even oil, are becoming less and less valuable relative to other things. Anyone, like Iraq, that would consider natural resources particularly valuable would be too weak to actually get away with the aggression. The book is rather dry and academic in tone, but thoroughly argued. Highly recommended for anyone interested in military or international trade issues.

Bill, yes, good point.

Carl, the Manhattan Project was probably the largest isolated research project in history, relative to world product at the time. And even it was a pretty small fraction.

This isn't directly related to this post, but I'd like to throw a wildly different perspective on the fire. There's a whole other subculture that has (I think) good arguments that the future will be very different from anything we consider here.
Here.

Robin,

It was a small part of total GDP, but a large portion of the world's best scientists in relevant domains.

We generally specialize when it comes to bugs in computer programs - rather than monitoring their behavior and fixing them ourselves, we inform the central development authority for that program of the problem, and rely on them to fix it everywhere.

The benefit from automation depends on the amount of human labor already in the process, a la the bee-sting principle of poverty. Automating one operation while many others are still human-controlled is a marginal improvement, because you can't run at full speed or fire your human resources department until you've gotten rid of all the humans.
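The point above is essentially Amdahl's law applied to a production line. A toy model of my own construction (not from the thread) makes the arithmetic concrete: if only a fraction of the steps are automated, the remaining human-paced steps cap the overall speedup, no matter how fast the automated steps become.

```python
def overall_speedup(automated_fraction, automation_speedup):
    """Amdahl-style speedup when only part of the work is automated.

    automated_fraction: share of the work that is automated (0..1)
    automation_speedup: how much faster the automated steps run
    """
    human_fraction = 1.0 - automated_fraction
    return 1.0 / (human_fraction + automated_fraction / automation_speedup)

# Automating half the steps, even with a 1000x speedup on those steps,
# only roughly doubles throughput -- the human steps dominate.
print(overall_speedup(0.5, 1000))   # ~2.0
print(overall_speedup(0.99, 1000))  # ~91.0
```

The numbers are illustrative only, but they show why the marginal improvement from automating one operation stays small until nearly everything on the critical path is automated.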

The incentive for automation depends on the number of operations being performed. If you're doing something a trillion times over, it has to be automatic. We pay whatever energy cost is required to make transistor operations on chips fully reliable, because it would be impossible to have a chip if each transistor required human monitoring. DNA sequencing is increasingly automated as we try to do more and more of it.
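The scaling argument can be sketched as a break-even comparison (my own toy model, with purely hypothetical numbers): a fixed up-front cost to automate an operation versus a tiny per-operation cost of keeping a human in the loop. At a trillion operations, even a microscopic per-operation human cost dwarfs a large one-time automation investment.

```python
def cheaper_to_automate(n_operations, automation_cost, human_cost_per_op):
    """True when a one-time automation cost beats cumulative human cost."""
    return automation_cost < n_operations * human_cost_per_op

# Hypothetical: $1M to automate vs. $0.001 of human attention per operation.
print(cheaper_to_automate(10**12, 10**6, 0.001))  # True: 10^9 in human cost
print(cheaper_to_automate(10**3, 10**6, 0.001))   # False at small scale
```

The break-even volume is just `automation_cost / human_cost_per_op`; above it, automation is forced, which is the transistor and DNA-sequencing point.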

With nanotechnology it is more possible to automate because you are designing all the machine elements of the system on a finer grain, closer to the level of physical law where interactions are perfectly regular; and more importantly, closing the system: no humans wandering around on your manufacturing floor.

And the incentive to automate is tremendous because of the gigantic number of operations you want to perform, and the higher levels of organization you want to build on top - it is akin to the incentive to automate the internal workings of a computer chip.

Now with all that said, I find it extremely plausible that, as with DNA sequencing, we will only see an increasing degree of automation over time, rather than a sudden fully automated system appearing ab initio. The operators will be there, but they'll handle larger and larger systems, and finally, in at least some cases, they'll disappear. Not assembly line workers, sysadmins. Bugs will continue to be found but their handling will be centralized and one-off rather than local and continuous. The system will behave more like the inside of a computer chip than the inside of a factory.

- such would be my guess, not to materialize instantly but as a trend over time.

This line was particularly illuminating:

"Yes, small isolated entities are getting more capable, but so are small non-isolated entities, and the latter remain far more capable than the former."

Eliezer, yes the degree of automation will probably increase incrementally. As I explore somewhat here, there is also the related issue of the degree of local production, vs. importing inputs made elsewhere. A high degree of automation need not induce a high degree of local production. Perhaps each different group specializes in automating certain aspects of production, and they coordinate by sending physical inputs to each other.

Whether the sting of poverty principle applies depends on whether what is being automated lies in parallel with some human operation on the critical path - and there will be plenty of cases where that's not true.

Robin, numerous informational tasks can be performed far more quickly by special-purpose hardware, arguably analogous to more efficient special-purpose molecular manufacturers. The cost of shipping information is incredibly cheap. Yet the typical computer contains a CPU and a GPU and does not farm out hard computational tasks to distant specialized processors. Even when we do farm out some tasks, mostly for reason of centralizing information rather than computational difficulty, the tasks still go to large systems of conventional CPUs. Even supercomputers are mostly made of conventional CPUs.

This proves nothing, of course; but it is worth observing of the computational economy, in case you have some point that differentiates it from the nanotech economy. Are you sure you're not being prejudiced by the sheer traditionalness of moving physical inputs around through specialized processors?
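The ship-it-out-or-compute-locally tradeoff in the comment above can be put in a back-of-envelope model (my construction, with made-up numbers): a remote specialized processor only wins when the transfer time does not eat the compute savings.

```python
def remote_wins(work_ops, local_speed, remote_speed,
                data_bytes, bandwidth, latency_s):
    """True when shipping a task to a faster remote processor beats
    running it locally, counting transfer time and latency."""
    local_time = work_ops / local_speed
    remote_time = latency_s + data_bytes / bandwidth + work_ops / remote_speed
    return remote_time < local_time

# Hypothetical: 10^9 ops, local 10^9 ops/s, remote 100x faster,
# 100 MB/s link, 50 ms latency.
print(remote_wins(1e9, 1e9, 1e11, 1e8, 1e8, 0.05))  # False: 100 MB to ship
print(remote_wins(1e9, 1e9, 1e11, 1e6, 1e8, 0.05))  # True: only 1 MB to ship
```

On these numbers the remote processor loses whenever the task drags much data with it, which is one candidate reason CPUs keep hard tasks local despite cheap communication.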

Yes, small isolated entities are getting more capable, but so are small non-isolated entities, and the latter remain far more capable than the former.

I mentioned this recently on the thread two down from this one, but just in case it isn't sinking in, the main issue in this area is not with isolated entities - rather it is with folks like Google - who take from the rest of the world, but don't contribute everything they build back again - and so develop their own self-improving ecosystem that those outside the company have no access to. The only cost they pay involves not gaining in the short term by monetising their private tech (by sharing it) - and that cost can be swallowed gradually, a drop at a time.

Eliezer, both computing and manufacturing are old enough now to be "traditional"; I expect each mode of operation is reasonably well adapted to current circumstances. Yes future circumstances will change, but do we really know in which direction? Manufacturing systems may well also now ship material over distances "for reason of centralizing information".

