« "Arbitrary" | Main | Self-Indication Solves Time-Asymmetry »

August 12, 2008

Comments

"You can see by the glee with which the argument is typically advanced that they're making it precisely because they like the idea of violating the taboo"

Or perhaps they value rationality and abhor inconsistency, and having the smallest nuclear explosives be hands-off while the largest conventional explosives are accepted is irrational and inconsistent.

Consistency does not imply embracing the nukes. It may also imply rejecting the chemical bombs.

I think the intuition is that taboos often draw crude but clear decision lines. The crudeness means they forbid some things that are not really bad and allow some things that are not really good, but the clarity gained makes it easier for everyone to see when the line has been crossed.

I think there's something particularly creepy about the kind of people who use ad hominems. Rare truths are more valuable than common truths, and sometimes the reason truths are rare is because they're taboos, so taboo-busting has relatively high expected value.

That said I agree with what Robin Hanson said.

http://en.wikipedia.org/wiki/Bright-line_rule

Is it a taboo, or is it just that both sides had so many that no one dared use them?

Given the huge costs we have incurred from stupid decisions by even teams of really smart people, I don't think the role luck has played can be overestimated. In effect, was the waste of the Iraq War really more damaging than the use of several nuclear weapons would have been? How about a variety of other economic, environmental, safety, and other wastes?

Right now one can imagine that there is a near-perfect, or at least good-enough, safety system in place in all of the countries that have nukes to keep any from being accidentally or stupidly used. This is similar to how one could imagine in 2000 that there were secret anti-aircraft missiles protecting the Pentagon and other key U.S. governmental and military infrastructure from attack.

One could also speculatively ask: with such a perfect safety record on nukes, are we overspending on nuclear safety? Could our resources be used more efficiently to increase our overall safety, with money diverted from nuclear safety to other safety risks?

The other Nobel Prize winner, Aumann, seems to be more to the point than Schelling. His lecture explains what Ian above puts succinctly in one line. In my opinion, what he says reflects reality better than what Schelling does.

Following are a couple of paragraphs from the lecture:

Let me give an example. Economics teaches us that things are not always as they appear. For example, suppose you want to raise revenue from taxes. To do that, obviously you should raise the tax rates, right? No, wrong. You might want to lower the tax rates. To give people an incentive to work, or to reduce avoidance and evasion of taxes, or to heat up the economy, or whatever. That’s just one example; there are thousands like it. An economy is a game: the incentives of the players interact in complex ways, and lead to surprising, often counter-intuitive results. But as it turns out, the economy really works that way.

So now, let’s get back to war, and how homo economicus – rational man – fits into the picture. An example, in the spirit of the previous item, is this. You want to prevent war. To do that, obviously you should disarm, lower the level of armaments. Right? No, wrong. You might want to do the exact opposite. In the long years of the cold war between the US and the Soviet Union, what prevented “hot” war was that bombers carrying nuclear weapons were in the air 24 hours a day, 365 days a year. Disarming would have led to war.

"In the long years of the cold war between the US and the Soviet Union, what prevented “hot” war was that bombers carrying nuclear weapons were in the air 24 hours a day, 365 days a year. Disarming would have led to war."

Is there expert consensus on this?

"what prevented “hot” war was that bombers carrying nuclear weapons were in the air 24 hours a day, 365 days a year."

"The art of war teaches us to rely not on the likelihood of the enemy's not coming, but [...] on the fact that we have made our position unassailable."

-- Sun Tzu

Robin and steven, It is true that the ultimate benefit of the taboo derives from the unambiguity that it affords. But it is different from other kinds of bright-line rules. Everyone agreeing to drive on the right-hand side of the street is also a bright-line rule which has the virtue of avoiding ambiguity, but driving on the left is not a taboo; it's simply something that nobody does. The reason is that no one really cares which side they drive on, which means that everyone buys into the rule, and so it doesn't need to be defended against people who would like to undermine it. To maintain a bright-line restriction on the use of some kinds of weapons in a world full of people who would like to use those weapons, you need to marshal some force that will protect the bright line from assault, and in this case that force is moral repugnance. This is irrational in the narrow sense that it requires people to get genuinely upset about some things that are objectively no worse than other things they don't get nearly as upset about. Schelling is saying that that amount of irrationality is a good thing.

"Schelling is saying that that amount of irrationality is a good thing."

Only in a short-term, limited-scope view that cares only for immediate results and discounts longer-term consequences.

Saying that irrational arguments are good when they convince people of your position, and bad when they work against you, is extremely short-sighted. The reality is that irrational arguments are bad, period - and become even worse when they are used to convince people of something. Anything. Because that increases the future vulnerability to nonsense immensely.

@HA

"Is there expert consensus on this?"

Yes, there was until the end of the Cold War. MAD as the chosen form of nuclear deterrence is a form of Nash equilibrium, or, as Ian C. rightly quotes Sun Tzu, "unassailability," as I noted on EY's Hiroshima post.
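The claim that MAD amounts to a Nash equilibrium can be sketched with a toy two-player game. The strategy names and payoff numbers below are illustrative assumptions of mine, not figures from Aumann; the point is only the structure, in which mutual armament is the one profile from which neither side gains by unilaterally deviating:

```python
from itertools import product

# Hypothetical 2x2 deterrence game. Payoff numbers are illustrative only.
# payoffs[(row_move, col_move)] = (row's payoff, col's payoff)
payoffs = {
    ("Arm", "Arm"):       (-1, -1),   # costly standoff, but no war
    ("Arm", "Disarm"):    (1, -10),   # the armed side dominates
    ("Disarm", "Arm"):    (-10, 1),
    ("Disarm", "Disarm"): (0, 0),     # ideal outcome, but unstable
}
strategies = ["Arm", "Disarm"]

def is_nash(row, col):
    """A profile is a pure Nash equilibrium if neither player gains
    by unilaterally switching strategies."""
    r_pay, c_pay = payoffs[(row, col)]
    row_ok = all(payoffs[(r, col)][0] <= r_pay for r in strategies)
    col_ok = all(payoffs[(row, c)][1] <= c_pay for c in strategies)
    return row_ok and col_ok

equilibria = [p for p in product(strategies, strategies) if is_nash(*p)]
print(equilibria)  # [('Arm', 'Arm')]
```

With these (assumed) payoffs, mutual disarmament is better for both sides than mutual armament, yet it is not stable: either side profits by rearming alone, so the only equilibrium is the armed standoff. That is the Prisoner's-Dilemma-like logic behind the "disarming would have led to war" claim.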

What's interesting to me, however, is Robin's recent post on game theory, in which several other newer forms seemed to be found better than Nash equilibrium.

Since game theory was what got us safely through the Cold War, will newly urgent nuclear policies - considering recent events in Georgia, Iran, North Korea, and India! - adopt these other game theories?

If so, how will policy look different in the near future? Will these new game theories prove helpful in cases of asymmetrical nuclear threat (terrorist group with stolen Soviet suitcase nuke, for example)?

Perhaps Robin will stop by and give us all a viewquake.

"In the long years of the cold war between the US and the Soviet Union, what prevented “hot” war was that bombers carrying nuclear weapons were in the air 24 hours a day, 365 days a year. Disarming would have led to war."

Ideally this experiment should have had a control group.

But lots of people believe it on no evidence. What similar claims could we make? How about this one: in the 1960s we spent many millions of dollars on Welfare to prevent a humanitarian catastrophe. Lots of experts believed the humanitarian catastrophe was imminent. Therefore, it was true that without Welfare millions of Americans would have starved. I don't know anybody who believes this one, but there's far more evidence for it than for the inevitable third world war.

How can we explain this bias? Why do people so strongly maintain this utterly irrational belief?

Here is one possibility. Consider the money we spend on our military. Most of it is paid to Americans: US defense corporations, suppliers of all kinds, salaries and pensions, etc. Consider the money those people spend, money they wouldn't have if the military didn't pay them: secondary effects. And their pay goes to still more people, who spend it in turn, money they wouldn't have if the military hadn't spent it first. Probably a quarter of our economy supports the military, one way or another. And then, if all our soldiers were unemployed and looking for work, that would be pretty bad.

When you think about it that way, it's likely to seem that what's good for the US military is good for the USA. So we should believe anything about the US military that sounds good, and disbelieve anything that sounds bad, because our own prosperity depends on the military spending enough money.

I'm not sure how to test that. Maybe questionnaires? Ask people what they think about the military's effect on the economy, and ask them how they feel about the military. If the ones who irrationally support the military think that the military has a bad influence on the economy, that would tend to disprove the idea.

But what if the same ones who say that military spending is great for the economy also say that without a military that spends 45+% of world military spending we'll get invaded by our implacable enemies? How do we decide which irrational beliefs lead to the irrational support? People who think an irrational belief improves their own prosperity might take up multiple other irrational beliefs to support the first one. How do we decide which of them if any is the root cause?

"In the long years of the cold war between the US and the Soviet Union, what prevented “hot” war was that bombers carrying nuclear weapons were in the air 24 hours a day, 365 days a year. Disarming would have led to war."

To assume disarming wouldn't give one side a first-strike advantage is ludicrous; Russia (formerly the U.S.S.R.) hardly holds to its nuclear treaties and deals.

Nuclear, when you make wild claims that you have absolutely no evidence for, they do not become more believable when you repeat them again with absolutely no evidence.
