
November 10, 2007


I would start by mining the various books with the words "Anti-Patterns" in the title. While these books are in general not all that useful for programmers (good programmers already know most of what's in them, bad programmers probably never will), they provide handy checklists of ways that software projects fail. I would imagine that most of these can be boiled down to the fundamental biases underneath them, usually some variant of the endowment effect, confirmation bias, or the bandwagon effect.

Random thought: "agile programming" is basically a set of neuro-linguistic hacks for short-circuiting endowment effects.

Evidence-Based Scheduling (built into the FogBugz bug-tracking system) automatically tracks and corrects for systematic biases in programmers' time estimates.
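A rough sketch of the idea, as I understand it (the function and variable names here are my own invention, not FogBugz's): keep a history of each programmer's actual-to-estimated time ratios, then Monte Carlo the remaining estimates through that history to get a distribution of completion times instead of a single biased number.

```python
import random

def simulate_ship_dates(estimates, velocity_history, trials=1000):
    """Monte Carlo sketch of evidence-based scheduling.

    estimates: remaining task estimates, in hours.
    velocity_history: past (actual / estimated) ratios for this programmer.
    Returns a sorted list of simulated total completion times.
    """
    totals = []
    for _ in range(trials):
        # Scale each estimate by a randomly chosen historical ratio,
        # so the simulation inherits the estimator's measured bias.
        total = sum(e * random.choice(velocity_history) for e in estimates)
        totals.append(total)
    return sorted(totals)

# A programmer who consistently underestimates by about 2x:
history = [1.8, 2.0, 2.2, 1.9, 2.1]
dates = simulate_ship_dates([4, 8, 2], history)
p50 = dates[len(dates) // 2]  # median simulated total
```

The point is that the correction comes from measured history rather than from asking the programmer to "try harder" at estimating.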

One source of bias that I consciously try to avoid is performing quick preliminary investigations. I have found these are especially dangerous if there is going to be a significant gap between the preliminary and the final investigation. During this gap my mind tends to inflate the reliability of the preliminary conclusions, making it much harder to get to the truth in the final investigation.

I first came across this effect many years ago, when plotting a chart for a variable star I was planning to observe. I identified the general area of the star on an atlas and noticed a star marked with a "V" (indicating variable) near there. I was just about to confirm this identification by precise measurement when my mother came in and told me to clear the table because she wanted to lay out the cutlery for dinner. I quickly pencilled a ring round the candidate star and packed my atlas and papers away. After dinner, when I got them out again, my mind had solidified this candidate into 'the' star and I drew up my chart accordingly. It took me only a week to suspect that something was wrong, but it wasn't until several years later, when I analysed all my observations, that I finally realised I had observed the wrong star.

In my work as a software engineer, I first came across this effect when I was given the task of enabling the data and instruction caches in an embedded system. It was near the end of the day, so I just glanced through the manuals and noted that the caches were enabled by setting a particular bit pattern in a register. I scribbled a few notes in my log book, including what I thought was the hex version of the bit pattern that would do the job, intending to check this more thoroughly the following day. But the next day I was pulled off onto another job and didn't return to the cache job until a few weeks later. By then I had forgotten about the need to check the bit pattern and just used the value from my logbook. It wasn't until a few weeks later, when I attempted to confirm the forecast speed-up from enabling the caches, that I discovered I had got the bit pattern wrong (I had used something like 0x0008008 instead of 0x80008000).
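One defence against exactly this kind of transcription error is to never write magic hex values at all, but to build them from named bit positions and assert the composed value. The register layout below is hypothetical, invented purely to illustrate the technique; the bit positions are not from any real datasheet.

```python
# Hypothetical cache-control register layout, for illustration only.
ICACHE_ENABLE = 1 << 31  # instruction cache enable bit (assumed)
DCACHE_ENABLE = 1 << 15  # data cache enable bit (assumed)

CACHE_ENABLE = ICACHE_ENABLE | DCACHE_ENABLE

# A sanity check against the logbook value would have caught the
# mis-copied constant immediately, weeks before the benchmark did:
assert CACHE_ENABLE == 0x80008000
```

Composing the value from named bits also makes the intent reviewable: a colleague can check "bit 31, bit 15" against the manual far more easily than a bare 0x80008000.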

On another occasion, I was analysing a diagnostic data dump after the above embedded system had crashed. A quick glance suggested one possible cause but I didn't have time to do a proper analysis then. When I came back to the investigation, I just followed that possibility without doing the proper analysis. I wasted a whole day before I realised that I had made a simple error in my quick glance analysis and the cause was elsewhere. After that, I wrote a script to automate the initial analysis of the diagnostic data dumps. Automation can be a good way of avoiding bias.
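A minimal sketch of what such a triage script can look like; the dump format, field names, and checks here are all invented for illustration. The value is that every check runs the same way every time, so there is no first-glance favourite to anchor on, and every plausible cause is reported rather than just the first one spotted.

```python
import re

# Hypothetical crash-dump checks (patterns and names are assumptions,
# not from any real system). Each check is a (cause, regex) pair.
CHECKS = [
    ("stack overflow",   re.compile(r"SP out of range")),
    ("null dereference", re.compile(r"fault address: 0x0{8}")),
    ("watchdog reset",   re.compile(r"reset cause: watchdog")),
]

def triage(dump_text):
    """Return every plausible cause found in the dump, not just one."""
    return [name for name, pattern in CHECKS if pattern.search(dump_text)]

findings = triage("reset cause: watchdog\nfault address: 0x00000000")
```

Even a script this crude removes the temptation to stop at the first hypothesis, because the full list of matches is in front of you before analysis begins.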

Most of the useful heuristics are going to be useful for *all* professionals, but if your management won't fund it without the "specifically for engineers" angle, then so be it.

Software engineers tend to:

1. Have below-average social skills, compared with others at their professional pay level. For example, this argues that "illusion of transparency" should be included in the curriculum. Usual caveats: this is only on-average, don't take offense, yadda yadda. Note that I am an outlier who is exempt from this tendency, as are my friends, colleagues, and any other potentially angry people who happen to read this blog. (Phew, that was close.)

2. Frequently have to estimate minor-task completion times quickly, because the decision of whether to add a function point or fix a bug often depends on those times. Perhaps the Planning Fallacy and overconfidence could be included in the curriculum.

BTW, see Monash for a resource that appears to have actually tested different ways of teaching critical thinking.

Go read this book. That's my advice.

I've added an entry to my blog to respond to the topic of engineers and biases. You can read it here:


I would also like to echo Dave's earlier comment about "Anti-Patterns". These are not just for engineering, necessarily, but they very nicely express the kinds of dysfunctional thought patterns and behaviors to be avoided for those involved in technical projects.


On the topic of engineers, a study shows they are prone to becoming terrorists.

One that I often see at work is the unexamined assumption that the user of a product is similar to the engineer in ability, preferences, usage pattern, etc. Hence it is common to get command-line-only tools for artists and the like.

TGGP, I see no reason to rule out selection bias in that study. They looked at lists of terrorists, and the lists were likely to include only the more important terrorists found. Then they threw out the names they couldn't find out anything else about. Lots of room for biased sampling here.

It could be true, but this evidence is not particularly believable.

My Refactor Your Wetware book that you mention is now available in beta from our website. More to come soon!


