An example of real-world problems caused by cognitive biases:

Forensic science - Ignorance is bliss

According to Dr Rudin, the attitude that cognitive bias can somehow be willed away, by education, training or good intentions, is still pervasive.

Hat-tip to Bruce Schneier.


Consider the story of Harry Markowitz, a Nobel Prize-winning economist who largely invented the field of investment-portfolio theory. By relying on a set of complicated equations, Markowitz was able to calculate the optimal mix of financial assets. (Due to loss-aversion, most investors hold too many low-risk bonds, but Markowitz’s work helped minimize the effect of the bias by mathematizing the decision.) Markowitz, however, was incapable of using his own research, at least when setting up his personal retirement fund. “I should have computed the historical co-variances of the asset classes and drawn an efficient frontier,” Markowitz later confessed. “Instead, I visualized my grief if the stock market … went way down and I was completely in it. My intention was to minimize my future regret. So I split my contributions 50/50 between bonds and equities.”

[...]

One of the most refreshing things about “Thinking, Fast and Slow” is his deep sense of modesty: he is that rare guru who doesn’t promise to change your life. In fact, Kahneman admits that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.

http://www.newyorker.com/online/blogs/books/2011/10/is-self-knowledge-overrated.html

http://www.wired.com/wiredscience/2011/09/can-irrational-decisions-be-corrected-a-football-case-study/
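As an aside, the computation Markowitz says he skipped ("the historical co-variances of the asset classes and … an efficient frontier") is only a few lines of linear algebra. Here is a hedged sketch comparing the mean-variance optimal stock/bond mix against his actual 50/50 split; the return and covariance figures are invented for illustration, not real market data:

```python
import numpy as np

# Invented annual figures for two asset classes (stocks, bonds);
# purely illustrative, not real market data.
mu = np.array([0.08, 0.03])            # assumed expected excess returns
cov = np.array([[0.0400, 0.0020],
                [0.0020, 0.0100]])     # assumed covariance matrix

# Classic mean-variance result: the fully invested optimal mix is
# proportional to inv(Sigma) @ mu, normalized to sum to 1.
raw = np.linalg.solve(cov, mu)
weights = raw / raw.sum()

naive = np.array([0.5, 0.5])           # Markowitz's actual 50/50 split

def sharpe(w):
    """Expected return per unit of volatility for weight vector w."""
    return (w @ mu) / np.sqrt(w @ cov @ w)

print("optimal mix (stocks, bonds):", weights.round(3))
print("Sharpe ratio, optimal:", round(sharpe(weights), 3))
print("Sharpe ratio, 50/50:  ", round(sharpe(naive), 3))
```

With these made-up inputs the 50/50 split turns out to be only slightly worse than the optimum, which is part of why regret-minimizing heuristics can survive: the cost of the bias is real but not always dramatic.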

According to Dr Rudin, the attitude that cognitive bias can somehow be willed away, by education, training or good intentions, is still pervasive.

That particular attitude is still pervasive on Less Wrong! In fact, it is a somewhat simplistic version of the raison d'être of this community (not to mention the plans for the creation of a new "Rationality Institute").

Luckily, the "attitude that cognitive bias can somehow be willed away, by education, training or good intentions," is fairly well justified, and there are signs that there is lots of low-hanging fruit in this area.

According to Dr Rudin, the attitude that cognitive bias can somehow be willed away, by education, training or good intentions, is still pervasive.

I don't think "good intentions" and "willing things away" should be grouped with education and training in this statement. The former two strategies are not particularly effective (does anyone intend to be biased?), but the latter two, while certainly nontrivial to implement, can, as you point out, work well.

Is avoiding bias similar to a fat person avoiding cake? I don't doubt fat people generally want to lose weight, but they avoid all the deliberation that would prevent poor choices. They're aware their will fails when directly tested, and yet they don't take simple actions such as planning out healthy grocery lists in far mode. Anecdotally, I see the same pattern with people who know of biases yet time and again fail to avoid them (even Kahneman).

It's kind of amazing. It makes Odysseus tying himself to the mast of his ship to avoid the influence of the Sirens' song seem implausible. Still, a small percentage of fat people do somehow succeed.

I think it's pretty significantly unlike a fat person trying to avoid cake. At least when you fail to not eat cake, you notice. One might be able to hold to a restrictive diet by sheer willpower, albeit with great difficulty, but bias can't simply be willed away.

At least when you fail to not eat cake, you notice.

It's true that people don't always notice but at least some of the time biases do end up smashing against reality (like the two examples I gave here).

albeit with great difficulty, but bias can't simply be willed away.

I think Luke has given us reason to think they can be, although my point was that even if we know how to avoid biases, will we? Are we like fat people avoiding cake?

There's a big difference between being able to train biases away and being able to will them away.

When you try to cut something out of your diet, you have a conflict between your urge to eat it and your desire to avoid it, which comes down to a battle of willpower unless you engineer your life to make the resolution easier. A fat person confronted with cake they have resolved not to eat knows that they will regret eating it, and with sufficient willpower, can resist.

When dealing with cognitive bias, on the other hand, the hard part is usually noticing that you're about to make a biased judgment at all. You're not going to commit the Conjunction Fallacy, for instance, thinking "I know that the conjunction of all these specific things makes this prediction unlikely, but thinking that it's probable is so tempting." Noticing in retrospect, as in the examples you gave, is a very different thing from noticing prior to or during the decision and doing the irrational thing anyway.
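The conjunction rule behind that fallacy is easy to demonstrate numerically: the joint event can only occur when each component occurs, so it can never be more frequent than either component alone. A toy simulation, with made-up event probabilities:

```python
import random

random.seed(0)

# Toy illustration of the conjunction rule: a conjunction of claims
# is never more probable than any single one of them.
# The event probabilities below are made up for illustration.
trials = 100_000
count_a = 0
count_a_and_b = 0
for _ in range(trials):
    a = random.random() < 0.30   # assumed P(A)
    b = random.random() < 0.05   # assumed P(B), independent of A here
    count_a += a
    count_a_and_b += a and b

# The joint event only occurs when A occurs, so its frequency is
# bounded above by A's frequency, regardless of the probabilities.
print(count_a / trials, count_a_and_b / trials)
```

The fallacy is an intuition failure, not an arithmetic one, which is the commenter's point: nobody computes this and then knowingly defies it.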

Of course I don't think training away something is the same as willing away something. I'm making a distinction between training for something and succeeding at that something.

A fat person can train in a way similar to how we would train to avoid biases. They might strategize for the future, knowing they would fail otherwise, just as we might fail to recognize a bias we know we have. For example, they might plan a healthy grocery list for the entire month because they know that without that strategy they'd buy a lot of junk food. However, like Markowitz, they still end up failing despite their training; their will fails them and they deviate from their strategy. So I wonder if biases are similar in this way. One way to find out, I think, is to drain people's willpower (it's a finite resource) and see if they're more susceptible to biases.

I disagree that it is well justified, but I think it is probable enough to warrant us giving it a shot.

I wrote something about the possibility of blind expertise (potentially used to overcome such biases) a while back.

You can find a preprint version here:

https://sites.google.com/site/johndanaher84/papers-and-presentations

I'd like to point out that the article still indicates that systemic solutions are possible: avoid situations where the biases apply.

Also, did their education, training, or good intentions really directly address cognitive biases? My formal training and education certainly didn't, and of course good intentions won't. How much of a difference would that make? I don't know, obviously. This is only evidence that whatever training or education about cognitive biases they received was ineffective.