Practical Advice Backed By Deep Theories

Once upon a time, Seth Roberts took a European vacation and found that he started losing weight while drinking unfamiliar-tasting caloric fruit juices.

Now suppose Roberts had not known, and never did know, anything about metabolic set points or flavor-calorie associations—all this high-falutin' scientific experimental research that had been done on rats and occasionally humans.

He would have posted to his blog, "Gosh, everyone!  You should try these amazing fruit juices that are making me lose weight!"  And that would have been the end of it.  Some people would have tried it, it would have worked temporarily for some of them (until the flavor-calorie association kicked in) and there never would have been a Shangri-La Diet per se.

The existing Shangri-La Diet is visibly incomplete—for some people, like me, it doesn't seem to work, and there is no apparent reason for this or any logic permitting it.  But the reason why as many people have benefited as they have—the reason why there was more than just one more blog post describing a trick that seemed to work for one person and didn't work for anyone else—is that Roberts knew the experimental science that let him interpret what he was seeing, in terms of deep factors that actually did exist.

One of the pieces of advice on OB/LW that was frequently cited as the most important thing learned, was the idea of "the bottom line"—that once a conclusion is written in your mind, it is already true or already false, already wise or already stupid, and no amount of later argument can change that except by changing the conclusion.  And this ties directly into another oft-cited most important thing, which is the idea of "engines of cognition", minds as mapping engines that require evidence as fuel.

If I had merely written one more blog post that said, "You know, you really should be more open to changing your mind—it's pretty important—and oh yes, you should pay attention to the evidence too," it would not have been as useful.  Not just because it was less persuasive, but because the actual operations would have been much less clear without the explicit theory backing them up.  What constitutes evidence, for example?  Is it anything that seems like a forceful argument?  Having an explicit probability theory and an explicit causal account of what makes reasoning effective, makes a large difference in the forcefulness and implementational details of the old advice to "Keep an open mind and pay attention to the evidence."

It is also important to realize that causal theories are much more likely to be true when they are picked up from a science textbook than when invented on the fly—it is very easy to invent cognitive structures that look like causal theories but are not even anticipation-controlling, let alone true.

This is the signature style I want to convey from all those posts that entangled cognitive science experiments and probability theory and epistemology with the practical advice—that practical advice actually becomes practically more powerful if you go out and read up on cognitive science experiments, or probability theory, or even materialist epistemology, and realize what you're seeing.  This is the brand that can distinguish LW from ten thousand other blogs purporting to offer advice.

I could tell you, "You know, how much you're satisfied with your food probably depends more on the quality of the food than on how much of it you eat."  And you would read it and forget about it, and the impulse to finish off a whole plate would still feel just as strong.  But if I tell you about scope insensitivity, and duration neglect and the Peak/End rule, you are suddenly aware in a very concrete way, looking at your plate, that you will form almost exactly the same retrospective memory whether your portion size is large or small; you now possess a deep theory about the rules governing your memory, and you know that this is what the rules say.  (You also know to save the dessert for last.)

I want to hear how I can overcome akrasia—how I can have more willpower, or get more done with less mental pain.  But there are ten thousand people purporting to give advice on this, and for the most part, it is on the level of that alternate Seth Roberts who just tells people about the amazing effects of drinking fruit juice.  Or actually, somewhat worse than that—it's people trying to describe internal mental levers that they pulled, for which there are no standard words, and which they do not actually know how to point to.  See also the illusion of transparency, inferential distance, and double illusion of transparency.  (Notice how "You overestimate how much you're explaining and your listeners overestimate how much they're hearing" becomes much more forceful as advice, after I back it up with a cognitive science experiment and some evolutionary psychology?)

I think that the advice I need is from someone who reads up on a whole lot of experimental psychology dealing with willpower, mental conflicts, ego depletion, preference reversals, hyperbolic discounting, the breakdown of the self, picoeconomics, etcetera, and who, in the process of overcoming their own akrasia, manages to understand what they did in truly general terms—thanks to experiments that give them a vocabulary of cognitive phenomena that actually exist, as opposed to phenomena they just made up.  And moreover, someone who can explain what they did to someone else, thanks again to the experimental and theoretical vocabulary that lets them point to replicable experiments that ground the ideas in very concrete results, or mathematically clear ideas.

Note the grade of increasing difficulty in citing:

  • Concrete experimental results (for which one need merely consult a paper, hopefully one that reported p < 0.01 because p < 0.05 may fail to replicate)
  • Causal accounts that are actually true (which may be most reliably obtained by looking for the theories that are used by a majority within a given science)
  • Math validly interpreted (on which I have trouble offering useful advice because so much of my own math talent is intuition that kicks in before I get a chance to deliberate)
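The replication point in the first bullet can be made concrete with a toy simulation (a sketch with made-up parameters, not data from any real study): if some fraction of tested hypotheses are null, then a result that cleared p < 0.01 is more likely to reflect a real effect than one that barely cleared p < 0.05, and so is more likely to survive an exact replication.

```python
import math
import random

def p_value(sample_mean, n, sigma=1.0):
    """Two-sided p-value for a one-sample z-test with known sigma."""
    z = abs(sample_mean) * math.sqrt(n) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def replication_rate(threshold, frac_real=0.3, effect=0.5, n=50,
                     trials=20000, seed=0):
    """Among simulated studies significant at `threshold`, return the
    fraction whose exact replication reaches p < 0.05.  A fraction
    `frac_real` of studied hypotheses have a genuine effect of size
    `effect`; the rest are null.  All numbers here are illustrative."""
    rng = random.Random(seed)
    se = 1.0 / math.sqrt(n)  # standard error of the sample mean
    hits = replicated = 0
    for _ in range(trials):
        true_effect = effect if rng.random() < frac_real else 0.0
        # Original study:
        if p_value(rng.gauss(true_effect, se), n) < threshold:
            hits += 1
            # Exact replication, fresh data, same true effect:
            if p_value(rng.gauss(true_effect, se), n) < 0.05:
                replicated += 1
    return replicated / hits

print(replication_rate(0.05))  # studies that cleared p < 0.05
print(replication_rate(0.01))  # studies that cleared the stricter bar
```

With these (hypothetical) parameters the stricter original threshold yields a noticeably higher replication rate, because conditioning on p < 0.01 filters out more of the null hypotheses that slipped through by chance.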

If you don't know who to trust, or you don't trust yourself, you should concentrate on experimental results to start with, move on to thinking in terms of causal theories that are widely used within a science, and dip your toes into math and epistemology with extreme caution.

But practical advice really, really does become a lot more powerful when it's backed up by concrete experimental results, causal accounts that are actually true, and math validly interpreted.

 

Part of the sequence The Craft and the Community

Next post: "Less Meta"

Previous post: "Well-Kept Gardens Die By Pacifism"

Comments


The thing is, it can take a long time until the deep theory that supports a given piece of practical advice is discovered and understood.  Moving forward through trial and error can give faster and equally effective results.

If you look at human history you will find several examples, like the making of steel, where practical procedures were discovered through massive experimentation centuries before the theoretical basis for understanding them existed.

This comment is, I think, an essential counterbalance to the post's valid points.  To expand a little: the book Good Calories, Bad Calories by Gary Taubes argues that bad nutritional recommendations were adopted by leading medical and then governmental associations, partly justified by the advice above (we need recommendations to help people now, and can't wait for full testing).  So someone could cite this as an example of why the comment above is dangerous in areas that are harder to test than the efficacy of steel production (where I presume they could tell that one procedure worked better than another, whereas some nutritional effects have long-term consequences that aren't clear, or it isn't clear which component of the recommendation is affecting what).

However, Taubes also shows that this reasoning was used to justify overlooking flaws in the evidence, and he points to a group heuristic bias (if that's the right term) of information cascades.  There are other biases and failures of rationality in the story as well, such as how certain statistical evidence was interpreted.

So, all this to say: while trial and error can give faster and equally effective results, the less clear the measurement of the results is, the more care is required in interpreting them.  When stated, it sounds obvious and I almost feel dumb for saying it, yet it's one of those rules honored more in the breach, as they say.  In the field of nutrition, you'll see headlines that say "Meat causes cancer" based on a study pointing to a small statistical correlation between two diets that differ in very many ways other than the type and amount of meat; the study itself concludes that more research is called for to examine possible links between meat and cancer, but not the other possible causes that its data point to just as strongly.

The harm didn't come from "leading medical and then governmental associations" adopting recommendations before they were proven, it came from them holding to those recommendations when the evidence had turned.

It seems to me that many people don't realize that math results have to be validly interpreted in order to be compelling.  LOTS of bad thinking by smart people involves sloppiness in the interpretation of the math.  Aumann was prone to this problem, and so are people thinking about his agreement theorem.

This may be pointing at a bias that I don't have a name for—the belief that the pathway between a possible cause-effect pair can be neglected.

It's believing that all you need is the right laws, without having to pay attention to how they're enforced. It's believing that if you are the right sort of person, your life will automatically work well. It's believing that more education will lead to a more prosperous society without having ways for people to apply what they know.