Followup to: Make an Extraordinary Effort, The Meditation on Curiosity, Avoiding Your Belief's Real Weak Points

"It ain't a true crisis of faith unless things could just as easily go either way."
       —Thor Shenkel

Many in this world retain beliefs whose flaws a ten-year-old could point out, if that ten-year-old were hearing the beliefs for the first time.  These are not subtle errors we are talking about.  They would be child's play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion. As Premise Checker put it, "Had the idea of god not come along until the scientific age, only an exceptionally weird person would invent such an idea and pretend that it explained anything."

And yet skillful scientific specialists, even the major innovators of a field, even in this very day and age, do not apply that skepticism successfully.  Nobel laureate Robert Aumann, of Aumann's Agreement Theorem, is an Orthodox Jew:  I feel reasonably confident in venturing that Aumann must, at one point or another, have questioned his faith.  And yet he did not doubt successfully.  We change our minds less often than we think.

This should scare you down to the marrow of your bones.  It means you can be a world-class scientist and conversant with Bayesian mathematics and still fail to reject a belief whose absurdity a fresh-eyed ten-year-old could see.  It shows the invincible defensive position which a belief can create for itself, if it has long festered in your mind.

What does it take to defeat an error which has built itself a fortress?

But by the time you know it is an error, it is already defeated.  The dilemma is not "How can I reject long-held false belief X?" but "How do I know if long-held belief X is false?"  Self-honesty is at its most fragile when we're not sure which path is the righteous one.  And so the question becomes:

How can we create in ourselves a true crisis of faith, that could just as easily go either way?

Religion is the trial case we can all imagine.  (Readers born to atheist parents have missed out on a fundamental life trial, and must make do with the poor substitute of thinking of their religious friends.)  But if you have cut off all sympathy and now think of theists as evil mutants, then you won't be able to imagine the real internal trials they face.  You won't be able to ask the question:

"What general strategy would a religious person have to follow in order to escape their religion?"

I'm sure that some, looking at this challenge, are already rattling off a list of standard atheist talking points—"They would have to admit that there wasn't any Bayesian evidence for God's existence", "They would have to see the moral evasions they were carrying out to excuse God's behavior in the Bible", "They need to learn how to use Occam's Razor—"

WRONG!  WRONG WRONG WRONG!  This kind of rehearsal, where you just cough up points you already thought of long before, is exactly the style of thinking that keeps people within their current religions.  If you stay with your cached thoughts, if your brain fills in the obvious answer so fast that you can't see originally, you surely will not be able to conduct a crisis of faith.

Even when it's explicitly pointed out, some people seemingly cannot follow the leap from the object-level "Use Occam's Razor!  You have to see that your God is an unnecessary belief!" to the meta-level "Try to stop your mind from completing the pattern the usual way!"  Because in the same way that all your rationalist friends talk about Occam's Razor like it's a good thing, and in the same way that Occam's Razor leaps right up into your mind, so too, the obvious friend-approved religious response is "God's ways are mysterious and it is presumptuous to suppose that we can understand them."  So for you to think that the general strategy to follow is "Use Occam's Razor", would be like a theist saying that the general strategy is to have faith.  (I've noticed that a large fraction of the population—even technical folk—have trouble following arguments that go this meta.  On my more pessimistic days I wonder if the camel has two humps.)

"But—but Occam's Razor really is better than faith!  That's not like preferring a different flavor of ice cream!  Anyone can see, looking at history, that Occamian reasoning has been far more productive than faith—"

Which is all true.  But beside the point.  The point is that you, saying this, are rattling off a standard justification that's already in your mind.  The challenge of a crisis of faith is to handle the case where, possibly, our standard conclusions are wrong and our standard justifications are wrong.  So if the standard justification for X is "Occam's Razor!", and you want to hold a crisis of faith around X, you should be questioning if Occam's Razor really endorses X, if your understanding of Occam's Razor is correct, and—if you want to have sufficiently deep doubts—whether simplicity is the sort of criterion that has worked well historically in this case, or could reasonably be expected to work, etcetera.  If you would advise a religionist to question their belief that "faith" is a good justification for X, then you should advise yourself to put forth an equally strong effort to question your belief that "Occam's Razor" is a good justification for X.

(Think of all the people out there who don't understand the Minimum Description Length or Solomonoff Induction formulations of Occam's Razor, who think that Occam's Razor outlaws Many-Worlds or the Simulation Hypothesis.  They would need to question their formulations of Occam's Razor and their notions of why simplicity is a good thing.  Whatever X in contention you just justified by saying "Occam's Razor!", I bet it's not the same level of Occamian slam dunk as gravity.)

If "Occam's Razor!" is your usual reply, your standard reply, the reply that all your friends give—then you'd better block your brain from instantly completing that pattern, if you're trying to instigate a true crisis of faith.

Better to think of such rules as, "Imagine what a skeptic would say—and then imagine what they would say to your response—and then imagine what else they might say, that would be harder to answer."

Or, "Try to think the thought that hurts the most."

And above all, the rule:

"Put forth the same level of desperate effort that it would take for a theist to reject their religion."

Because, if you aren't trying that hard, then—for all you know—your head could be stuffed full of nonsense as ridiculous as religion.

Without a convulsive, wrenching effort to be rational, the kind of effort it would take to throw off a religion—then how dare you believe anything, when Robert Aumann believes in God?

Someone (I forget who) once observed that people had only until a certain age to reject their religious faith.  Afterward they would have answers to all the objections, and it would be too late.  That is the kind of existence you must surpass.  This is a test of your strength as a rationalist, and it is very severe; but if you cannot pass it, you will be weaker than a ten-year-old.

But again, by the time you know a belief is an error, it is already defeated.  So we're not talking about a desperate, convulsive effort to undo the effects of a religious upbringing, after you've come to the conclusion that your religion is wrong.  We're talking about a desperate effort to figure out if you should be throwing off the chains, or keeping them.  Self-honesty is at its most fragile when we don't know which path we're supposed to take—that's when rationalizations are not obviously sins.

Not every doubt calls for staging an all-out Crisis of Faith.  But you should consider it when:

  • A belief has long remained in your mind;
  • It is surrounded by a cloud of known arguments and refutations;
  • You have sunk costs in it (time, money, public declarations);
  • The belief has emotional consequences (note this does not make it wrong);
  • It has gotten mixed up in your personality generally.

None of these warning signs are immediate disproofs.  These attributes place a belief at risk for all sorts of dangers, and make it very hard to reject when it is wrong.  But they hold for Richard Dawkins's belief in evolutionary biology as well as for the Pope's Catholicism.  This does not say that we are only talking about different flavors of ice cream.  Only the unenlightened think that all deeply-held beliefs are on the same level regardless of the evidence supporting them, just because they are deeply held.  The point is not to have shallow beliefs, but to have a map which reflects the territory.

I emphasize this, of course, so that you can admit to yourself, "My belief has these warning signs," without having to say to yourself, "My belief is false."

But what these warning signs do mark, is a belief that will take more than an ordinary effort to doubt effectively.  So that if it were in fact false, you would in fact reject it.  And where you cannot doubt effectively, you are blind, because your brain will hold the belief unconditionally.  When a retina sends the same signal regardless of the photons entering it, we call that eye blind.

When should you stage a Crisis of Faith?

Again, think of the advice you would give to a theist:  If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith.  If the belief is as solidly supported as gravity, you needn't bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity.  So try to imagine what the skeptics out there would say to your "solid as gravity" argument.  Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, "Here is something I need to put effort into doubting properly."

If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith.  Don't try to do it haphazardly, don't try it in an ad-hoc spare moment.  Don't rush to get it done with quickly, so that you can say "I have doubted as I was obliged to do."  That wouldn't work for a theist and it won't work for you either.  Rest up the previous day, so you're in good mental condition.  Allocate some uninterrupted hours.  Find somewhere quiet to sit down.  Clear your mind of all standard arguments, try to see from scratch.  And make a desperate effort to put forth a true doubt that would destroy a false, and only a false, deeply held belief.

Elements of the Crisis of Faith technique have been scattered over many posts:

  • Avoiding Your Belief's Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers.  You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
  • The Meditation on Curiosity—Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write", and there is likewise a distinction between wanting to have investigated and wanting to investigate.  It is not enough to say "It is my duty to criticize my own beliefs"; you must be curious, and only uncertainty can create curiosity.  Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally:  For every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another—thus you can be truly curious each time about how it will go.
  • Cached Thoughts and Pirsig's Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.
  • The Litany of Gendlin and the Litany of Tarski:  People can stand what is true, for they are already enduring it.  If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.  You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to, without excuses, come to the full understanding that if there is no God then they will be better off believing there is no God.  If one cannot come to accept this on a deep emotional level, they will not be able to have a crisis of faith.  So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it.  Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist's view of the universe.
  • Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.
  • The Genetic Heuristic:  You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right.  (E.g., the one concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)
  • The Importance of Saying "Oops"—it really is less painful to swallow the entire bitter pill in one terrible gulp.
  • Singlethink, the opposite of doublethink.  See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them.  If you become aware of what you are not thinking, you can think it.
  • Affective Death Spirals and Resist the Happy Death Spiral.  Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose.  But since affective death spirals can also get started around real things that are genuinely nice, you don't have to admit that your belief is a lie, to try and resist the halo effect at every point—refuse false praise even of genuinely nice things.  Policy debates should not appear one-sided.
  • Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil's point of view.
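The Conservation of Expected Evidence point above can be checked numerically.  This is a minimal sketch with invented numbers (the prior, the likelihoods, and the biased-coin framing are all hypothetical illustrations): whatever you might observe, your posteriors weighted by the probability of each observation must average back to your prior, which is why you can be genuinely curious about each new piece of evidence.

```python
# Conservation of Expected Evidence: E[P(H|E)] over outcomes E equals P(H).
# Toy example (invented numbers): H = "this coin is biased toward heads".
p_h = 0.3                      # prior P(H)
p_heads_given_h = 0.8          # P(heads | biased)
p_heads_given_not_h = 0.5      # P(heads | fair)

# Marginal probability of observing heads
p_heads = p_h * p_heads_given_h + (1 - p_h) * p_heads_given_not_h

# Posterior after each possible observation (Bayes' rule)
post_if_heads = p_h * p_heads_given_h / p_heads
post_if_tails = p_h * (1 - p_heads_given_h) / (1 - p_heads)

# Expected posterior, weighted by how likely each observation is
expected_posterior = p_heads * post_if_heads + (1 - p_heads) * post_if_tails

print(expected_posterior)  # equals the prior, 0.3
```

Heads would shift you up (to about 0.41) and tails would shift you down (to about 0.15), but the probability-weighted average of the shifts is exactly zero.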

And a number of other standard techniques are particularly relevant here as well.

But really there's rather a lot of relevant material, here and there on Overcoming Bias.  The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes.  It's when you get a chance to use your skills that you've been practicing for so long, all-out against yourself.

I wish you the best of luck against your opponent.  Have a wonderful crisis!

 

Part of the Letting Go subsequence of How To Actually Change Your Mind

Next post: "The Ritual"

Previous post: "Leave a Line of Retreat"

Comments


This is an unusually high quality post, even for you Eliezer; congrats!

So here I am having been raised in the Christian faith and trying not to freak out over the past few weeks because I've finally begun to wonder whether I believe things just because I was raised with them. Our family is surrounded by genuinely wonderful people who have poured their talents into us since we were teenagers, and our social structure and business rests on the tenets of what we believe. I've been trying to work out how I can 'clear the decks' and then rebuild with whatever is worth keeping, yet it's so foundational that it will affect my marriage (to a pretty special man) and my daughters who, of course, have also been raised to walk the Christian path.

Is there anyone who's been in this position - really, really invested in a faith and then walked away?

Quite a few. Dan Barker was a Christian minister before he walked away. But the truth is, it is much harder to walk away from a religion when one is married and has a family. And sometimes, it can destroy families to even voice doubts.

Christianity isn't the only religion that has this aspect. Among Orthodox Jews there's a common refrain that too many are leaving the faith and a standard suggested solution for this is to make kids marry earlier because once they are married they are much more likely to stay in the faith.

But whenever this sort of thing comes up it is important to ask how much the social structures really depend on the religion. Will your husband love you less if you tell him you don't believe? Will your friends no longer be friendly? Will they stop providing social support? And if they would stop being friendly on such a basis, what made you see them as genuine friends in the first place?

There's no question that these issues are deep and difficult and should probably be handled slowly. I'd recommend sending a version of your question to The Friendly Atheist; one of the writers there has a column (Ask Richard) where he regularly answers questions much like yours, and if your question gets posted it is likely to get a large amount of input in the comment threads from people who went through similar circumstances. (It might be worth looking in the archives as well, to see if they've had similar letters in the past. I think they have, but I don't have a link to one off the top of my head.)

Atheism is believing that the state of evidence on the God question is similar to the state of evidence on the werewolf question.

For the past three days I have been repeatedly performing the following mental operation:

"Imagine that you never read any documents claimed to be produced by telepathy with extraterrestrials. Now gauge your emotional reaction to this situation. Once calm, ask yourself what you would believe about the world in this situation. Would you accept materialism? Or would you still be seeking mystical answers to the nature of reality?"

I am still asking myself this question. Why? I am struggling to figure out whether or not I am wrong.

I believe things that raise a lot of red flags for "crazy delusion." Things like:

"I came from another planet, vastly advanced in spiritual evolution relative to Earth, in order to help Earth transition from the third dimension to the fourth dimension. My primary mission is to generate as much light and love as possible, because this light and love will diffuse throughout Earth's magnetic fields and reduce the global amount of strife and suffering while helping others to achieve enlightenment. I am being aided in this mission by extraterrestrials from the fourth dimension who are telepathically beaming me aid packages of light and love."

These beliefs, and many others like them, are important to my worldview and I use them to decide my actions. Because I like to think of myself as a rational person, it is a matter of great concern to me to determine whether or not they are true.

I have come across nobody who can put forth an argument that makes me question these beliefs. Nobody except for one person: Eliezer Yudkowsky. This man did what no other could: he made me doubt my basic beliefs. I am still struggling with the gift he gave me.

This gift is that he made me realize, on a gut level, that I might be wrong, and gave me motivation to really figure out the truth of the matter.

So many intelligent people believe patently absurd things. It is so difficult to escape from such a trap once you have fallen into it. If I am deluded, I want to be one of the fortunate ones who escaped from his insanity.

The thing is, I really don't know whether or not I am deluded. I have never before been so divided on any issue. Does anybody have anything they'd like to add, which might stimulate my thinking towards resolving this confusion?

There are several things to ask about beliefs like this:

  1. Do they make internal sense? (e.g. "What is the fourth dimension?")

  2. Do they match the sort of evidence that you would expect to have in the case of non-delusion? (e.g. "Do you have any observable physical traits indicating your extraterrestrial origin? Would someone looking into records of your birth find discrepancies in your records indicating forgery?")

  3. Do they try to defend themselves against testing? (e.g. "Do you expect to illuminate a completely dark room at night by generating light? Would you expect to exist happily in psychological conditions that would harm normal humans by subsisting on aid packages full of love?")

  4. Do they have explanatory power? (e.g. "Has there, as a matter of historical fact, been a sudden and dramatic reduction in global strife and suffering since the date of your supposed arrival?")

  5. Do they have a causal history that can be reasonably expected to track with truth across the entire reference class from an outside view? (e.g. "Did you receive your information via private mental revelation or a belief from as long ago as you can remember, similar to the beliefs of people you do consider crazy?")

Hi, Alicorn!

  1. Yes. They are drawn from the material at http://lawofone.info/ . The philosophy presented there is internally consistent, to the best of my understanding.

  2. There is no physical evidence. All of the "evidence" is in my head. This is a significant point.

  3. There are a variety of points in the source document which could be interpreted as designed to defend its claims against testing. This is a significant point.

  4. I am not aware of any physically testable predictions that these beliefs make. This is a significant point.

  5. The causal history of these beliefs is that I read the aforementioned document, and eventually decided that it was true, mainly on the basis of the fact that it made sense to my intuition and resonated personally with me. This is a significant point.

Thanks for asking!

@Anna:

I mean that you've given up trying to be clever.

@Vassar:

The position that people may be better off deluded in some situations is VERY compelling.

The position that people may be optimally deluded, without a third alternative, is much less compelling.

The position that realistic human students of rationality can be trying to do their best (let alone do the impossible), while trying to deliberately self-delude, strikes me as outright false. It would be like trying to win a hot-dog eating contest while keeping a golf ball in your mouth.

It is this outright falsity that I refer to when I say that by the time you attempt to employ techniques at this level, you should already have given up on trying to be clever.

As someone once said to Brennan:

She reared back in mock-dismay. "Why, Brennan, surely you don't expect me to just tell you!"

Brennan gritted his teeth. "Why not?"

"What you're feeling now, Brennan, is called curiosity. It's an important emotion. You need to learn to live with it and draw upon its power. If I just give you the information, why, you won't be curious any more." Her eyes turned serious. "Not that you should prefer ignorance. There is no curiosity that does not want an answer. But, Brennan, tradition doesn't say I have to hand you knowledge on a silver platter."

It's easy to visualize Jeffreyssai deciding to not say something - in fact, he does that every time he poses a homework problem without telling the students the answer immediately. Can you visualize him lying to his students? (There are all sorts of clever-sounding reasons why you might gain a short-term benefit from it. Don't stop thinking when you come to the first benefit.) Can you imagine Jeffreyssai deliberately deciding that he himself is better off not realizing that X is true, therefore he is not going to investigate the matter further?

Clearly, if everyone was always better off being in immediate possession of every truth, there would be no such thing as homework. But the distinction between remaining silent, and lying, and not wanting to know the truth even for yourself, suggests that there is more at work here than "People are always better off being in immediate possession of every truth."

Bo, the point is that what's most difficult in these cases isn't the thing that the 10-year-old can do intuitively (namely, evaluating whether a belief is credible, in the absence of strong prejudices about it) but something quite different: noticing the warning signs of those strong prejudices and then getting rid of them or getting past them. 10-year-olds aren't specially good at that. Most 10-year-olds who believe silly things turn into 11-year-olds who believe the same silly things.

Eliezer talks about allocating "some uninterrupted hours", but for me a proper Crisis of Faith takes longer than that, by orders of magnitude. If I've got some idea deeply embedded in my psyche but am now seriously doubting it (or at least considering the possibility of seriously doubting it), then either it's right after all (in which case I shouldn't change my mind in a hurry) or I've demonstrated my ability to be very badly wrong about it despite thinking about it a lot. In either case, I need to be very thorough about rethinking it, both because that way I may be less likely to get it wrong and because that way I'm less likely to spend the rest of my life worrying that I missed something important.

Yes, of course, a perfect reasoner would be able to sit down and go through all the key points quickly and methodically, and wouldn't take months to do it. (Unless there were a big pile of empirical evidence that needed gathering.) But if you find yourself needing a Crisis of Faith, then ipso facto you aren't a perfect reasoner on the topic in question.

Wherefore, I at least don't have the time to stage a Crisis of Faith about every deeply held belief that shows signs of meriting one.

I think there would be value in some OB posts about resource allocation: deciding which biases to attack first, how much effort to put into updating which beliefs, how to prioritize evidence-gathering versus theorizing, and so on and so forth. (We can't Make An Extraordinary Effort every single time.) It's a very important aspect of practical rationality.

Matthew C, I read the introduction and chapters 1 and 3. Are you sure you meant chapter 3? It does not seem to say what you think it says. Most of it is a description of placebos and other psychosomatic effects. It also discusses some events that are unlikely in isolation but seem trivially within the realm of chance given 100 years and approaching 7 billion people. There is also a paragraph with no numbers saying it can't just be chance.

It feels kind of like asking everyone in the country to flip a coin 25 times, then calling the 18 or so people who have continuous streaks psychics. And ignoring that all-heads and all-tails both count. And maybe also counting the people who got HHHHHTTTTTHHHHHTTTTTHHHHH or HTHTHTHTHTHTHTHTHTHTHTHTH or such. Survivorship and publication bias and all that.
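The arithmetic behind the coin analogy checks out. Here is a quick sketch (the population size is an assumed round number for a large country) of how many 25-flip all-heads or all-tails streaks pure chance predicts:

```python
# How many "psychics" does pure chance produce?
population = 300_000_000      # assumed round number for a large country
flips = 25

p_all_heads = 0.5 ** flips    # probability of one specific 25-flip streak
p_streak = 2 * p_all_heads    # all-heads OR all-tails

expected_streaks = population * p_streak
print(round(expected_streaks, 1))  # roughly 18, matching the comment's figure
```

With no psychic powers anywhere, you still expect about eighteen people to produce a perfect streak; counting "meaningful" patterns like alternating heads and tails only inflates that number further.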

There were a few things that might have fallen outside those obvious mistakes, but given the quality of analysis, I did not feel a pressing need to check that they reported their sources properly, that their sources reported theirs properly, and that there was no deception etc. involved. This Stevenson fellow might be worth pursuing, but it seems likely that he is just the archivist of the one-in-a-million events that continuously happen with billions of people. I feel compelled to read on, however, by the promise that not only identity but also skills can survive bodily death. I am picturing a free-floating capacity for freecell or ping-pong, just looking for somewhere to reincarnate. Sadly, I do not expect the text to be that fun.

Years later, I'm reading this.

I've been reading through the Major Sequences, and I'm getting towards the end of "How to Actually Change Your Mind" with this growing sense of unease that there are all these rational principles I could use, but I know not to what end. I think I might finally have found one: my political convictions.

I'm not going to say which direction I lean, but I lean that direction very strongly. My friends lean that way. My parents do not lean that way as much as I do but they are in that general direction.

And I realize that I may not lean that way because it is the rational way to approach a well-run country, but because I am merely used to it.

So perhaps, one of these weeks, I will sit down and have a long hard think on my politics. I think I would prefer to stay leaned the way I currently am - but I would, wouldn't I.

"How do I know if long-held belief X is false?"

Eliezer, I guess if you are already asking this question you are well on your way. The real problem arises when you haven't even managed to pinpoint the possibly false belief. And yes, I was a religious person for many years before realizing that I was on the wrong way.

Why didn't I question my faith? Well, it was so obviously true to me. The thing is: did you ever question heliocentrism? No? Why not? When you ask the question "How do I know if Heliocentrism is false?" You are already on your way. The thing is, your brain needs a certain amount of evidence to pinpoint the question.

How did I overcome my religion? I noticed that something was wrong with my worldview, like seeing a déjà vu in the matrix every now and then. This was on an intellectual level, not as a visible thing, but much more subtle and less obvious, so you really have to be attentive to notice it, to notice that there is a problem in the pattern. Things aren't the way they should be.

But over time I became more and more aware that the pieces weren't fitting together. But from there to arrive at the conclusion that my basic assumptions were wrong was really not easy. If you live in the matrix and see strange things happening, how will you arrive at the conclusion that this is because you are in a simulation?

Your posts on rationality were a big help, though. They always say, "Jesus will make you free." Unfortunately that didn't work out for me. Well, I finally am free after a decade of false belief, and during all the time I was a believer I was never as happy as I am now.

It seems that it takes an Eliezer-level rationalist to make an explicit account of what any ten-year-old can do intuitively. For those not quite Eliezer-level or not willing to put in the effort, this is really frustrating in the context of an argument or debate.

@Matthew C.

Do you mean by "remote staring experiments" those of Wiseman/Schlitz?

It seems that when properly controlled, they produced no statistically significant effect: http://forums.randi.org/archive/index.php/t-43727.html

I don't need to read the book. I believe that psi effects are not real, because if they were, they would already be part of accepted science.

It's not a matter of being closed-minded or open-minded. I'm just not accepting your book author as a legitimate authority. Most things I believe, I believe because they are asserted by authorities I accept. For example, I have never personally seen an experiment performed that establishes that the Sun is made primarily of hydrogen and helium, that an atom of gold contains seventy-nine protons, that George Washington was the first President of the United States, that light is quantized, or many other things I learned in school.

My criterion is simple: on matters in which I have no special expertise or direct knowledge, I simply accept the view of the majority of those I consider legitimate experts. If you want to persuade me that "psi" is real, go persuade the Nobel Prize committee; anyone who can establish it through controlled, repeatable experiments would certainly be deserving of the Nobel Prize in Physics.

In other words, I rely on people like James Randi and Joe Nickell to do the investigating for me. Convince them, and I'll believe in psi. Until then, don't go shoving your data in my face, because I'll just conclude that your data, or your interpretation of it, is wrong.

"I am better off deluding myself into believing my cherished late spouse was faithful to me."

1) Do you believe this is true for you, or only other people?

2) If you know that someone's cherished late spouse cheated on them, are you justified in keeping silent about the fact?

3) Are you justified in lying to prevent the other person from realizing?

4) If you suspect for yourself (but are not sure) that the cherished late spouse might have been unfaithful, do you think that you will be better off, both for the single deed, and as a matter of your whole life, if you refuse to engage in any investigation that might resolve your doubts one way or the other? If there is no resolving investigation, do you think that exerting some kind of effort to "persuade yourself", will leave you better off?

5) Would you rather associate with friends who would (a) tell you if they discovered previously unsuspected evidence that your cherished late spouse had been unfaithful, or who would (b) remain silent about it? Which would be a better human being in your eyes, and which would be a better friend to you?

I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like "if a belief is false, you are better off knowing that it is false".

Of course I deliberately did not qualify it. Frankly, if you're still qualifying the statement, you're not the intended audience for a post about how to make a convulsive effort to be rational using two dozen different principles.

My father is an atheist with Jewish parents, and my mother is a (non-practicing) Catholic. I was basically raised "rationalist", having grown up reading my father's issues of Skeptical Inquirer magazine. I find myself in the somewhat uncomfortable position of admitting that I acquired my belief in "Science and Reason" in pretty much the same way that most other people acquire their religious beliefs.

I'm pretty sure that, like everyone else, I've got some really stupid beliefs that I hold too strongly. I just don't know which ones they are!

"I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like 'if a belief is false, you are better off knowing that it is false'."

It is even possible that some overoptimistic transhumanists/singularitarians are better off, by their own standards, remaining deluded about the potential dangers of technology. You have the luxury of being intelligent enough to be able to utilize your correct belief about how precarious our continued existence is becoming. For many people, such a belief is of no practical benefit yet is psychologically detrimental.

This creates a "tragedy of the commons" type problem in global catastrophic risks: each individual is better off living in a fool's paradise, but we'd all be much better off if everyone faced up to the dangers of future technology.

Eliezer: The position that people may be better off deluded in some situations is VERY compelling. If your audience is people who are literally NEVER better off deluded then I sincerely doubt that it includes you or anyone else. Obviously not every belief need receive all appropriate qualifications every single time, but when someone else points out a plausible qualification you should, as a rationalist, acknowledge it.

I'm very open to Anna's (A1), especially given the special difficulties of this sort of investigation, but only with respect to oneself. I would expect that someone as smart as me, who knew me well enough, would some day come upon a situation where I should, by my own values, be deceived, at least for some period.

MichaelG:

On the other hand, I don't think a human race with nerds can forever avoid inventing a self-destructive technology like AI.

The idea is that if we invent Friendly AI first, it will become powerful enough to keep later, Unfriendly ones in check (either alone, or with several other FAIs working together with humanity). You don't need to avoid inventing one forever: it's enough to avoid inventing one as the first thing that comes up.

Eliezer:

"If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it."
I think you should try applying your own advice to this belief of yours. It is usually true, but it is certainly not always true, and reeks of irrational bias.

My experience with my crisis of faith seems quite opposite to your conceptions. I was raised in a fundamentalist family, and I had to "make an extraordinary effort" to keep believing in Christianity, from the time I was four, when I started reading through the Bible and finding things that were wrong, to the time I finally "came out" as a non-Christian around the age of 20. I finally gave up being Christian only when I was worn out and tired of putting forth such an extraordinary effort.

So in some cases your advice might do more harm than good. A person who is committed to making "extraordinary efforts" concerning their beliefs is more likely to find justifications to continue to hold onto their belief, than is someone who is lazier, and just accepts overwhelming evidence instead of letting it kick them into an "extraordinary effort." In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.

From "Twelve virtues of rationality" by Eliezer:

The third virtue is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting; the instant you can see from which quarter the winds of evidence are blowing against you.

Eliezer uses almost the same words as you do. (Oh, and this document is from 2006, so he has not copied your lines.) Some posts earlier, Eliezer accused you of not reading his writings and just making stuff up regarding his viewpoints...

Agreed.

Every time I changed my mind about something, it felt like "quitting," like ceasing the struggle to come up with evidence for something I wanted to be true but wasn't. Realizing "It's so much easier to give up and follow the preponderance of the evidence."

Examples: taking an economics class made it hard to believe that government interventions are mostly harmless. Learning about archaeology and textual analysis made it hard to believe in the infallibility of the Bible. Hearing cognitive science/philosophy arguments made it hard to believe in Cartesian dualism. Reading more papers made it hard to believe that looking at the spectrum of the Laplacian is a magic bullet for image processing. Extensive conversations with a friend made it hard to believe that I was helping him by advising him against pursuing his risky dreams.

When something's getting hard to believe, consider giving up the belief. Just let the weight fall. Be lazy. If you're working hard to justify an idea, you're probably working too hard.

Matthew C.,

You've been suggesting that for a while:

http://www.overcomingbias.com/2007/01/godless_profess.html#comment-27993437
http://www.overcomingbias.com/2008/09/psychic-powers.html#comment-130445874

Those who have read it (or the hundreds of pages available on Google Books, which I have examined) don't seem to be impressed.

Why do you think it's better than Broderick's book? If you want to promote it more effectively in the face of silence (http://www.overcomingbias.com/2007/02/what_evidence_i.html), why not pay for a respected reviewer's time and a written review (in advance, so that you're not accused of bribing to ensure a favorable view)? Perhaps from a statistician?

"Try to think the thought that hurts the most."

This is exactly why I like to entertain religious thoughts. My background, training, and inclination are to be a thoroughgoing atheist materialist, so I find that trying to make sense of religious ideas is good mental exercise. Feel the burn!

In that vein, here is an audio recording of Robert Aumann speaking on "The Personality of God".

Also, the more seriously religious had roughly the same idea, or maybe it's the opposite idea. The counterfactuality of religious ideas is part of their strength, apparently.