
What Curiosity Looks Like

See also: Twelve Virtues of Rationality, The Meditation on Curiosity, Use Curiosity

What would it look like if someone was truly curious — if they actually wanted true beliefs? Not someone who wanted to feel like they sought the truth, or to feel their beliefs were justified. Not someone who wanted to signal a desire for true beliefs. No: someone who really wanted true beliefs. What would that look like?

A truly curious person would seek to understand the world as broadly and deeply as possible. They would study the humanities but especially math and the sciences. They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.

They would practice truth-seeking skills as a musician practices playing her instrument. They would practice "debiasing" techniques for reducing common thinking errors. They would seek out contexts known to make truth-seeking more successful. They would ask others to help them on their journey. They would ask to be held accountable.

They would cultivate that burning itch to know. They would admit their ignorance but seek to destroy it.

They would be precise, not vague. They would be clear, not obscurantist.

They would not flinch away from experiences that might destroy their beliefs. They would train their emotions to fit the facts.

They would update their beliefs quickly. They would resist the human impulse to rationalize.

But even all this could merely be a signaling game to increase their status in a group that rewards the appearance of curiosity. Thus, the final test for genuine curiosity is behavioral change. You would find a genuinely curious person studying and learning. You would find them practicing the skills of truth-seeking. You wouldn't merely find them saying, "Okay, I'm updating my belief about that" — you would also find them making decisions consistent with their new belief and inconsistent with their former belief.

Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques. They do not regularly practice the skills of truth-seeking. They don't seem to say "oops" very often, and they change their behavior even less often. I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.

Dare I say it? Few people look like they really want true beliefs.

Comments


They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.

Not necessarily. Hindsight bias is likely at work here. You know that studying these fields helped you to acquire better beliefs, and so you conclude that this consequence should be obvious. But unless a curious but untrained reasoner somehow finds out that studying these fields will help them, we shouldn't expect them to study them. Why on earth would someone try to read The Logic of Science if they didn't already know that it would improve their reasoning skills?

There are a lot more genuinely curious people out there than there are rationalists. But unless those curious people happen to meet a LWer, or stumble across a link to this site, their chances of learning the benefits of studying these subjects are not great. There are a few books that might get them started (Robyn Dawes is coming to mind), but how likely is it that they're going to stumble across one of those books, especially if they aren't explicitly interested in the field of human thought already (like EY?).

I would bet that there are a lot of genuinely curious people out there who have realized that thinking is a skill. But if you were to ask them for the best way they knew of to improve that skill, they would say something along the lines of "sudoku puzzles". And that's pretty sad.

I agree in part, though this excuse was stronger before Google. Now people can Google "how to think better" or "how to figure out what's true" and start looking around. One thing leads to another. Almost all the stuff I mention above is discussed in many of the textbooks on thinking and deciding — like, say, Thinking and Deciding.

I tried typing those queries (and related ones) into Google, to see if someone could easily find some sort of starting point for rationality. "How to think better" yields many lists of tips that are mediocre at best (things like: exercise, become more curious, etc). About halfway down the page, interestingly, is a post on CSA, but it's not a great one. It seems to mostly say that to get better at thinking you first have to realize that you are not naturally a fantastic thinker. This is true, but it's not something that points the way forward towards Bayesian rationality. (By the way, "how to figure out what's true" provides essentially nothing of value, at least on the first page.)

In order for someone to go down the path you've identified on their own, as a curious individual, they would have to have a substantial amount of luck to get started. Either they would have to have somehow stumbled upon enough of an explanation of heuristics and biases that they realized the importance of them (which is a combination of two fairly unlikely events), or they would have to be studying those subjects for some reason other than their instrumental value. Someone who started off curiously studying AI would have a much better chance at finding this path, for this reason. AI researchers, in this instance, have a tremendous advantage when it comes to rationality over researchers in the hard sciences, engineers, etc.

I'm not an expert, but with this in mind it should be a rather simple matter to apply a few strategies so that LW shows up near the top of relevant search results. At the very least we could create wiki pages with titles like "How to Think Better" and "How to Figure Out What's True" with links to relevant articles or sequences. The fact that rationality has little obvious commercial value should work in our favor by keeping competing content rather sparse.

When I search for keyword: rationality, I get HPMoR for #2, yudkowsky.net for #5, and What Do We Mean By "Rationality"? for #7. Not sure how much my search history is affecting this.

Is rationality a common enough word that people would naturally jump to it when trying to figure out how to think better? I'm not sure how often I used it before Less Wrong, but I know that it is substantially more commonplace after reading the sequences.

Thanks MinibearRex.

I've added ads on Google AdWords that will start coming up for this in a couple days when the new ads get approved so that anyone searching for something even vaguely like "How to think better" or "How to figure out what's true" will get pointed at Less Wrong. Not as good as owning the top 3 spots in the organic results, but some folks click on ads, especially when it's in the top spot. And we do need to make landing on the path towards rationality less of a stroke of luck and more a matter of certainty for those who are looking.

"Exercise" is really not a mediocre tip at all.

You're right; mediocre is not the best word for what I meant there. Humans generally function better when they exercise. But it doesn't fundamentally change the way people think. If we use a car metaphor, exercise is things like changing the oil and keeping it well tuned. It can make a big difference. But not as big of a difference as upgrading the engine.

I tried typing those queries (and related ones) into google, to see if someone could easily find some sort of starting point for rationality.

Upvoted for actually trying it out.

So, someone would google "how to think better", find a $38.90 book by an author they've never heard of before, and buy it without suspecting it to be self-help nonsense?

I'd handle shame-flavored incentives with tongs. It's plausible that I have an unusual degree of sensitivity on the subject, but I'm making progress on a very bad case of self-hatred and akrasia, and "is my curiosity good enough?" strikes me as a sort of self-alienation which takes focus away from paying attention to whatever you might be curious about.

"What might I be missing about this?", "How can I increase my enthusiasm for learning?", "How can I spend less time on errors while still taking on difficult projects?" seem much safer. "What am I doing to improve my life? Is it having the desired effect?" should probably be on the list.

I'm having difficulty knowing what level of rationalist this is aimed at. Are the people you talk to every week students of rationality, or 'normal' people?

This post applies to both, I imagine. But because you talk about "people" instead of explicitly talking about people like me, it's easy to see this post as not being aimed at me. (Maybe it's not).

What I mean is: It's easy to praise oneself and one's peers by talking about people of a lower class. When I was young, it was 'dumb people', when I was a bit more sophisticated it was 'theists', when I was an Objectivist, it was 'non-Objectivists', and now that I'm a rationalist the temptation is to criticize those who "know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques." So this post, because it isn't clearly directed at people who have worked hard to do better in the ways prescribed by the Sequences, causes my semiconscious mind to ask: "is this a beginning level post, or something I should actually pay attention to?" Are you telling me to do better, or criticizing outsiders in order to promote group bonding?

Of course my rider knows I should pay attention; I must always work harder to cultivate the virtues. And I don't actually expect that you're just trying to promote group bonding. But what I really want - and what this post may not have been designed for - is an honest criticism of people you see who are a lot like me, in that they have access to all the same correct memes as I have, and they have exerted effort to improve in the appropriate areas.

Also, they would seek to personally become an immortal super-intelligence, since many truths simply can't be learned by an unenhanced human, and certainly not within a human lifetime.

(Which is why the Yudkowsky-Armstrong Fun-Theoretic Utopia leaves me cold. Would any curious person not choose to become superintelligent and "have direct philosophical conversations with the Machines" if the only alternative is to essentially play the post-Singularity equivalent of the World of Warcraft?)

On a grand scale, my hunger for truths is probably as limited and easy to satisfy as my hunger for cheeseburgers. I do feel that in a post-Singularity world I'd want to enhance my intelligence, but the underlying motivation seems to be status-seeking, a desire to be significant.

Something I learned viscerally while I was recovering from brain damage is that intelligence is fun. I suspect I'd want to enhance my intelligence in much the same way that I'd want to spend more time around puppies.

Context matters, I suspect. I don't think that having a 140 IQ would be all that fun if everyone one interacted with on a daily basis was in the 90-100 range.

edited to depersonalize the pronoun.

On a grand scale, my hunger for truths is probably as limited and easy to satisfy as my hunger for cheeseburgers.

I have very good reasons to think that my hunger for cheeseburgers is limited and easy to satisfy (e.g., ample evidence from past consumption/satiation of various foods including specifically cheeseburgers). On the other hand, there seems to be good reason to suspect that if my appetite for truths is limited, the satiation level comes well after what can be achieved at human intelligence level and within a human lifetime (e.g., there are plenty of questions I want answers to that seem very hard, and every question that gets answered seems to generate more interesting and even harder questions).

(It's an interesting question whether all my questions could be answered within 1 second after the Singularity occurs, or if it would require more than the resources in our entire light cone, or something in between, but the answer to that doesn't affect my point that a curious person would seek to become superintelligent.)

I do feel that in a post-Singularity world I'd want to enhance my intelligence, but the underlying motivation seems to be status-seeking, a desire to be significant.

If Omega offered to enhance your intelligence and/or answer all your questions, but for your private benefit only (i.e., you couldn't tell anyone else or otherwise use your improved intelligence/knowledge to affect the world), would you not be much interested?

It's an interesting question whether all my questions could be answered within 1 second after the Singularity occurs, or if it would require the more than the resources in our entire light cone, or something in between, but the answer to that doesn't affect my point that a curious person would seek to become superintelligent.

Are you at all curious about what the 3^^^3rd digit of Pi is?

Dare I say it? Few people look like they really want true beliefs.

I think otherwise - most people want to have true beliefs. However, they have rather limited trust in the powers of their own logic, as the experience of school has taught them that they are often wrong. They don't have the numerical skills to embark on anything more numerically ambitious than what money requires. They expect to be wrong often, and rarely use formal reason as such. But they still want to have true beliefs, and rely mostly on intuition and experience to decide on that.

For most people, most beliefs are socially acquired - people acquire their beliefs from the people around them, and they tend to acquire large blocks of belief together. One shouldn't underestimate the sheer amount of work needed to do anything different.

Most people never create a new idea (in the sense you're talking about) in their entire lives - they have experiences, yes, and they change beliefs based on experience. But they do not regard themselves as having the basic equipment to generate ideas, or to be sophisticated in judging between them.

In the end I've come to the view that none of us can change this (well, not anytime soon at any rate). Human beings think in groups - and the best most of us can do to help others think better is to do it for them, and talk about it sometimes. Obviously there is a group of people who can do more than that, but they are a minority.

The other comment I have on your post is that all of the above is actually just one idea. Which is that the basis of all knowledge about the world is inductive reasoning, and, in principle, all such reasoning should be based on the sound use of statistics. There are many mathematical ways of messing up such use, and our intuitive reasoning messes up these stats too. If you really need the right answer, you will need to learn enough to get your statistics right, and compensate for the shortcomings of your wetware. And that's the whole idea in a nutshell.

An example. Did you know that brakes are the most dangerous piece of equipment on your car? In a staggeringly large number of accidents, the driver was using the brake at the time of the collision. Surely we could make driving safer by removing the brakes, then? Of course the thesis is ludicrous, but how many of us are confident that we wouldn't make a similar statistical mistake in a different context? But once you embark on the journey of trying to fix such problems in your own thinking, the road leads all the way to the place your post describes.
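The arithmetic behind that fallacy is worth spelling out. Here is a minimal sketch with made-up numbers (purely illustrative, not real accident statistics) showing how braking can appear in most collisions even while it sharply reduces the chance of a collision:

```python
# Toy numbers, purely illustrative -- not real accident data.
p_brake = 0.9                 # drivers brake in 90% of dangerous situations
p_crash_given_brake = 0.1     # with braking, 10% of those situations end in a crash
p_crash_given_no_brake = 0.5  # without braking, 50% end in a crash

# Overall crash probability per dangerous situation (law of total probability).
p_crash = (p_brake * p_crash_given_brake
           + (1 - p_brake) * p_crash_given_no_brake)

# Fraction of crashes in which the driver was braking (Bayes' theorem).
p_brake_given_crash = p_brake * p_crash_given_brake / p_crash

print(f"{p_brake_given_crash:.0%} of crashes involve braking")  # ~64%
print(p_crash_given_brake < p_crash_given_no_brake)             # True: braking still helps
```

The confusion is between P(braking | crash), which is high simply because almost everyone brakes, and P(crash | braking), which is what actually measures whether brakes are dangerous.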

Most people think the journey is beyond them, and leave it to people like you and Eliezer to make the journey for them, and report back your findings. And unfortunately I don't think they're wrong about that.

However, they have rather limited trust in the powers of their own logic, as the experience of school has taught them that they are often wrong. They don't have the numerical skills to embark on anything more numerically ambitious than what money requires.

I believe the situation is a good bit worse than that. One of the underlying lessons of conventional schooling is "You can't be trusted to think about what you need to know."

This is a bit like a member of the Kalenjin tribe irritatedly asking me as I gasp and wheeze and double over:

"Why can't you just, like, run faster?"

But it's also really well-written and concise and valuably motivating (by way of shaming).

You're doing that thing where you write like Yudkowsky again. It's kind of hot.

And yet the conclusion is so...Hansonesque.

Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques...I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.

Really? What percent of people are aware of the existence of cognitive biases? One percent? At least I wouldn't expect more than that to realize that probability theory or artificial intelligence bear upon questions in seemingly unrelated fields like philosophy or medicine.

And of people who know of the existence of cognitive biases, how many are even capable of genuinely entertaining the thought that they themselves might be biased, as opposed to Rush Limbaugh or unethical pharmaceutical researchers or all those silly people who disagree with them?

And of people who are worried about cognitive biases, how many have access to "debiasing techniques"? I'm not going to put a percent on this one because it's pretty vague, but outside of Less Wrong and a few ahead-of-the-game finance companies, you can't exactly go on Amazon and buy Debiasing for Dummies.

I think I agree with the conclusion (well, maybe, since I don't know enough psychodynamics to really be able to cash out a phrase like "their actions show that they aren't trying very hard") but this particular argument breaks Hanlon's Razor aka the Generalized Anti-Hanson Principle.

this particular argument breaks Hanlon's Razor aka the Generalized Anti-Hanson Principle.

Cute, but I'm not sure I would call a Hansonian interpretation "malicious". Maybe "differently optimized".

I'd reserve malice for active manipulation, not status-seeking.

FWIW I linked to this through my Twitter feed and got a very negative reaction from my friends, though the reasons they give for disliking it are pretty varied; they said things like

  • it's so arrogant!
  • It's absolutist - not everyone will have time to study these things
  • but what is "truth" really?
  • what about my-favourite-discipline? (Physics in one case)

One person has promised to write up what they felt in a blog post, which I look forward to reading.

They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs.

Really? The others make sense, but it's not clear this will be useful to a human trying to learn things themselves. If I want to notice patterns, "plug all of your information into a matrix and perform eigenvector decompositions" is probably not going to get me very far.

I think you are confusing wanting to know with being good at it.

Imagine someone in the Stone Age: would you say none of them was genuinely curious, just because they didn't know about all those fields, which hadn't been invented yet?

Then, what about someone living in our world, but not knowing about Bayesian reasoning, AI, ...? How can they know that those fields are fundamental to learn, in order to satisfy their curiosity about another field, before at least learning the basics of them? When you don't know about Bayes' theorem, but you are curious (you really want to know the truth) about, say, ancient Roman history, or about whether there was ever life on Mars, what would drive you to learn probability theory? How can you know you must learn one thing in order to learn another, when you don't know much about that other thing?

Sure, if you are curious, you'll want to learn about all fields. But since we have a limited amount of time, you can't expect someone to learn Bayesian reasoning, even if he's really curious, unless there is some kind of trigger that makes him realize how useful it would be to be efficient in being curious.

Genuinely wanting something and being good at pursuing it are not directly linked. You can't say someone doesn't really want something just because he doesn't pursue it in an efficient way.

Do "curious" people want to learn already-discovered truths or to discover heretofore unknown ones? You seem to conflate the two. Data on the statistical correlation between these distinct motives would be interesting, but I doubt most scientists are primarily concerned with personally accumulating true beliefs. Preparing to make a contribution to human knowledge probably looks a lot different from preparing to absorb the greatest mass of truths. It probably also looks different from preparing to function rationally as far as quotidian beliefs go.

I'm not sure there's an overarching "curiosity" that people have or don't have: I'm very curious about whether a specific kind of database will perform adequately in certain circumstances (long story) but I'm only mildly curious about how to identify which 19th-century French painter painted which picture. Some art experts, I'm sure, have cultivated the skill to guess within seconds which painter it is for every picture. I wouldn't mind having that skill -- it sounds like a fun skill to have -- but acquiring it would take more resources than it's worth. OTOH, I really want my probability estimations re: the database to reflect reality. Do I need to use AI theory? Doubtful. Probably a little bit of statistics, and even that fairly mild, but I do have to think a lot about how to use my knowledge of databases to design experiments to find out the truth. I'm not sure if that would look "curious" to the lay person (and, of course, there's also a factor of "signaling curiosity" -- I want to make sure that everyone with a stake in the process sees that I've done the due diligence), but nonetheless, I'm truly curious about this (and yes, it could go both ways... I think this is the most important part of curiosity vs. fake curiosity).

When I was genuinely curious about how the US immigration laws applied to me (and again, it could have gone both ways -- before running any experiments, I made sure to visualize both options, and realized I could live with both) I just called an immigration lawyer (and, for a later question, the paralegal who was involved with my visa). In that case I needed very little knowledge from LW -- I didn't apply my knowledge about Bayes, or about heuristics and biases, and just went and asked a professional (of course, in some cases, like wanting to know if a stock will go up, asking a professional is disastrous, but with immigration law, lawyers can estimate probabilities fairly accurately even if they lack formal rationality training).

Those were the two examples of real curiosity from my life that I could think of, that looked nothing like the description here of "real curiosity"....