Why Don't Rationalists Win?

Here are my thoughts on the "Why don't rationalists win?" thing.

Epistemic

I think it's pretty clear that rationality helps people do a better job of being... less wrong :D

But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.

I'm not really elaborating here because I assume that this is something that we agree on.

However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones, anyway), and this may feed the "why don't rationalists win?" impression. I think a big reason for this lack of progress is that a) we think about really, really difficult things! And b) we beat around the bush a lot. Big topics come up often, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about it. But I think we could do it!". Instead, these conversations tend to be people having fun, procrastinating, and never investing enough time to make real progress.

Altruistic

I also think that rationality is doing a great job of helping people be more effectively altruistic. This is another thing that:

  • I'm going to assume that we mostly agree on, and thus not really elaborate.
  • I think deserves to be noted and given credit.
  • Is useful.

For people with altruistic goals, rationality is helping them achieve those goals, and I think it's doing a really good job at it. But it doesn't quite feel like the gains being made here are so big. I think a major reason for this is that the gains are:

  1. High-level.
  2. Likely to be realized far in the future.
  3. The sort of thing you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).

But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.

Success

I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.

I guess an operational definition of success for this section could be "professional goals, financial goals, personal goals, being awesome...".

I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.

But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox, you won't achieve that much success without these things.

  1. Plain old hard work. I'm a huge believer in working smart, but I also think that given a pretty low and relatively sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask, "Take someone who studies really hard, but is lacking big time when it comes to rationality - wouldn't they fail to be successful?". I think an important (and sad) point to make is that at this point in history, you can be very successful with domain-specific knowledge but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll eventually reach a point where things progress enough that rationality does in fact become necessary (and the people with domain-specific knowledge but no rationality will fail).
  2. Aptitude/starting early. I'm not sure to what extent aptitude is actually a thing. I sense that a big part of it is simply how early you started - when your brain was at that "sponge stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
  3. Opportunity. In one sense, not much will help you if you have to work 3 jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with". So people who are fortunate enough to grow up around other smart and hard-working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very much something you can overcome. In yet another sense, some people are extremely fortunate and are born into a situation where they have a lot of money and connections.
  4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer. But he might not end up being a great web developer. The reason is that he might not have the ambition or confidence to think to pursue certain skills. He may think, "that stuff is for truly smart people; I'm just not one of those people". And he may not have the confidence to pursue the more general and wide-ranging goal of being a great software engineer. He may not have the confidence to learn C and other lower-level material. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think the latter is a lot more common, and it blends into "ambition territory". On that note, this hypothetical person may not think to pursue innovative ideas, or get into UX, or start a startup and do something bigger.
My point in this section is that rationality can help with success, but 1-4 are also extremely important, and probably act as a limiting factor for most of us (I'd guess that most people here are rational enough that 1-4, rather than rationality, act as the barrier to their success, and marginal increases in rationality probably won't have too big a marginal impact).

(I also bet that 1-4 are an incomplete list and that there are important things I'm missing.)

Happiness

I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.

Luke talked about the correlates of happiness in How to Be Happy:

Factors that don't correlate much with happiness include: age, gender, parenthood, intelligence, physical attractiveness, and money (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health, social activity, and religiosity. Factors that correlate strongly with happiness include: genetics, love and relationship satisfaction, and work satisfaction.

One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.

Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:

  • Things like social and romantic relationships are tremendously important factors in one's happiness. So is work satisfaction (in brief: autonomy, mastery and purpose).
  • These are things that you can certainly get without rationality. Non-rationalists have set a somewhat high bar for us to beat.
  • Rationality certainly COULD do wonders in this area.
  • But the art hasn't progressed to this point yet. Doing so would be difficult. People have been trying to figure out the secrets of happiness for 1000s of years, and though I think we've made some progress, we still have a long way to go.
  • Currently, I'm afraid that rationality might be acting as a memetic immune disorder. There's a huge focus on our flaws and how to mitigate them, and this leads to a lot of mental energy being spent thinking about "bad" things. I think (though I can't point to the sources offhand) that a positive/optimistic outlook plays a huge role in happiness: "focusing on the good." Rationality seems to focus a lot on "the bad". It also seems to make people feel unproductive and wrong for not spending enough time focusing on and fixing this "bad", and I fear that this is overblown and leads to unnecessary unhappiness. At the same time, focusing on "the bad" is important: if you want to fix something, you have to spend a lot of time thinking about it. Personally, I struggle with this, and I'm not sure where the equilibrium point really is.

Social

Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.

Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".

Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?

*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with X = 1, and expand, and the parent business could run all of it.

Failure?

So, are we a failure? Is everything moot because "rationalists don't win"?

I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has

A LOT

of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us super humans. I get an impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?


(Sorry for the bits of straw manning throughout the post. I do think that it led to more effective communication at times, but I also don't think it was optimal by any means.)

Comments


What is the observed and self-reported opinion on LW about "rationalists don't win"? Let's poll! Please consider the following statements (use your own definition of 'win'):

I don't win: [pollid:1023]

Rationality (LW-style) doesn't help me win (by my definition of 'win'): [pollid:1024]

Rationality (LW-style) doesn't help people win (by my definition of 'win'): [pollid:1025]

I think rationalists on average don't win more than people on average (by my definition of 'win'): [pollid:1026]

I think the public (as far as they are aware of the concept) thinks that rationalists don't win (by their standards): [pollid:1027]

Clarification: 'Don't win' is intended to mean 'lose' thus a negative effect of rationality. If you think that rationality has no effect or is neutral with regard to winning, then please choose the middle option.

Clarification: 'Don't win' is intended to mean 'lose' thus a negative effect of rationality. If you think that rationality has no effect or is neutral with regard to winning, then please choose the middle option.

I wish more surveys had such clarifications. I can never tell whether "Strongly disagree" with "X should be more than Y" means "I strongly believe X should be less than Y", "I strongly believe X should be about the same as Y", or "I strongly believe it doesn't matter whether X is more or less than Y" (and, as a result, what I should pick if my opinion is one of the latter two).

I'm not sure how effective this is, considering most people who see this will be rationalists, and people like to think well of themselves.

I've never really understood the "rationalists don't win" sentiment. The people I've met who have LW accounts have all seemed much more competent, fun, and agenty than all of my "normal" friends (most of whom are STEM students at a private university).

I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway)

There have been plenty of Gwern-style research posts on LW, especially given that writing research posts of that nature is quite time-consuming.

I went to an LW meetup once or twice. With one exception the people there seemed less competent and fun than my university friends, work colleagues, or extended family, though possibly more competent than my non-university friends.

That was also true for me until I moved to the bay. I suspect it simply doesn't move the needle much, and it's just a question of who it attracts.

I have the opposite experience! Most people at the LW meetups I've been to have tended to be successful programmers, or people with (or working on) things like math PhDs. Generally more socially awkward, but that's not a great proxy for "competence" in this kind of crowd.

Do you think this was caused by their rationality? It seems more likely to me that these people are drawn to rationality because it validates how they already think.

There are plenty of researchers who have never heard of LessWrong, but who manage to produce good work in fields LessWrong respects. So have they...

A) overcome their lack of rationality,

...or...

B) learnt rationality somewhere else?

And, if the answer is B, what is LW adding to rationality? The assumption that rationality will make you good at a range of things that aren't academic or technical?

The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn't prime action; concrete construal does.

Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.

Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it's in conflict with other things you need for a successful career. TV's Dr. House is an extreme example, but most real people are not as good at the technical part of their job as House nor are the quality of their results usually as important.

Both of these things combine to create the next major problem: a disposition to non-co-operative behavior, aka the "why can't our kind get along?" problem.

Yes, not everyone has these issues, diverse community, etc. But, as a stereotypical and somewhat flippant summary, the issue is that simply by the nature of valuing truth -- precise truth, rather than the mere idea of truth -- one is treating it as being more important than other goals. That means it's rather unlikely that a person interested in it will be sufficiently interested in other goals to make progress there. I would expect it more likely that a person who is not naturally inclined towards rationalism would be able to put it to good use, than someone who's just intellectually interested in rationalism as a conversation topic or as an ideal to aspire to.

To put it another way, if you already have "something to protect", such that rationality is a means towards that end, then rationality can be of some value. If you value rationality for its own sake, well, then that is your goal, and so you can perhaps be called "successful" in relation to it, but it's not likely that anyone who doesn't value rationality for its own sake will consider your accomplishments impressive.

So, the truth value of "rationalists don't win" depends on your definition of "win". Is it "win at achieving their own, perhaps less-than-socially-valued goals? Or "win at things that are impressive to non-rationalists"? I think the latter category is far less likely to occur for those whose terminal values are aimed somewhere near rationality or truth for its own sake.

So, the truth value of "rationalists don't win" depends on your definition of "win"

Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.

Epistemic rationality isn’t about winning?

Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and -- to the extent that I understand Less-Wrong jargon --“winning.”

Think about markets: If you have accurate and non-consensus opinions about the values of assets or asset classes, you should be able to acquire great wealth. In that vein, there are plenty of rationalists who apply epistemic rationality to market opinions and do very well for themselves. Think Charlie Munger, Warren Buffett, Bill Gates, Peter Thiel, or Jeff Bezos. Winning!

If you know better than most who will win NBA games, you can make money betting on the games. E.g., Haralabos Voulgaris. Winning!

Know what health trends, diet trends, and exercise trends improve your chances for a longer life? Winning!

If you have an accurate and well-honed understanding of what pleases the crowd at Less Wrong, and you can articulate those points well, you’ll get Karma points and higher status in the community. Winning!

Economic markets, betting markets, health, and certain status-competitions are all contexts where epistemic rationality is potentially valuable.

Occasionally, however, epistemic rationality can be demonstrated in ways that are context-inappropriate – and thus lead to lower status. Not winning!

For example, if you correct someone’s grammar the first time you meet him or her at a cocktail party. Not winning!

Demonstrate that your boss is dead wrong in front of a group of peers in way that embarrasses her? Not winning!

Constantly argue about LW-type topics with people who don’t like to argue? Not winning!

Epistemic rationality is a tool. It gives you power to do things you couldn’t do otherwise. But status-games require a deft understanding of when it is appropriate and when it is not appropriate to demonstrate the greater coherence of one’s beliefs to reality to others (which itself strikes me as a form of epistemic rationality of social awareness). Those who get it right are the winners. Those who do not are the losers.

What you less wrong folks call "rationality" is not what everyone else calls "rationality" - you can't say "I also think that rationality is doing a great job in helping people", that either doesn't make sense or is a tautology, depending on your interpretation. Please stop saying "rationality" and meaning your own in-group thing, it's ridiculously offputting.

Also, my experience has been that CFAR-trained folks do sit down and do hard things, and that people who are only familiar with LW just don't. It has also been my experience that they don't do enough hard things to just "win", in the sense defined here, and that the difference between "winning" and not is actually not easily exploitable with slightly more intelligent macro-scale behavior. The branching points that differentiate between the winning and losing paths are the exploitable points - things like deciding whether or not to go to college, or whether to switch jobs - and they're alright at choosing between those, but so are other people. CFAR-trained folks are typically reasonably better than equivalently intelligent folks who have had the same experience so far, but not dramatically so.

you can't say "I also think that rationality is doing a great job in helping people", that either doesn't make sense or is a tautology,

He may have meant that he thinks rationality is effective for altruists.

Side-stepping the issue of whether rationalists actually "win" or "do not win" in the real world, I think a priori there are some reasons to suspect that people who exhibit a high degree of rationality will not be among the most successful.

For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see you that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor.

One might reply: surely a rational person would understand the value of confidence and fake it as necessary? Answer: yes to the former, no to the latter. Confidence is not so easy to fake; people with genuine beliefs either in their own grandeur or in the greatness of their ideas have a much easier time of it.

Robert Kurzban's book Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind is essentially about this. The book may be thought of as a long-winded answer to the question "Why aren't we all more rational?" Rationality skills seem kinda useful for bands of hunter-gatherers to possess, and yet evolution gave them to us only in part. Kurzban argues, among other things, that those who are able to genuinely believe certain fictions have an easier time persuading others, and are therefore likely to be more successful.

Let's say I wanted to solve my dating issues. I present the following approaches:

  1. I endeavor to solve the general problem of human sexual attraction, plug myself into the parameters to figure out what I'd be most attracted to, determine the probabilities that individuals I'd be attracted to would also be attracted to me, then devise a strategy for finding someone with maximal compatibility.

  2. I take an iterative approach: I devise a model this afternoon, test it this evening, then analyze the results tomorrow morning and make the necessary adjustments.

Which approach is more rational? Given sufficient time, Approach 1 will yield the optimal solution. Approach 2 has to deal with the problem of local maxima and in the long run is likely to end up worse than Approach 1. An immortal living in an eternal universe would probably say that Approach 1 is vastly superior. Humans, on the other hand, will die well before Approach 1 bears fruit.

While rationality can lead to faster improvement using Approach 2, a rationalist might try Approach 1, whereas a non-rationalist is unlikely to use Approach 1 at all.
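The local-maxima worry about Approach 2 can be sketched with a toy hill-climbing loop. Everything here is hypothetical and purely illustrative: `f` stands in for whatever you're optimizing, with a small local peak and a larger global one, and the iterative climber that starts near the small peak settles there and never finds the better one.

```python
def f(x):
    # Toy objective with two peaks: a small local maximum near x = -1
    # (height ~1.3) and the global maximum near x = 2 (height ~3.1).
    return 1.0 / (1 + (x + 1) ** 2) + 3.0 / (1 + (x - 2) ** 2)

def hill_climb(x, step=0.1, iters=1000):
    """Approach 2 in miniature: repeatedly move to whichever neighbor
    improves f, and stop when neither neighbor does."""
    for _ in range(iters):
        best = max([x - step, x + step], key=f)
        if f(best) <= f(x):
            break  # no local improvement left: we're on a peak
        x = best
    return x

stuck = hill_climb(-2.0)   # starts near the small peak, settles on it
found = hill_climb(1.0)    # starts near the big peak, finds it
# f(found) > f(stuck): where you start determines which peak you end on
```

The fix in practice is usually not Approach 1's exhaustive search but cheap restarts: run the same climber from several starting points and keep the best result, which fits the "iterate this evening, adjust tomorrow" framing.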

Simple amendments to the general problem such as "find the best way to get the best date for next Saturday" will likely lead to solutions making heavy use of deception. If you want to exclude the Dark Arts from the solution space, then that's going to limit what you can accomplish. The short-term drawbacks of insisting on truth and honesty are well-documented.

"Local rationalist learns to beat akrasia using this one weird trick!"

Human beings are not very interested in truth in itself. They are mostly interested in it to the extent that it can accomplish other things.

Less Wrongers tend to be more interested in truth in itself, and to rationalize this as "useful" because being wrong about reality should lead you to fail to attain your goals.

But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than Less Wrongers at attaining their goals, because the truth does not get in the way. And their errors do not hinder their goals, for the most part, since they know when they need the truth and look for it on those occasions. Less Wrong's rationalization for how being interested in the truth in itself will help you attain your goals, is simply wrong. Not caring about the truth can get you to your goals anyway, and better than caring about the truth, if you make sure to care about the truth exactly on the occasions when you need it, and only then.

However, it is possible to hold Less Wrong's position without any rationalization, by saying that the truth really is that important in itself. In this case reaching the truth is winning, regardless of what else happens.

My life got worse after I found LessWrong, but I can't really attribute that to a causal relationship. I just don't belong in this world, I think.

I can imagine LW-style rationality being helpful if you're already far enough above baseline in enough areas that you would have been fairly close to winning regardless. (I am now imagining "baseline" as the surface of liquids in Sonic the Hedgehog 1-3. If I start having nightmares including the drowning music, ... I'll... ... have a more colorful way to describe despair to the internet, I guess.)