Explicit and tacit rationality

Like Eliezer, I "do my best thinking into a keyboard." It starts with a burning itch to figure something out. I collect ideas and arguments and evidence and sources. I arrange them, tweak them, criticize them. I explain it all in my own words so I can understand it better. By then it is nearly something that others would want to read, so I clean it up and publish, say, How to Beat Procrastination. I write essays in the original sense of the word: "attempts."

This time, I'm trying to figure out something we might call "tacit rationality" (cf. tacit knowledge).

I tried and failed to write a good post about tacit rationality, so I wrote a bad post instead — one that is basically a patchwork of somewhat-related musings on explicit and tacit rationality. Therefore I'm posting this article to LW Discussion. I hope the ensuing discussion ends up leading somewhere with more clarity and usefulness.

 

Three methods for training rationality

Which of these three options do you think will train rationality (i.e. systematized winning, or "winning-rationality") most effectively?

  1. Spend one year reading and re-reading The Sequences, studying the math and cognitive science of rationality, and discussing rationality online and at Less Wrong meetups.
  2. Attend a CFAR workshop, then spend the next year practicing those skills and other rationality habits every week.
  3. Run a startup or small business for one year.

Option 1 seems to be pretty effective at training people to talk intelligently about rationality (let's call that "talking-rationality"), and it seems to inoculate people against some common philosophical mistakes.

We don't yet have any examples of someone doing Option 2 (the first CFAR workshop was May 2012), but I'd expect Option 2 — if actually executed — to result in more winning-rationality than Option 1, and also a modicum of talking-rationality.

What about Option 3? Unlike Option 2 or especially Option 1, I'd expect it to train almost no ability to talk intelligently about rationality. But I would expect it to result in relatively good winning-rationality, due to its tight feedback loops.

 

Talking-rationality and winning-rationality can come apart

I've come to believe... that the best way to succeed is to discover what you love and then find a way to offer it to others in the form of service, working hard, and also allowing the energy of the universe to lead you.

Oprah Winfrey

Oprah isn't known for being a rational thinker. She is a known peddler of pseudoscience, and she attributes her success (in part) to allowing "the energy of the universe" to lead her.

Yet she must be doing something right. Oprah is a true rags-to-riches story. Born in Mississippi to an unwed teenage housemaid, she was so poor she wore dresses made of potato sacks. She was molested by a cousin, an uncle, and a family friend. She became pregnant at age 14.

But in high school she became an honors student, won oratory contests and a beauty pageant, and was hired by a local radio station to report the news. She became the youngest-ever news anchor at Nashville's WLAC-TV, then hosted several shows in Baltimore, then moved to Chicago and within months her own talk show shot from last place to first place in the ratings there. Shortly afterward her show went national. She also produced and starred in several TV shows, was nominated for an Oscar for her role in a Steven Spielberg movie, launched her own TV cable network and her own magazine (the "most successful startup ever in the [magazine] industry" according to Fortune), and became the world's first female black billionaire.

I'd like to suggest that Oprah's climb probably didn't come merely through inborn talent, hard work, and luck. To get from potato sack dresses to the Forbes billionaire list, Oprah had to make thousands of pretty good decisions. She had to make pretty accurate guesses about the likely consequences of various actions she could take. When she was wrong, she had to correct course fairly quickly. In short, she had to be fairly rational, at least in some domains of her life.

Similarly, I know plenty of business managers and entrepreneurs who have a steady track record of good decisions and wise judgments, and yet they are religious, or they commit basic errors in logic and probability when they talk about non-business subjects.

What's going on here? My guess is that successful entrepreneurs, business managers, and others like them must have pretty good tacit rationality, even if they aren't very proficient with the "rationality" concepts that Less Wrongers tend to discuss on a daily basis. Stated another way, successful businesspeople make fairly rational decisions and judgments, even though they may confabulate rather silly explanations for their success, and even though they don't understand the math or science of rationality well.

LWers can probably outperform Mark Zuckerberg on the CRT (Cognitive Reflection Test) and the Berlin Numeracy Test, but Zuckerberg is laughing at them from atop a huge pile of utility.

 

Explicit and tacit rationality

Patri Friedman, in Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality, reminded us that skill acquisition comes from deliberate practice, and reading LW is a "shiny distraction," not deliberate practice. He said a real rationality practice would look more like... well, what Patri describes is basically CFAR, though CFAR didn't exist at the time.

In response, and again long before CFAR existed, Anna Salamon wrote Goals for which Less Wrong does (and doesn't) help. Summary: Some domains provide rich, cheap feedback, so you don't need much LW-style rationality to become successful in those domains. But many of us have goals in domains that don't offer rapid feedback: e.g. whether to buy cryonics, which 40-year investments are safe, which metaethics to endorse. For this kind of thing you need LW-style rationality. (We could also state this as: "Domains with rapid feedback train tacit rationality with respect to those domains, but for domains without rapid feedback you've got to do the best you can with LW-style 'explicit rationality.'")

The good news is that you should be able to combine explicit and tacit rationality. Explicit rationality can help you realize that you should force tight feedback loops into whichever domains you want to succeed in, so that you can develop good intuitions about how to succeed in those domains. (See also: Lean Startup or Lean Nonprofit methods.)

Explicit rationality could also help you realize that the cognitive biases most-discussed in the literature aren't necessarily the ones you should focus on ameliorating, as Aaron Swartz wrote:

Cognitive biases cause people to make choices that are most obviously irrational, but not most importantly irrational... Since cognitive biases are the primary focus of research into rationality, rationality tests mostly measure how good you are at avoiding them... LW readers tend to be fairly good at avoiding cognitive biases... But there is a whole series of much more important irrationalities that LWers suffer from. (Let's call them "practical biases" as opposed to "cognitive biases," even though both are ultimately practical and cognitive.)

...Rationality, properly understood, is in fact a predictor of success. Perhaps if LWers used success as their metric (as opposed to getting better at avoiding obvious mistakes), they might focus on their most important irrationalities (instead of their most obvious ones), which would lead them to be more rational and more successful.


Final scattered thoughts

  • If someone is consistently winning, and not just because they have tons of wealth or fame, then maybe you should conclude they have pretty good tacit rationality even if their explicit rationality is terrible.
  • The positive effects of tight feedback loops might trump the effects of explicit rationality training.
  • Still, I suspect explicit rationality plus tight feedback loops could lead to the best results of all.
  • I really hope we can develop a real rationality dojo.
  • If you're reading this post, you're probably spending too much time reading Less Wrong, and too little time hacking your motivation system, learning social skills, and learning how to inject tight feedback loops into everything you can.

Comments

  1. I'd particularly expect many people to have good tacit rationality without having good explicit rationality in domains where success is strongly determined by "people skills." This is the kind of thing I expect LWers to be particularly bad at (being neurotypical helps immensely here) and is not the kind of thing that most people can explain how they do (I think it takes place almost entirely in System 1).

  2. When evaluating the relationship between success and rationality it seems worth keeping in mind survivorship bias. For example, a small number of people can be wildly successful in finance through sheer luck due to the large number of people in finance and the randomness of finance. Those people don't necessarily have any rationality, explicit or otherwise, but you're more likely to have heard of them than a random person in finance. But I don't know enough about Oprah to say anything about how much of her being promoted to our collective attention constitutes survivorship bias and how much is genuine evidence of her competence.

  3. One setting where explicit rationality seems instrumentally more useful than tight feedback loops is in determining which tight feedback loops to expose yourself to, e.g. determining whether you should switch from one domain to a very different domain, and if so, which different domain you should switch to. IIRC there are various instances of well-known and respected scientists doing good work in one field and then going on to spout nonsense in another, and this seems like the kind of thing that explicit rationality could help prevent.

Agree that I am spending too much time reading LessWrong, though. I've been quantifying this using RescueTime and the numbers aren't pretty.

When evaluating the relationship between success and rationality it seems worth keeping in mind survivorship bias.

An interesting case is that Will Smith seems likely to be explicitly rational in a way that other people in entertainment don't talk about -- he'll plan and reflect on various movie-related strategies so that he can get progressively better roles and box office receipts.

For instance, before he started acting in movies, he and his agent thought about what top-grossing movies all had in common, and then he focused on getting roles in those kinds of movies.

http://www.time.com/time/magazine/article/0,9171,1689234,00.html

An interesting case is that Will Smith seems likely to be explicitly rational in a way that other people in entertainment don't talk about

In the same vein, I've been impressed by Greene's account of 50 Cent in the book "The 50th Law". If that's really 50's way of thinking, it's brutally rational and impressively strategic.

Do you want to get more specific about what you mean by "tight feedback loops"? I spent a few years focusing on startup things, and I don't think "tight feedback loops" are a good characterization. It can take a lot of work to figure out whether a startup idea is viable. That's why it's so valuable to gather advance data when possible (hence the lean startup movement). If you want "tight feedback loops", it seems like trying to master some flash game would offer a much better opportunity.

As far as I can tell, what actual entrepreneurs have that wannabe entrepreneurs don't is the ability to translate their ideas into action. They're bold enough to punch through unendorsed aversions, they're not afraid to make fools of themselves, they don't procrastinate, they actually try stuff out, and they push on without getting easily discouraged. You could think of these skills as multipliers on rationality: if your ability to act on your ideas is 0, it doesn't matter how good your ideas are, and you should focus on improving your ability to act, not improving your ideas. It might help to start distrusting yourself whenever you say "I'll do X" and think "hm... am I really going to do X? What's the first step? When and how am I going to take that step? Why am I not taking it now? If I'm not going to take it now, will I ever take it?" (Relevant.)

BTW, one possible explanation for why some people are able to make good decisions in practice but not in theory could be the near/far thing Robin Hanson likes to bring up.

Lots of people are successful at many things, but that doesn't mean that for any particular person, like Oprah, there will be generalizable insights about success to be gathered from their life. For example, maybe what caused Oprah to skyrocket to billionaire status (instead of being a regular old driven, fairly successful person) was that she came up with a great gimmick. I'm not sure studying her example would provide much insight into how to be successful for non-talk-show people. But if you think it would, there are lots of biographies of famous, successful people you could mine for success insights.

They're bold enough to punch through unendorsed aversions, they're not afraid to make fools of themselves, they don't procrastinate, they actually try stuff out, and they push on without getting easily discouraged.

For what it's worth, I'm a pretty successful entrepreneur and I'd say this more like:

They manage on the whole to punch through many of their unendorsed aversions (at least the big ones that look like they're getting in the way), they're just as afraid to make fools of themselves as you are but they have ways of making themselves act anyway most of the time, they keep their procrastination under control and manage to spend most of their time working, they actually try stuff out, and they have ways to push through their discouragement when it strikes.

(Your version scans better.)

I'm commenting mostly against a characterisation of this stuff being easy for successful entrepreneurs. If you try something entrepreneurial and find that it's hard, that's not very useful information and it doesn't mean that you're not one of the elect and should give up - it's bloody hard for many successful people, but you can keep working on your own systems until they work (if you try to just keep working I think you'll fail - go meta and work on both what's not working to make it work better and on what is working to get more of it).

Thanks! Yes, I agree that it's possible to get better at most of those things through deliberate effort, which includes system-building, and it's a good point that people shouldn't be dissuaded just 'cause it doesn't seem to come to them naturally.

Here's something I heard about Oprah which is consistent with the wikipedia article but not included in it. People had been talking about wanting more positive talk shows, so Oprah decided to have one. This is a rationality skill because she explored giving people what they said they wanted instead of being offended that they didn't like what she was already doing.

It's possible that her gimmick was the result of some thought about the question of how to do a positive talk show while keeping it interesting.

Seems plausible. "Figure out what people want and give it to them" is a widely repeated success principle for salespeople and entrepreneurs. See Paul Graham on making something people want.

What's the evidence that rationality leads to winning? I don't think that claim has been demonstrated.

This whole post seems very circular to me. You believe that rationality leads to winning, in fact you seem to believe that rationality is a necessary condition for winning, so when you see someone win, you conclude that therefore they are rational. And by extension whatever they are doing is rational. And if what the winners do doesn't look rational to us at first glance, we look deeper and rationalize as necessary until we can claim they are rational.

This reminds me not a little of classical economists who believe that people are rational consumers, and therefore treat anything consumers do, no matter how ridiculous, as expressing hidden preferences. I.e. they believe that rational consumers maximize their utility, so they contort utility functions such that they are maximized by whatever consumers choose, rather than recognizing the fact that consumers are often irrational. For instance, if consumers are willing to pay $50 for a bottle of bourbon they won't pay $10 for, they assert that consumers are buying a status symbol rather than a bottle of bourbon. You're proposing an equivalent sort of hidden rationality.

But rationality is not winning. It has a specific and reasonable definition. Epistemic rationality is the ability to form correct beliefs about the world. Instrumental rationality is the ability to choose and take reasonable actions toward one's goals.

Whether either or both of these characteristics leads one to "win" or accumulate massive piles of utility is an empirical question. If demonstrably irrational people nonetheless manage to win, then we have good evidence that rationality is not a necessary condition for winning. Perhaps it is not even correlated with winning. Worse yet, perhaps it is anti-correlated, and irrational people are more likely to win than rational people. Or perhaps not. Maybe there are just a lot more irrational people in the world than rational ones, so more winners are drawn from this larger pool. Perhaps more rational people have a higher probability of winning. But either way, whether rationality leads to winning or losing, and if so by how much, is an empirical question to be answered by measurement and research, not something to be blithely asserted.

But either way whether rationality leads to winning is an empirical question to be answered by measurement and research, not something to be blithely asserted.

As with anything, but if you unpack "rationality" to "having the right model of the world", the opposite of "rationality leads to winning" is "the world is made of an explicitly anti-rational force", that is, "magic". I would assign a very low probability to that: counter-examples like Oprah seem to raise the probability of "people compartmentalize" more than of "the universe guides her energy".

Also, we should not neglect the base rates. If more than 99% of people on this planet are irrational by LW standards, then we should not be surprised by seeing irrational people among the most successful ones, even if rationality increases the probability of success.

In other words, if you found that (pulling the numbers out of a hat) 99% of all people are irrational, but "only" 90% of millionaires are irrational, that would be evidence that rationality does lead to an (increased probability of) winning.
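To make the hypothetical arithmetic concrete (the 99%/90% figures are made-up numbers from the comment above, not data), here is a quick Python sketch of the likelihood ratio they would imply:

```python
# Hypothetical numbers: 99% of everyone is "irrational",
# but only 90% of millionaires are.
p_irrational = 0.99
p_irrational_given_millionaire = 0.90

p_rational = 1 - p_irrational                                       # 0.01
p_rational_given_millionaire = 1 - p_irrational_given_millionaire   # 0.10

# By Bayes' theorem, P(millionaire | rational) / P(millionaire | irrational)
#   = [P(rational | millionaire) / P(rational)]
#     / [P(irrational | millionaire) / P(irrational)]
lift = (p_rational_given_millionaire / p_rational) / \
       (p_irrational_given_millionaire / p_irrational)

print(round(lift, 1))  # 11.0
```

Under these made-up numbers, a rational person would be about 11 times as likely to be a millionaire as an irrational one — even though 90% of millionaires are still irrational.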

Also, in real humans, rationality isn't all-or-nothing. Compare Oprah with an average person from her reference group (before she became famous). Is she really less rational? I doubt it.

Also, in real humans, rationality isn't all-or-nothing. Compare Oprah with an average person from her reference group (before she became famous). Is she really less rational? I doubt it.

That seems entirely possible. Consider the old chestnut that entrepreneurs are systematically overoptimistic about their chances of success and that startups and similar risks are negative expected value. Rational people may well avoid such risks precisely because they do not pay, but of the group of people irrational enough to try, a few will become billionaires. Voila! Another example: any smart rational kid will look at career odds and payoffs to things like being a musician or a talk show host, and go 'screw that! I'm going to become a doctor or economist!', and so when we look at mega-millionaire musicians like Michael Jackson or billionaire talk show hosts like Oprah... (We are ignoring all the less rational kids who wanted to become an NFL quarterback or a rap star and wind up working at McDonald's.)

Another point I've made in the past is that since marginal utility seems to diminish with wealth, you have to seriously question the rationality of anyone who does not diversify out of whatever made them wealthy, and instead goes double or nothing. Did Mark Zuckerberg really make the rational choice to hold onto Facebook ownership percentages as much as possible even when he was receiving offers of hundreds of millions? Yes, he's now a billionaire because he held on and its worth increased some orders of magnitude, but social networks have often died - as he ought to know, having crushed more than his fair share of social networks under his heel! In retrospect, we know that no one (like Google+) has killed Facebook the way Facebook killed Myspace. But only in retrospect.

Or since using these past examples may not be convincing to people since it's too easy to think "obviously holding onto Facebook was rational, gwern, don't you remember how inevitable it looked back in 2006?" (no, I don't, but I'm not sure how I could convince you otherwise), let's use a more current example... Bitcoin.

At least one LWer currently holds something like >500 bitcoins, which at the current MtGox price could be sold for ~$120,000. His net worth independent of his bitcoins is in the $1-10,000 range as best as I can estimate. I am sure you are seeing where I am going with this: if bitcoin craters, he will lose something like 90% of his current net worth, but if bitcoin gains another order of magnitude, he could become a millionaire.

So here's my question for you, if you think that it's obvious that Oprah must have been rational, and was not merely an irrational risk-seeker who among other things got lucky: right now, without the benefit of hindsight or knowledge of inevitability of Bitcoin's incredibly-obvious-success/obviously-doomed-to-failure, is it rational for him to sell or to keep his bitcoins? Is he more like Zuckerberg, who by holding makes billions; or more like all the failed startup founders who reject lucrative buyouts and wind up with nothing?

It is rational for him to:

[pollid:428]

Suppose he holds, and Bitcoin craters down to the single dollar range or less for an extended time period; do you think people will regard his decision as:

[pollid:429]

Suppose he holds, and Bitcoin gains another order of magnitude (>$1000) for an extended time period; do you think people will regard his decision as:

[pollid:430]

Suppose he sells, and Bitcoin craters down to the single dollar range or less for an extended time period; do you think people will regard his decision as:

[pollid:431]

Suppose he sells, and Bitcoin gains another order of magnitude (>$1000) for an extended time period; do you think people will regard his decision as:

[pollid:432]

Do I think people will regard his decision, or would I regard his decision? Are these people general population, or LW? How much do they know about his reasoning process?

I intended general people, and I don't think they would much care. If you want more detailed scenarios and hypotheticals, feel free to reply to my comment with your preferred poll questions.

It is rational for him to:

Expanding on RomeoStevens' comment... Maths time! Suppose that he has now 10,000 dollars and 500 bitcoins, each bitcoin now costs $100, and that by the end of the year a bitcoin will cost $10 with probability 1/3, $100 with probability 1/3, and $1000 with probability 1/3. Suppose also that his utility function is the logarithm of his net worth in dollars by the end of the year. How many bitcoins should he sell to maximize his expected utility? Hint: the answer isn't close to 0 or to 500. And I don't think that a more realistic model would change it by that much.

Khoth suggests modeling it as starting with an endowment of $60k and considering the sum of the 3 equally probable outcomes plus or minus the difference between the original price and the closing price, in which case the optimal number of coins to hold seems to be 300:

last $ sort $ map (\x -> (log(60000 - 90*x) + log(60000) + log(60000 + 900*x), x)) [0..500]
(34.11321061509552,300.0)
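For readers who don't read Haskell, the same grid search over Khoth's model can be sketched in Python (the $60k endowment, the three equally probable year-end prices, and log utility are all taken from the comments above):

```python
from math import log

# Khoth's model: start from a $60k endowment and hold x coins bought at $100.
# Year-end price is $10, $100, or $1000 per coin, each with probability 1/3,
# so wealth moves by -$90, $0, or +$900 per coin held. Utility is the sum of
# log-wealth across the three equally probable outcomes.
def utility(x):
    return log(60000 - 90 * x) + log(60000) + log(60000 + 900 * x)

best = max(range(501), key=utility)
print(best, round(utility(best), 4))  # 300 34.1132
```

This reproduces the Haskell result: holding 300 of the 500 coins (i.e. selling 200) maximizes expected log utility, and the continuous optimum is exactly 300, since setting the derivative -90/(60000-90x) + 900/(60000+900x) to zero gives x = 300.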

Of course, your specific payoffs and probabilities imply that one should be buying bitcoins since in 1/3 of the outcomes the price is unchanged, in 1/3 one loses 90% of the invested money, and in the remaining 1/3, one instead gains 1000% of the invested money...

if you think that it's obvious that Oprah must have been rational

I wrote she is probably more rational than an average person from her reference group (before she became famous); by which I meant: a poor black woman pregnant at age 14. Being overoptimistic does not contradict that.

Being overoptimistic does not contradict that.

No, but it does put pressure on your claim. You have to be very optimistic or very risk-seeking to ride your risky career all the way up past instant-retirement/fuck-you money levels (a few millions) to the billions point, and not sell out at every point before then to enjoy your gains. What fraction of the general population ever founds a startup or new company or takes an equivalent risk? Her career pushes Oprah way out onto the tail.

Now, maybe the average black pregnant teenager is so irrational in so many ways that their average problems make Oprah on net more rational even though she's lunatically optimistic or risk-seeking (although here we should question how irrational having a kid is, given issues like welfare and local cultures and issues discussed in Promises I Can Keep and marriage gambits and that sort of thing), but it's going to be much harder to establish that about an Oprah-with-lunatic-risk-appetite rather than what we started with, the Oprah-who-is-otherwise-looking-pretty-darn-rational.

Is retiring relatively young a more rational choice than continuing to work at something you like?

It seems like pretty remarkable luck if the thing you want to do most in the world is also what you're currently being paid to do.

On the other hand, how good are people who retire at finding what they want most to do?

A person who's more rational than average (especially about introspection) might do well to retire, but most people might be rationally concerned that they'd just drift.

I don't know what population-wide aggregates might look like. At least in Silicon Valley, there apparently are many people who have retired early and have the ability and inclination to express any dissatisfaction online in places where I might read them, but I can't think of any who have said things like "My life has been miserable since I cashed out my millions of dollars of Google shares and I have nothing to do with myself."

Retiring early means you have the money for doing a great many things, and you are still in physical & mental shape to enjoy it; Twain:

“The whole scheme of things is turned wrong end to. Life should begin with age & its privileges and accumulations, & end with youth & its capacity to splendidly enjoy such advantages. As things are now, when in youth a dollar would bring a hundred pleasures, you can’t have it. When you are old, you get it & there is nothing worth buying with it then. It’s an epitome of life. The first half of it consists of the capacity to enjoy without the chance; the last half consists of the chance without the capacity.”

And what factors enabled this early retirement in the first place? A motivated intelligent person (albeit with a bad appetite for risk and an inability to cash out) can find plenty of rewarding things to occupy themselves with, like charity or education. Steve Wozniak and Cliff Stoll immediately come to mind, but I'm sure you can name others.

So, I said he'd be considered rational in all cases except hold/fail. That's because people will take his success as evidence that he knows what he's doing, and if he sells then he's doing what 'everyone else' (i.e. > 99.9% of the world) would do, so even if it doesn't work out that way they'd probably give him some slack.

Also, I think it's rational for him to diversify, but it's not a bad idea for him to maintain significant holdings.

Why is buying and selling binary? He should clearly rebalance.

Basic statistics question: if we find that 99% of all people are irrational, but "only" 90% of millionaires are irrational, is that evidence that rationality leads to (an increased probability of) winning, or is it only evidence that rationality is correlated with winning? For instance, how do I know that millionaires aren't more rational simply because they can afford to go to CFAR workshops and have more free time to read Less Wrong?

I.e. knowing only that 99% of all people are A but "only" 90% of millionaires are A, how do I adjust my respective probabilities that

  1. A --> millionaires
  2. Millionaires --> A
  3. Unknown factor C causes both A and millionaires

It feels like I ought to assign some additional likelihood to each of these 3 cases, but I'm not sure how to split it up. Maybe the answer is simply, "gather more evidence to attempt to tease out the proper causal relationship".

This is a causal question, not a statistical question. You answer by implementing the relevant intervention, usually by randomization, or maybe you find a natural experiment, or maybe [lots of other ways people thought of].

You can't in general use observational data (e.g. what you call "evidence") to figure out causal relationships. You need causal assumptions somewhere.

It feels like I ought to assign some additional likelihood to each of these 3 cases, but I'm not sure how to split it up.

Two things:

1) Your prior probabilities. If before getting your evidence you expect that hypothesis H1 is twice as likely as H2, and the new evidence is equally likely under both H1 and H2, you should update so that the new H1 remains twice as likely as H2.

2) Conditional probabilities of the evidence under different hypotheses. Let's suppose that hypothesis H1 predicts a specific evidence E with probability 10%, hypothesis H2 predicts E with probability 30%. After seeing E, the ratio between H1 and H2 should be multiplied by 1:3.

The first part means simply: Before the (fictional) research about rationality among millionaires was made, which probability would you assign to your hypotheses?

The second part means: If we know that 99% of all people are irrational, what would be your expectation about % of irrational millionaires, if you assume that e.g. the first hypothesis "rationality causes millionaires" is true. Would you expect to see 95% or 90% or 80% or 50% or 10% or 1% of irrational millionaires? Make your probability distribution. Now do the same thing for each one of the remaining hypotheses. -- Ta-da, the research is over and we know that the % of irrational millionaires is 90%, not more, not less. How good were the individual hypotheses at predicting this specific outcome?

(I don't mean to imply that doing either of these estimates is easy. It is just the way it should be done.)
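The two ingredients above (prior probabilities and conditional probabilities of the evidence) combine by simple multiplication when expressed as odds; a minimal sketch using the example numbers from the comment:

```python
# Prior: H1 is judged twice as likely as H2 (odds of 2:1).
prior_odds = 2.0

# Likelihoods from the example above: H1 predicts the observed evidence E
# with probability 10%, H2 predicts it with probability 30%.
p_e_given_h1 = 0.10
p_e_given_h2 = 0.30

# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
posterior_odds = prior_odds * (p_e_given_h1 / p_e_given_h2)

print(posterior_odds)  # 2 * (1/3): H1 drops to ~0.67 times as likely as H2
```

So even though H1 started out twice as likely, evidence that H2 predicts three times better flips the balance — exactly the "ratio multiplied by 1:3" step described above.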

Maybe the answer is simply, "gather more evidence

Gathering more evidence is always good (ignoring the costs of gathering the evidence), but sometimes we need to make an estimate based on data we already have.

It does not follow that the opposite of "rationality leads to winning" is "the world is made of an explicitly anti-rational force". Were I to discover that rationality does not lead to winning, or worse yet that irrationality leads to winning, I would find it much more likely that incorrect beliefs enable people to take actions that lead them to winning than that the world is made of an explicitly irrational force. For example, if people believe they are average or below, they may be less aggressive and settle for less than if, by virtue of Dunning-Kruger, they believe they are exceptional and try for more. I don't know that this is true -- I might not assign it even a 50% probability of being true -- but it's not self-evidently false. The question of whether rationality leads to winning, or which parts of rationality lead to winning, is an empirical question, not a logical question.

Another example: it depends on where you set the bar for winning. For example, suppose we set the bar for winning at a billion dollars. Rational people, acting to maximize their own utility, may well choose to plug away in a Fortune 500 company for 40 years, put away a nice chunk of change in a 401(k), invest in index funds, spend no more than 30% of take-home pay on housing, and retire at 60 with a few million dollars in the bank. But by this standard they haven't "won," even though they maximized their expected utility at low risk. Irrational people may sink their savings into a startup, and just might hit it big and "win". Then again, they may lose it all and die alone in a hole. But the winners will still be composed of irrational people. E.g., playing the lottery is almost as irrational as you can get, and every single lottery winner is irrational.*
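The lottery aside can be made concrete with a quick expected-value sketch. All numbers below are hypothetical illustrations, not actual lottery figures:

```python
TICKET_PRICE = 2.0
JACKPOT_ODDS = 1 / 292_000_000  # hypothetical 1-in-292-million jackpot odds

def ticket_ev(jackpot, haircut=0.5):
    # Naive expected value of one ticket: ignore the smaller prizes and
    # apply a rough 50% haircut for taxes and jackpot splitting.
    return jackpot * haircut * JACKPOT_ODDS - TICKET_PRICE

print(ticket_ev(100_000_000))    # ordinary jackpot: clearly negative EV
print(ticket_ev(1_500_000_000))  # record-sized jackpot: EV can turn positive
```

Under these toy assumptions the ticket is a money-loser at typical jackpot sizes, but a sufficiently large rollover can push the naive EV positive, which is the loophole the footnoted investment group presumably exploited.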

Perhaps the mistake here is in looking at winners individually. Dare I say it, could this be selection bias? What we need to do to figure out whether rationality helps us win is not look at the winners and ask what they did to help them win. Rather, we need to get a large sample of rational and irrational people, sum the winnings and losses of each group, and see who comes out ahead per capita. Perhaps the mean rational person finishes life a few million utilons ahead, and the average irrational person finishes life a few million utilons behind, but the people who are a few billion utilons ahead are all irrational. I don't know the answer to this question, but I'd be really curious to find out.

  • Actually, there are sometimes rational reasons to play the lottery, especially the big Powerball types. It's uncommon, but I do know of one case where a rational investment group played the lottery and won. I know of no rational reasons for playing the smaller daily Pick 3/Pick 4 lotteries.
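The per-capita comparison described above can be sketched as a toy simulation. All numbers are hypothetical: rational people are modeled as having a modest positive mean outcome with low variance, irrational people a negative mean with huge variance:

```python
import random

random.seed(0)  # reproducible draws

def lifetime_outcomes(n, mean, sd):
    # Lifetime "utilon" totals, crudely modeled as normal draws.
    return [random.gauss(mean, sd) for _ in range(n)]

rational = lifetime_outcomes(10_000, mean=2.0, sd=1.0)     # millions of utilons
irrational = lifetime_outcomes(10_000, mean=-2.0, sd=50.0)

# Per capita, the rational group comes out ahead...
print(sum(rational) / len(rational), sum(irrational) / len(irrational))

# ...yet the single biggest winner sits in the high-variance irrational group.
print(max(rational), max(irrational))
```

With these parameters the rational group wins on average while every headline-grabbing outlier is irrational, which is exactly the selection-bias trap of only interviewing winners.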

I'd love to hear a story (or maybe stories) of someone becoming an LW regular, improving their "rationality skills" and going on to "win" (define and achieve their personal goals), thanks to those skills.

Here I explicitly exclude LW-related goals, such as understanding the sequences, attending a CFAR workshop or being hired by CFAR, signing up for cryonics or figuring out how to donate more to GiveWell. Instead, I'd love to hear how people applied what they learned on this site to start a business, make money, improve their love life (hooking up with an LW poly does not count for this purpose), or maybe to take over the world.

Hopefully some of the stories have already been posted here, so links would be appreciated.

When I found LW, I was confused and nonambitious; my goal was to survive on as little money as possible (to ironically humiliate the people who say $17/hr is the minimum living wage), and maybe make a few video games or something, and I spent most of my free time on 4chan and arguing about radical politics on the internet.

Since coming to LW, I've used LW-style epistemic rationality to resolve a great many philosophical confusions and understand a great deal more about those big questions. (This doesn't count, but it should be mentioned.)

More specifically and interestingly: it took explicit LW rationality for me to:

  • Think rationally about balancing my resources (time, money) and marginal utility, to great productivity benefit.

  • Step up to run the Vancouver LW meetup.

  • Make and maintain a few really valuable friends (mostly through the meetup).

  • Respond positively to criticism at work, so that I've become much more valuable than I was 6 months ago, in a way that has been recognized and pointed out.

  • Achieve lightness and other rationality virtues in exploring design concepts at work, taking design criticism, and not getting caught in dead ends. I explicitly apply much of what I've learned at LW to my work, though I'm unsure how much of that is just how I verbally describe things I'd do anyway.

  • Become poly with my wife rationally and in a controlled manner.

  • Switch my goals from unambitious to working on the biggest problems I can find, like taking over the world. (this is actually hard).

  • Start using Beeminder, get a smartphone, use the Pomodoro Technique, and use Remember the Milk, for large, measurable (just look at my Beeminder graphs) improvements in personal project productivity, sex life, exercise, etc.

  • Actually put in the hard work and strategic criticism-seeking that it took to actually get a really good job.

  • Take more rational risks and make better small decisions every day.

  • Actually ask for and get a cute girl's number today (yay).

Of course, my ambition has scaled way faster than my achievement, so despite the semi-impressive list above, I feel like I'm way behind where I should be.

The Motivation Hacker is one such story, though it focuses on the relevance of the stuff in my procrastination post rather than on the relevance of the rest of LW.

If Rationality is Winning, or perhaps more explicitly Making The Decisions That Best Accomplish Whatever Your Goals Happen to Be, then Rationality is so large that it swallows everything. Like anything else, spergy LW-style rationality is a small part of this, but it seems to me that anything which one can meaningfully discuss is going to be one such small portion. One could of course discuss Winning In General at a sufficiently high level of abstraction, but then you'd be discussing spergy LW stuff by definition - decision theory, utility, and so on.

If businessfolk are rather rational at running businesses, but no more rational than anyone else about religion, or if people who have become experts on spergy LW stuff are no more winningful about their relationships, &c. &c., this (to my mind) calls into question the degree to which a general Rationality-as-Winningness skill exists. You acknowledge the distinction between explicit and tacit rationality, but do you expect successful entrepreneurs to be relatively more successful in their marital life? When you say you want to teach tacit rationality, do you mean something distinct from Teaching People How To Do Things Good?

I never did find out if any sizable fraction of Less Wrongers would bite this bullet. That is to say, to affirm the claim that, all else equal, a person with more physical strength is necessarily more rational.

I don't see a bullet. Obviously, other things matter as well as rationality. Rationality, even instrumental rationality, is not defined as winning. Those who speak as if it was are simply wrong, and your example is the obvious refutation of such silliness.

I think your thought experiment illustrates that the "Rationality is Winning" meme often doesn't carve the space very well. Here, rationality is using the right tactics or, if only one is available, spending the right amount of time on the right tasks in proportion to how much you value your goals and how achievable they are.

If we resurrect Alice and Bob as hypothetical monovalue agents who exclusively value deadlifting X, and who have only one method of attempting/training to deadlift X, then the game tree is skewed: Bob wins faster, while Alice is screwed and wins slower. Both are fully rational if they spend all available resources on this goal (since it's all these hypothetical agents value), even though Alice spends more resources for longer before achieving the goal.

For more game theory mumbo-jumbo: I view "rationality" more in terms of how you build and navigate the game tree, rather than a post-hoc analysis of who ended up in the best cell of the payoff matrices. Or, to put it differently, rationality is ending up at the best cell of your payoff matrix, regardless of whether someone else just has +5 on all cells of their matrix or has more options or whatever.

So if my understanding is correct that you were making a critique of the "Rationality is Winning" meme, I agree that it's a bit misleading and simplistic, but rationality still is "taking the best course of action with the resources and options available to you, reflectively and recursively including how much you spend figuring out which courses of action are better" -- Expected Winning Within Available Possible Futures.

This is a really good point and it is also related to Manfred's comment that I don't personally know how to reconcile with some of the points in the article. On one hand, I would like to have a lot of money because a lot of inconvenient things would suddenly become much easier. On the other hand, I would have to do other inconvenient things, like manage a lot of money. Also, I don't think I would be happy doing Oprah's job, even if it resulted in a lot of money. Basically, I would not mind lots of money but it is not currently a priority. So I don't know if I'm actually winning or not, oops.

Therefore, a poll!

How successful are you? [pollid:426]

From a fame, money or bragging rights perspective, how ambitious are your current goals?[pollid:427]

Explicit rationality can help you realize that you should force tight feedback loops into whichever domains you want to succeed in, so that you can develop good intuitions about how to succeed in those domains.

A realization:

PUA, or at least what seems to me to be the core concept of some schools of PUA, makes a ton of sense when viewed in this light. Trying to pick up a stranger in a bar is probably the tightest feedback loop possible for social skills, and like the OP says, social skills are massively important for success and happiness. Therefore, going out every Friday night and brazenly flirting with as many attractive people as possible seems like an incredibly good way to rapidly improve your life and chances of success. I was always baffled and repelled by the "self-actualization teachings disguised as advice on how to get laid" nature of some PUA, but now it seems really really desirable.

Which of these three options do you think will train rationality (i.e. systematized winning, or "winning-rationality") most effectively?

One of these things is not like the others. I can read the Sequences for free, and I can attend a workshop relatively cheaply, but the time and money investments into a startup are quite significant. Most people cannot afford them. Eating cake is not an option for them; they can barely afford bread.

You say that, "I know plenty of business managers and entrepreneurs who have a steady track record of good decisions and wise judgments, and yet they are religious, or they commit basic errors in logic and probability when they talk about non-business subjects."

You must know different business managers and entrepreneurs than I do. I can think of few if any business managers and entrepreneurs who have a steady track record of good decisions and wise judgments. There are some common positive characteristics I see in the business managers I know, and another group of common characteristics I see in the entrepreneurs I know (nor do the two groups share the same set of common characteristics, I might add) but in neither group are good decisions and wise judgments part of those common characteristics.

I do see a lot of hindsight bias and survivorship bias in both groups, though. Out of a large pool of managers and entrepreneurs, the successful ones inevitably attribute their success to personal characteristics and skill, but it's not at all obvious they aren't just the lucky ones who happened to stumble into a profitable opportunity. One frequent characteristic of successful entrepreneurs is that they have tried many things, and usually failed at more of them than they've succeeded at. If they were both rational and able to apply rationality to their plans, you'd expect them to succeed a lot more often.

In short, she had to be fairly rational, at least in some domains of her life.

Not a comment about Oprah. Repeat, not a comment about Oprah. Once more: not a comment about Oprah.

But a comment about the idea that rationality leads to success. Deception and violence also lead to success. These problem solvers are systematized winning: IF (application of fraud) THEN (goal met) ELSE (blame others). "Violence isn't the only answer, but it is the final answer." - Jack Donovan. Violence and deception are social skills. When talking-rationality and winning-rationality come apart, these two are means for winning-rationality.

"When you are not practicing, remember, someone somewhere is practicing, and when you meet him he will win." - Ed Macauley. Practice the means to not be taken in by deception or undone by violence. Of course neither I nor anyone anywhere would advocate deceiving others or being violent, ever, under any circumstances, no matter how rational and life-saving and winning.

Deception and violence also lead to success. These problem solvers are systemized winning.

Martial-Art-Of-Rationality-Wise, this reminds me of people in epistemically vicious arts who say that western boxers couldn't beat them "on the street," because they could just gouge their eyes, bite them, and kick them in the cojones. It turns out that, if a strategy is available to everyone, it gets exploited until it's no longer an overwhelming advantage.

Whether that's because everyone's increased their use of violence and deception, or because they've coordinated to lower the marginal effectiveness of an additional unit of violence, is immaterial. Either way, violence and deception aren't a $20 bill lying on the ground, waiting for someone to pick it up. That wouldn't be a Nash equilibrium.

One simplification I think you're making that raises some problems is money. Why Oprah? Why not Charles Wuorinen, who makes excellent musical decisions and has many learned skills? Who has access to tight feedback loops as soon as someone else listens to what he writes? Who is really good at what he does? Because Oprah's skills are better for collecting slips of green paper.

Now, one can collect quite a few slips of green paper and still, say, suffer from depression, or just generally be unhappy. Perhaps we could even claim that Wuorinen is happier than Oprah - I wouldn't know, but it doesn't sound outlandish. But you, you are someone trying to save the world, and you have excellent uses for slips of green paper. So maybe you were just focused on the skills related to slips of green paper because of what you (or other world-savers) could do with those skills.

And so I propose a definition of tacit rationality that takes this into account: the skills that would be highly valuable to you.

Another interesting example of the utility of tight feedback loops, this time as applied to education, is extreme apprenticeship. I've been taking one math class built around the XA method, and it has felt considerably more useful and rewarding than ordinary math classes.

Among other things, XA employs bidirectional feedback loops - student-to-teacher and teacher-to-student. Students are given a lot of exercises to do from day one, but the exercises are broken into small chunks so that the students can get a constant sense of making progress, and so that they can clearly articulate the thing that they didn't understand in case they run into trouble. While students can just do the exercises by themselves if they wish, there are also scheduled exercise sessions during which constant help is available. When the exercises do get done, they are checked by the teaching staff and the students are requested to redo them with corrections in case there are major flaws.

Because the exercises are also returned each week, the teaching staff gets constant feedback on the things that the students are having difficulties with, and the content of the lectures can be modified on the fly. In general, lectures are kept to a minimum, and tend to build on content that the students already learned from the exercises rather than introduce entirely new material.

Some reports of the method: 1, 2, more.

Hypothesis for what tacit rationality might be: glomming onto accurate premises about what actions are likely to achieve one's goals without having a conscious process for how one chooses premises.