LessWrongers as a group are often accused of talking about rationality without putting it into practice (for an extended discussion of this, see Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality). This behavior is particularly insidious because it is self-reinforcing: it attracts more armchair rationalists to LessWrong, who in turn reinforce the trend in an affective death spiral, until LessWrong becomes a community of utilitarian apologists akin to the internet communities of anorexics who congratulate each other on their weight loss. It will be a community where, instead of discussing practical ways to "overcome bias" (the original intent of the sequences), we discuss arcane decision theories, who gets to be in our CEV, and the most rational birthday presents (sound familiar?).
A recent attempt to counter this trend, or at least make us feel better about it, was a series of discussions on "leveling up": accomplishing a set of practical, well-defined goals to increment your rationalist "level". It's hard to see how these goals fit into any long-term plan to achieve anything besides self-improvement for its own sake. Indeed, the article begins by priming us with a Renaissance-man-inspired quote, and it stands in stark contrast to articles emphasizing practical altruism, such as "Efficient Charity".
So what's the solution? I don't know. However, I can tell you a few things about the solution, whatever it may be:
- It won't feel like the right thing to do; your moral intuitions (being designed to operate in a small community of hunter-gatherers) are unlikely to suggest anything near the optimal task.
- It will be something you can start working on right now, immediately.
- It will disregard arbitrary self-limitations like abstaining from politics or keeping yourself aligned with a community of family and friends.
- Speaking about it would undermine your reputation through signaling. A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.
Whatever you may decide to do, be sure it follows these principles. If none of your plans align with these guidelines, then construct a new one, on the spot, immediately. Just do something: every moment you sit idle, hundreds of thousands are dying and billions are suffering. Under your judgment, your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.
I declare Crocker's rules on the writing style of this post.
It is probably simply structural that the LessWrong community tends to be about armchair philosophy, science, and math. If there are people who have read through Less Wrong, absorbed its worldview, and gone out to "just do something", then they probably aren't spending their time bragging about it here. If it looks like no one here is doing any useful work, that could really just be sampling bias.
Even so, I expect that most posters here are more interested in reading, learning, and chatting than in thoroughly changing who they are and what they do. Reading, learning, and chatting are fun! Thorough self-modification is scary.
Thorough and rapid self-modification, on the basis of things you've read on a website rather than things you've seen tested and proven in combination, is downright dangerous. Try things, but try them gradually.
And now, refutation!
To, um, what, exactly? I think the question whose solution you're describing is "What ought one do?" As for the properties you claim the solution will have, taking them in turn:
That depends largely on your moral intuitions. I honestly think of all humans as people. I am always taken aback a little when I see evidence that lots of other folks don't. You'd think I'd stop being surprised, but it often catches me when I'm not expecting it. I'd suggest that my intuitions about my morals when I'm planning things are actually pretty good.
That said, the salient intuitions in an emotionally-charged situation certainly are bad at planning and optimization. And so, if you imagine yourself executing your plan, I would honestly expect it to feel oddly amoral. It won't feel wrong, necessarily, but it might not feel relevant to morality at all.
This is ... sort of true, depending on what you mean. You might need to learn more, to be able to form a more efficient or more coherent plan. You might need to sleep right now. But, yes, you can prepare to prepare to prepare to change the world right away.
Staying aligned with a community of family and friends is not an arbitrary limitation. Humans are social beings. I myself am strongly introverted, but I also know that my overall mood is affected strongly by my emotional security in my social status. I can reflect on this fact, and I can mitigate its negative consequences, but it would be madness to just ignore it. In my case - and, I presume, in the case of anyone else who worries about being aligned with their family and friends - it's terrifying to imagine undermining many of those relationships.
You need people that you can trust for deep, personal conversations; and you need people who would support you if your life went suddenly wrong. You may not need these things as insurance, you may not need to use friends and family in this way, but you certainly need them for your own psychological well-being. Being depressed makes one significantly less effective at achieving one's goals, and we monkeys are depressed without close ties to other monkeys.
On the other hand, harmless-seeming deviations probably won't undermine those relationships; they're far less likely to ruin relationships than they seem. Rather, they make you a more interesting person to talk to. Still, it is a terrible idea to carelessly antagonize your closest people.
No! If we're defining a "true rationalist" as some mythical entity, then probably so. If we want to make "true rationalists" out of humans, no! If you completely disregard common social graces like the outward appearance of humility, you will have real trouble coordinating world-changing efforts. If you disregard empathy for people you're talking to, you will seem rather more like a monster than a trustworthy leader. And if you ever think you're unaffected by the absurdity heuristic, you're almost certainly wrong.
People are not perfect agents, optimizing their goals. People are made out of meat. We can change what we do, reflect on what we think, and learn better how to use the brains we've got. But the vast majority of what goes on in your head is not, not, not under your control.
Which brings me to the really horrifying undercurrent of your post, which is why I stayed up an extra hour to write this comment. I mean, you can sit down and make plans for what you'll learn, what you'll do, and how you'll save billions of lives, and that's pretty awesome. I heartily approve! You can even figure out what you need to learn to decide the best courses of action, set plans to learn that, and get started immediately. Great!
But if you do all this without considering seemingly unimportant details, like having fun with friends and occasionally relaxing, then you will fail. Not only will you fail, but you will fail spectacularly. You will overstress yourself, burn out, and probably ruin your motivation to change the world. Don't go be a "rationalist" martyr; it won't work very well.
So, if you're going to decompartmentalize your global aspirations and your local life, then keep in mind that only you are likely to look out for your own well-being. That well-being has a strong effect on how effective you can be. So much so that attempting more than about 4 hours per day of real, closely-focused mental effort will probably give you not just diminishing returns, but lower total output per day. That said, almost nobody puts in 4 hours a day of intense focus.
So, yes, billions are miserable, people die needlessly, and the world is mad. I am still going out tomorrow night and playing board games with friends, and I do not feel guilty about this.