Open thread, Apr. 17 - Apr. 23, 2017

This is the (late) weekly open thread. See the tag. You'd think we could automate this. The traditional boilerplate follows.


If it's worth saying, but not worth its own post, then it goes here.


Notes for future OT posters:

1. Please add the 'open_thread' tag.

2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)

3. Open Threads should start on Monday and end on Sunday.

4. Unflag the two options "Notify me of new top level comments on this article" and "

Comments

Hey guys, I'm fairly new to the rationality community (only at page 350 of the book), but I think I might have experienced a belief in belief in belief. I'm trying not to spend too much time online and this story is a bit embarrassing, but I remember that Eliezer wondered about it so I figured I might as well share.

I have a pretty bad relationship with my father, and I don't think very highly of him. But one thing I notice is that whenever he does something that hurts me, or that I consider selfish, I'm always scandalized. I tried to figure out why I keep reacting that way, because if you asked me to predict my father's behavior, I'd probably come up with something pretty negative. So even if a part of me still hopes for a better relationship, it makes no sense for me to be surprised by his behavior.

Then I thought: what if I keep that surprise and anger because the thought of not being surprised by it, of being so indifferent to my own father, is monstrous to me? Thinking that I might not be sad at his funeral (not that it's close or anything like that) actually scares me. I don't know how I could live with myself if I truly, one hundred percent, gave up on my father.

So, it's not that I believe he's a good father, and it's not that I believe I should believe he's a good father; it's that I believe I should believe I should believe he's a good father.

To explain: First level of belief – I expect my father to be a good parent. Second level of belief – believing that my father is a good parent has some benefit, so I'll "believe" it to get the benefit, or the placebo effect of the benefit. Third level of belief – TRYING to believe that my father is a good parent has the benefit that I don't have to think of myself as cold-hearted. It's making the effort that counts, not the result, so it never needed to go as far as changing what I think about my father, or changing what I think I should think. It's not that I think I should think he's a good parent; it's that I think I should try to think that.

Or maybe I just haven't truly accepted that that's the way he is. Can you accurately predict a situation and still not accept it? I usually think about the world in terms of "believing in your heart" and "believing in your mind", but shouldn't a complete understanding in your mind also change your heart?

Congratulations; what you wrote here makes a lot of sense! It probably happens very frequently that people cling to a belief because of what having that belief means about them: "Am I a good person or a bad person for believing X?"

A word of warning though: we cannot simply reverse this stupidity, because it works both ways. For example, both "I believe in X, because I am a good person" and "I don't believe in X, because I am a sophisticated person" are ultimately about your self-image. In the end, the only thing relevant to forming correct beliefs about X is, well, the evidence about X. Not what believing it says about us.

Also, words like "bad" are probably too general. Your father can be doing a good thing A, and a bad thing B (and a morally neutral thing C) -- these facts are not mutually exclusive. It might make more sense to be more specific about the ways he disappoints you, and the ways he doesn't.

Everything I wanted to say, you said better and first, so have some karma (the original post too).

It's almost like we read the same Sequences. :D

OMG, it's a brainwashing cult, run for your lives!!!

TL;DR: What are some movements you would put in the same reference class as the Rationality movement? Did they also spend significant effort trying not to be wrong?

Context: I've been thinking about SSC's "Yes, We Have Noticed the Skulls". It points out that aspiring Rationalists are well aware of the flaws in straw Vulcans and actively try to avoid making the same mistakes. More generally, most movements are well aware of the criticisms of at least the last similar movement, since those are the criticisms they are constantly defending against.

However, searching "previous " in the comments doesn't turn up any actual examples.

Full question: I'd like to know if anyone has suggestions for how to go about reference class forecasting, to get an outside view on whether the Rationality movement has any better chance of succeeding at its goals than other, similar movements. (Will EA have a massive impact? Are we crackpots about Cryonics, or actually ahead of the curve? More generally, how much weight should I give to the Inside View, when the Outside View suggests we're all wrong?)

The best approach I see is to look at past movements. I'm only really aware of Logical Positivism, and maybe Aristotle's Lyceum, and I have a vague idea that something similar probably happened in the Enlightenment, but I don't know the names of any smaller schools of thought that were active in the broader movement. Only the most influential movements are remembered, though, so are there good examples from the past century or so?

And, how self-critical were these groups? Every group has disagreements over the path forward, but were they also critical of their own foundations? Did they only discuss criticisms made by others, and make only shallow, knee-jerk criticisms, or did they actively seek out deep flaws? When intellectual winds shifted, and their ideas became less popular, was it because of criticisms that came from within the group, or from the outside? How advanced and well-tested were the methodologies used? Were any methodologies better-tested than Prediction Markets, or better grounded than Bayes' theorem?

Motive: I think that on average I use about a 50/50 mix of outside and inside view, although I vary this a lot based on the specific thing at hand. However, if the Logical Positivists not only noticed the previous skull, but the entire skull pile, and put a lot of effort into escaping the skull-pile paradigm, then I'd be much less certain that this time we've finally escaped it.
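As an aside, the "50/50 mix" of inside and outside view can be made concrete as a linear opinion pool, i.e. a weighted average of the two probability estimates. This is just an illustrative sketch; the function name and all the numbers below are made up for the example, not taken from anywhere:

```python
def pool(p_inside, p_outside, w_inside=0.5):
    """Linear opinion pool: weighted average of an inside-view and an
    outside-view probability estimate for the same event."""
    return w_inside * p_inside + (1 - w_inside) * p_outside

# Hypothetical numbers: the inside view says an 80% chance the movement
# succeeds; the reference class (outside view) suggests 10%.
p_even = pool(0.80, 0.10)                  # 50/50 mix, roughly 0.45
p_skeptical = pool(0.80, 0.10, w_inside=0.2)  # leaning on the outside view
```

Shifting `w_inside` down is one way to express "give more weight to the Outside View" as a single number rather than a vague leaning.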

Just a few groups that have either aimed at similar goals, or have been culturally influential in ways that keep showing up in these parts —

  • The Ethical Culture movement (Felix Adler).
  • Pragmatism / pragmaticism in philosophy (William James, Charles Sanders Peirce).
  • General Semantics (Alfred Korzybski).
  • The Discordian Movement (Kerry Thornley, Robert Anton Wilson).
  • The skeptic/debunker movement within science popularization (Carl Sagan, Martin Gardner, James Randi).

General Semantics is possibly the closest to the stated LW (and CFAR) goal of improving human rationality, since it aimed at improving human thought through the adoption of explicit techniques for increasing awareness of cognitive processes such as abstraction. "The map is not the territory" is a General Semantics catchphrase.

The SSC article about omega-6 surplus causing criminality brought to my attention the physiological side of mental health, and of health in general. Until now, I prioritized mind over body. I've been ignoring the whole "eat well" thing because 1) it's hard, 2) I didn't know how important it was, and 3) there's a LOT of bullshit literature. But since I want to live a long life and I don't want my stomach screwing with my head, the reasonable thing to do would be to read up. I need book recommendations (or any other format, really) on nutrition 101: something practical, the dos and don'ts of food, with research citations to back them up. On a broader note, I want to learn more about biodeterminism, also from a practical perspective. There might be conditions in my environment causing me issues that I don't even know about; it goes beyond nutrition.

You Can’t Trust What You Read About Nutrition

http://fivethirtyeight.com/features/you-cant-trust-what-you-read-about-nutrition/

"Some populations today thrive on very few vegetables, while others subsist almost entirely on plant foods. The takeaway, Archer said, is that our bodies are adaptable and pretty good at telling us what we need, if we can learn to listen."

I put together a PDF of my notebook scrawls from EAG 2016 and the February 2017 CFAR workshop!

If anyone wants to check it out, here's the link on Google Drive.

There are uncountably many sentences of countably infinite length. The sentence at hand is countably long and thus cannot contain them all. It cannot even contain two countably infinite strings that don't have a common infinite suffix.
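The uncountability claim is Cantor's diagonal argument. As an illustrative sketch (my own, not from the comment), here it is in Python: model each infinite 0/1 sequence as a function from index to bit, and model a countable list of them as a function from n to the n-th sequence. The diagonal sequence flips the n-th bit of the n-th sequence, so it differs from every sequence in the list, which is why no countably long enumeration can contain all countably infinite strings:

```python
def diagonal(seqs):
    """seqs: function n -> (function i -> bit), an enumeration of
    infinite 0/1 sequences. Returns a sequence that differs from
    seqs(n) at index n, for every n, so it is not in the enumeration."""
    return lambda i: 1 - seqs(i)(i)

# An arbitrary example enumeration: sequence n alternates blocks of
# 0s and 1s with block length n + 1.
seqs = lambda n: (lambda i: (i // (n + 1)) % 2)

d = diagonal(seqs)

# The diagonal disagrees with the n-th sequence at position n; we can
# only check finitely many n, but the argument holds for all of them.
for n in range(1000):
    assert d(n) != seqs(n)(n)
```

Of course the code can only probe finite prefixes; the point is that the disagreement at position n is guaranteed by construction for every n, which is exactly the diagonal argument.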