Scope insensitivity — When you don't care enough to use mouthwash.
Availability heuristic — People in open relationships are hotter.
Endowment effect — People with large attributes tend to think size matters more.
Hyperbolic discounting — Black Friday, for instance.
Observer-expectancy effect — Being able to see that someone is pregnant.
Fundamental attribution error — Blaming everything on the Religious Right.
Halo effect — Blaming everything on the Covenant.
Primacy effect — It's easy to remember apes and monkeys.
Availability cascade — When your whole social circle becomes polyamorous.
Curse of knowledge — What you get for reading the Necronomicon.
Denomination effect — Catholics spend more money than Methodists.
Restraint bias — Favoritism shown towards bondage practitioners.
Illusion of transparency — Ignoring the gunk on your windshield.
System justification — When your computer lines up your text for you.
Peak-end rule — The king who stands on the mountaintop will fall.
Reminiscence bump — A feature of phrenology.
Rosy retrospection — Remember how much fun you had with Rose?
Bounded rationality — The logic of kangaroos.
You Must Try, And Then You Must Ask. Note the definition of "try" here is 15 minutes, not the locally-canonical 5. (I think 15 is about right for the context - programming and sysadmin stuff.)
I was recently discussing cryonics with my girlfriend, who is highly uncomfortable with the notion. We identified what may (tentatively) be part of the underlying objection by people, especially loved ones, to cryonics. Essentially, it comes down to a lack of closure. When someone dies, you can usually mourn and move on. But if there's a genuine chance of resurrection, that ability to move on largely goes away.
If this is the case, one might ask why the same thing doesn't happen with religions that believe in an afterlife. That could be because they believe that everyone will be resurrected. But it may also be that, at some level, people often don't believe there will necessarily be an afterlife, or if they do, their version of it is highly abstracted. If so, cryonics may be being hurt by its own plausibility.
That's... that's terrible: that it would feel worse to have a chance of resurrection than to have closure. It sounds depressingly plausible that this is people's true rejection, but I hope it's not.
Religion doesn't have the same problem, and in my experience it's because of the certainty. People believe themselves to be absolutely certain in their belief in the afterlife. So there's no closure problem, because they simply know that they'll see the person again. If you could convince people that cryonics would definitely result in them being resurrected together with their loved ones, then I'd expect this particular problem to go away.
Read the comments and weep-- it's almost as though there are a lot of people who resent specificity.
I wouldn't have been surprised if there were people who said "but randomizing means that half the people aren't getting the obviously valuable help!", but nobody said that.
Person A uses "common sense" to suggest a solution X, which seems obvious to them.
Research shows that in reality Y is much better than X, and sometimes X is not even helpful.
Person B says: See? This is why you should always collect and evaluate data!
Person C says: No, the real lesson is that we should use common sense, which obviously recommends Y! Why didn't anybody use their common sense?
An FBI hostage negotiator buys a car (the anecdote starts at 8:30).
I've been worried that I'm not very good at negotiating about money. Recently, I got evidence that updated me in that direction. As part of a course, we paired up and did a negotiation roleplay exercise. I was one of two massive outliers out of thirty who agreed to an outcome much, much worse than the rest of the group's.
The exercise was structured so that there was quite a lot of space between the two negotiators' bottom lines. I was clear of my bottom line: I got everything I had to get. But almost all of the money that was available for negotiation went to the other person. This seems very familiar from other, real-life contexts I've been in.
I don't like the idea that I'm losing money that I could have just by negotiating better. What can I do to get better?
I've read lots and lots of books and articles, and been on lots of courses. I could write a very convincing guide to negotiation skills. I think that I would find doing more of this very interesting, but it wouldn't make me any better at it in practice.
I've explored more training courses, but haven't found any that offer more than a handful of role plays, and none that promise the sort of feedback that would make deliberate practice work. Do such courses exist?
My other idea was to practice with friends using games that involve a lot of negotiation. But I'm good at those games already. The skill doesn't seem to transfer over to real-world contexts. Are there games that are very close to real world negotiations?
What have I not thought of?
There are plenty of real-world situations where you can negotiate prices. If you buy groceries at a chain such as Walmart, you can't negotiate prices, but if you buy them at a farmers' market you can. Even buying a hot dog from a street vendor, you can negotiate the price.
Are there games that are very close to real world negotiations?
A friend of mine learned negotiating by trading World of Warcraft items. I think he bought WoW gold for 50€ and used that as the capital basis for trading items.
If you want some form of interaction with your friends, you could bet with them frequently. Betting means that you can negotiate the odds.
At NYU a month or so back, I saw a flier for paid participation in an experiment which assesses your negotiation skills and, vitally, offers feedback. I don't know if you're in the right area, but if you are, it might be worth looking into whether it's still running (I don't know for a fact that it is, but I do know that other advertised studies have run for significantly longer than that).
I've been thinking about scope insensitivity, and wonder whether it can be mistaken for decreasing marginal value. Suppose a slice of pizza costs $1, and you're willing to buy it for that much. That doesn't mean that you'd be willing to buy a million slices of pizza for a million dollars - in fact, if you're only hungry enough for one slice and don't have a refrigerator handy, you may not want to pay more than $1 for any amount of pizza greater than or equal to one slice. The same can apply to donating to charity to save lives. You may value saving one life for $200, but maybe you're not willing to pay $400 to save two lives.
You may value saving one life for $200, but maybe you're not willing to pay $400 to save two lives
This is what scope insensitivity is. The original paper calls it "purchase of moral satisfaction" -- the revealed preference in these experiments is for an internal state of moral satisfaction as opposed to the actual lives you're saving. Like hunger, the internal state is quickly satiated and so exhibits diminishing returns, but actual lives do not exhibit diminishing returns (in the relevant range, for humans, on reflection).
That would be an effective demonstration of scope insensitivity in an ideal scenario where money has a flat conversion to utility in that range for the individual in question. If $200 is in the subject's budget, but $400 is not, this may be entirely rational behavior. A donation which puts you into debt will have a much more dramatic effect on your own utility than one which leaves you solvent.
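A minimal numeric sketch of this point (all figures hypothetical), assuming log utility of wealth and a fixed number of utils per life saved:

```python
import math

def wtp(wealth, lives, utils_per_life):
    """Max donation p solving: log(wealth) - log(wealth - p) = lives * utils_per_life.

    Rearranging gives p = wealth * (1 - exp(-lives * utils_per_life)).
    """
    return wealth * (1 - math.exp(-lives * utils_per_life))

wealth = 10_000            # hypothetical disposable wealth
p1 = wtp(wealth, 1, 0.02)  # willingness to pay to save one life (~$198)
p2 = wtp(wealth, 2, 0.02)  # willingness to pay to save two lives (~$392)

# Even though the agent values lives linearly in utils,
# the concave utility of money makes p2 < 2 * p1.
```

So observing that someone will pay $200 for one life but not $400 for two doesn't, by itself, distinguish scope insensitivity from diminishing marginal utility of money; and as the sketch suggests, the rational shortfall is tiny when the stakes are small relative to wealth, whereas observed scope insensitivity is typically much larger.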
Someone on Reddit attempted to apply cognitive biases, based on Eliezer Yudkowsky's "Cognitive Biases Potentially Affecting Judgment of Global Risks", to Bitcoin: http://www.reddit.com/r/Bitcoin/comments/1qua30/cognitive_biases_that_make_people_skeptical/
I don't think I agree with all of it (is 'black swan' really a cognitive bias? and I don't think bystander apathy applies to investments), but still interesting.
A long while ago there was a discussion about helping people write posts, and I seem to remember it ended with a mailing list to which anyone who wanted feedback or help with polishing their draft could send it. At least that's how I remember it, but I can't find that discussion now. Whatever happened to that idea, is it still current?
I created a subreddit "Rozum" for rationalists who would like to have LW-style rationality discussions in the Slovak (or Czech) language. Anyone interested, just send me a private message with your username.
Note: I am unfamiliar with the Reddit system; I just assume there are enough similarities with the LW system, especially the upvotes and downvotes. And I am completely unfamiliar with the moderator's options, so there is a chance I have set something wrong, in which case I will try to fix it later.
At the moment, the subreddit settings are "anyone can read, only approved members can contribute". (I hope this also means that only approved members can vote.) It is meant to be a temporary measure, to protect the fresh new community from the possibility of being overrun and outvoted by a small group of trolls or insane people.
In the evening I will write some introduction there, I just wanted to announce it now here, to see how many people are interested.
So I've been reading Worm ( parahumans.wordpress.com ), and there's this tiny thing that's been growing ever-more annoying, and I can't hold off asking about it any longer.
I keep seeing passages like this: "Realizing the position he had me in, feeling the pressure of his thighs against my hips, his weight resting partially on my lower body, I must’ve blown a synapse. My thought process ground to a halt. It didn’t help that the first place my mind went was interpreting his ‘start’ as being this position leading to something else."
Do people actually think like this? Seems like it would be really inconvenient.
I pretty rarely wrestle with people (or otherwise have such close physical contact), so if someone I have a crush on is literally sitting on top of me pinning me on my back, I have trouble staying calm. (I'm thinking of a particular time this happened to me. I felt about like Taylor does in your Worm quote.)