This is a special post for quick takes by zlrth.

This is my shortform feed (h/t Hazard and Raemon, thanks!). This is for thoughts that are short or half-baked or both.


I sometimes hear rationalist-or-adjacent people say, "I don't have any sacred-cow-type beliefs." This is the perspective of this commenter who says, "lesswrong doesn't scandalize easily." Agreed: rationalists-and-adjacents entertain a wide variety of propositions.

The conventional definition of sacred-cow-belief is: A falsifiable belief about the world you wouldn't want falsified, given the chance. For example: If a theist had the opportunity to open a box to see whether God existed, and refused, and wouldn't let anyone else open the box, that belief is a sacred cow.

A more interesting (to me) definition of sacred cow is: a belief that causes you to not notice mistakes you make. The advantage of this definition is that it's easier to catch rationalists and non-rationalists doing it. Rationalists are much better than average at evaluating well-formed propositions, so they won't be baited by a passionately ignorant person talking about homeopathy.

But there are pre-propositional beliefs[1]. Here are some examples:

Talking about it makes it better.
I'm the man of the house, and the man of the house should be strong.
I'm unworthy.
I can tolerate anything but the outgroup.
Worrying solves nothing.
People enjoy my presence.
People need structure.

You walk into a conversation and start talking and miss what your interlocutor was talking about because you believe things like these.

An example rant: I am damn-near unwilling to falsify my belief that "people need structure." When they ask me for help, what am I supposed to say? Anything I suggest is additional structure. Were I to suggest taking a break, that's still a change, and change is structure. Maybe change itself is the problem, but that's beside the point; my interlocutor is asking for advice. That means change, and change entails a change in structure. You might as well make it good structure.

What I just said is intended to show that sacred cows are terse beliefs that constrain decisions in social interaction, and offer easy post hoc rationalization. Say I find out my belief is false. Now, I've been making mistakes all my life. I blundered into conversations adding structure. Without that sacred cow, I don't know what to say.

[1] It's probably a spectrum. All those examples can be made falsifiable. They all have in common vagueness, which begets unfalsifiability.

Note: This is a shorter version of a comment I wrote here, written as a comment on an episode of The Bayesian Conspiracy, a podcast I like.

A more interesting (to me) definition of sacred cow is: a belief that causes you to not notice mistakes you make. The advantage of this definition is that it’s easier to catch rationalists and non-rationalists doing it.

Basically this:

“The conventional definition of [thing widely agreed-upon to be bad] is ‘[normal definition]’. But a more interesting definition is ‘[completely different definition, of which it’s not at all clear that it’s bad]’. The advantage of this definition is that it’s easier to catch people doing it.”

Examples:

“The conventional definition of ‘stealing’ is ‘taking something, without permission, which doesn’t belong to you’. But a more interesting definition is ‘owning something which is unethical to own’. The advantage of this definition is that it’s easier to catch people doing it.”

“The conventional definition of ‘fraud’ is ‘knowingly deceiving people, for profit’. But a more interesting definition is ‘making money by doing something which runs counter to people’s expectations’. The advantage of this definition is that it’s easier to catch people doing it.”

“The conventional definition of ‘adultery’ is ‘having sexual relations with a person other than the one with whom you have a monogamous marriage’. But a more interesting definition is ‘doing something which causes your spouse to experience jealousy’. The advantage of this definition is that it’s easier to catch people doing it.”

Yeah, no kidding it’s easier to catch people doing it—because it’s a completely different thing! Why would you call it by the same term (“sacred cow”, “theft”, “fraud”, “adultery”)—unless you wanted to sneak in negative affect, without first doing the work of demonstrating (or, indeed, even explicitly claiming) that the thing described by your new definition is, in fact, bad?

To be honest, I like all of your new definitions better than the conventional ones.

They are more aligned with the actual wrongness of an act.

I confess to some perplexity, as I specifically constructed them to be less aligned with the actual wrongness of an act!

First: Yes, I agree that my thing is a different thing, different enough to warrant a new name. And I am sneaking in negative affect.

Yeah, no kidding it’s easier to catch people doing it—because it’s a completely different thing!

Indeed, I am implicitly arguing that we should be focused on faults-we-actually-have[0], not faults-it's-easy-to-see-we-don't. My example of this is the above-linked podcast, where the hosts hem and haw and, after thinking about it, decide they have no sacred cows, and declare that Good (full disclosure: I like the podcast).

"Sacred-cow" as "well-formed proposition about the world you'd choose to be ignorant of" is clearly bad to LWers, so much so that it's non-tribal.

[0] And especially, faults-we-have-in-common-with-non-rationalists! I said, "The advantage of this definition is that it’s easier to catch rationalists and non-rationalists doing it." Said Achmiz gave examples using the word "people," but I intended to group rationalists with non-rationalists.

I'm with Said Achmiz on not liking your phrasing/word repurposing, but I do like and agree with "Sacred Cows? That's too easy, what's next to hunt for?"

Ex. I've noticed recently that "Don't worry about/keep score of the little things" made it hard for me to have a strong and clear picture of where the pain points in my life are. Now I'm trying to keep the "Don't harmfully ruminate on negatives" and simultaneously record, every day, a list of "things that happened which I don't like or that I want to change".

Some time ago I stopped telling people I'd be somewhere at ish-o'clock. 4PM-ish for example. I really appreciate when people tell me they'll be somewhere at an exact time, and they're there.

I've heard that people are more on-time for a meeting that starts at 4:05 than one at 4:00, and I've used that tactic (though I'd pick the less-obviously-sneaky 4:15).

I also like picking sneaky times! I used to be a big fan of starting things at 3:17, or 4:41.

I think you might have meant this as a reply to Said Achmiz as opposed to a new comment. I think you can edit that.

Oh! Thanks.