This is the first in a series of posts about improving group dynamics within the rationality community. (The previous "checking for interest post" is here).
The Berkeley Hufflepuff Unconference is on April 28th. RSVPing on this Facebook Event is helpful, as is filling out this form.
Project Hufflepuff: Planting the Flag
“Clever kids in Ravenclaw, evil kids in Slytherin, wannabe heroes in Gryffindor, and everyone who does the actual work in Hufflepuff.”
- Harry Potter and the Methods of Rationality, Chapter 9
“It is a common misconception that the best rationalists are Sorted into Ravenclaw, leaving none for other Houses. This is not so; being Sorted into Ravenclaw indicates that your strongest virtue is curiosity, wondering and desiring to know the true answer. And this is not the only virtue a rationalist needs. Sometimes you have to work hard on a problem, and stick to it for a while. Sometimes you need a clever plan for finding out. And sometimes what you need more than anything else to see an answer, is the courage to face it…”
- Harry Potter and the Methods of Rationality, Chapter 45
I’m a Ravenclaw and Slytherin by nature. I like being clever. I like pursuing ambitious goals. But over the past few years, I’ve been cultivating the skills and attitudes of Hufflepuff, by choice.
I think those skills are woefully under-appreciated in the Rationality Community. The problem cuts across many dimensions:
- Many people in rationality communities feel lonely (even in the geographically tight Berkeley cluster). People want more (and deeper) connections than they currently have.
- There are lots of small pain points in the community (in person and online) that could be addressed fairly easily, but which people don’t dedicate the time to fix.
- People are rewarded more for starting individual projects than for helping existing ones succeed, which results in projects typically depending on a small number of people working unsustainably (e.g. a single person running a meetup who feels that if they left, the meetup would crumble apart).
- Newcomers often find the culture impenetrable and unwelcoming.
- There isn’t enough “real-time operational competence” - the ability to notice problems in the physical world and solve them.
- Even at events like EA Global, where enormous effort is put into operations and logistics, we scramble to pull things together at the last minute in a way that is very draining.
- Many people communicate in a way that feels disdainful and dismissive (to many people), which makes both social cohesion and intellectual understanding harder.
- We have a strong culture of “make sure your own needs are met,” which specifically pushes back against broader societal norms that pressure people to conform. This is good, but I think we’ve pushed too far in the opposite direction. People often make choices that are valuable to them in the immediate term, but which have negative externalities on the people around them.

In a nutshell, the emotional vibe of the community is preventing people from feeling happy and connected, and a swath of skillsets that are essential for group intelligence and ambition to flourish is undersupplied.
If any one of these things were a problem, we might troubleshoot it in an isolated way. But collectively they seem to add up to a cultural problem, one I can’t think of any way to express other than “Hufflepuff skills are insufficiently understood and respected.”
There are two things I mean by “insufficiently respected”:
- Ravenclaw and Slytherin skills come more naturally to many people in the community, and it doesn’t even occur to them that emotional and operational skills are something they should cultivate. Those skills feel like a separate magisterium that specialists should handle. They’re also quick to look at social niceties and traditions that seem silly, make a cursory attempt to understand them, and then do away with them without fully understanding their purpose.
- People who might join the community, but who value emotional and operational skills more highly, feel that the community is not for them, or that they have to work harder to be appreciated.
And while this is difficult to explain, it feels to me that there is a central way of being, that encompasses emotional/operational intelligence and deeply integrates it with rationality, that we are missing as a community.
This is the first in a series of posts, attempting to plant a flag down and say “Let’s work together to try and resolve these problems, and if possible, find that central way-of-being.”
I’m decidedly not saying “this is the New Way that rationality Should Be”. The flag is not planted at the summit of a mountain we’re definitively heading towards. It’s planted on a beach where we’re building ships, preparing to embark on some social experiments. We may not all be traveling on the same boat, or in the exact same direction. But the flag is gesturing in a direction that can only be reached by multiple people working together.
A First Step: The Hufflepuff Unconference, and Parallel Projects
I’ll be visiting Berkeley during April, and while I’m there, I’d like to kickstart things with a Hufflepuff Unconference. We’ll be sharing ideas, talking about potential concerns, and brainstorming next actions. (I’d like to avoid settling on a long-term trajectory for the project - I think that’d be premature. But I’d also like to start building some momentum towards some kind of action.)
My hope is to have both attendees who are positively inclined towards the concept of “A Hufflepuff Way”, and people for whom it feels a bit alien. For this to succeed as a long-term cultural project, it needs to have buy-in from many corners of the rationality community. If people have nagging concerns that feel hard to articulate, I’d like to try to tease them out, and address them directly rather than ignoring them.
At the same time, I don’t want to get bogged down in endless debates, or focus so much on criticism that we can’t actually move forward. I don’t expect total consensus, so my goal for the unconference is to get multiple projects and social experiments running in parallel.
Some of those projects might be high-barrier-to-entry, for people who want to hold themselves to a particular standard. Others might be explicitly open to all, with radical inclusiveness part of their approach. Others might be weird experiments nobody had imagined yet.
In a few months, there’ll be a followup event to check in on how those projects are going, evaluate, and see what more things we can try or further refine.
If you’re interested in attending the Hufflepuff Unconference on the 28th, please RSVP on this Facebook Event, and fill out this form.
Thanks to Duncan Sabien, Lauren Horne, Ben Hoffman and Davis Kingsley for comments
One of the skills to talk about would be the skill of actively proselytizing and getting people into rationality. I don't mean onboarding people who are already interested, I mean actually going up to people who you wish were rationalists and trying to make them into rationalists.
Successful communities do this, although the specifics vary widely. EA does it, which I think is why EA is growing while LW isn't. We've been largely coasting on Eliezer's wave.
This is difficult because LW rationality arose in the tech culture of California, i.e. an unusually individualistic culture within an unusually individualistic part of the most individualistic country ever. Only in California could one be called a "cult" for seeking a consensus philosophy. Any active proselytizing would definitely encounter the "cult" charge again.
But proselytizing works. It keeps a movement young - we're already noticeably older on average than we were ten years ago, and we're starting to look like a cohort of tech nerds who were at their impressionable college age when Eliezer wrote the Sequences. And it keeps a movement dynamic - if new people are coming in all the time, you don't have to suffer the ossification that it takes to retain people as they get older. LW rationality has no less need of this than other movements.
And there are definitely people who are much better at it than others, so a systematic study of what works is eminently doable. I think this fits squarely into Project Hufflepuff.
Personally, I think cohorts happen automatically, and LW is "yet another cohort." If we want to be part of a movement with inter-generational significance, then maybe we should pause to consider why we think we should be "the first generation" in a movement that lasts forever...
In this vein, I appreciate previous places and groups like:
If I was going to name the entire thing, I think I might call it "Outsider Science" (taking a cue from "Outsider Art" and contrasting it with "Vannevarian Science").
So if you wanted to be so Hufflepuff that you sacrificed the whole group on the altar of being social (rather than just sacrificing yourself for the group) I'd argue that it would be a natural progression to work on reconnecting, resuscitating, archiving, and generally paying attention to these older places and communities, and putting yourself in service to their long term goals.
The hard thing here is that the diagnostic criteria, looking backwards, seem to be having a certain mindset towards physical reality while being a kind of cultural orphan at the same time. The standoffishness and the founding of tiny little institutes are part of what this movement seems to do by default?
Thus, projecting forward, you would predict that new instances of "the outsider science movement" would form on their own, start their own thing, and reject the notion of intellectual parentage, as much as we (the hypothetical intellectual parents) try to bring them into the loose confederation of previous attempts at self organized scientific research aiming at eternal intellectual challenges.
A lot of the future people you'd be trying to bring into the fold might very well prefer to struggle on alone.
Arguably, Vannevarian Science (with government-credentialed universities doing government-funded research) is already doing what you would evolve into anyway, and has succeeded so far and so thoroughly that its "highest mid-level hierarchs" have become members of the deep government of the world? So maybe the right thing to do is just let all the various orphans struggle on by themselves, and just go try to get a job at the NSF while retaining fond feelings for the strugglers?
So my guess is that Bacon's Effecting Of All Things Possible has run for a long while now, and maybe "the orphans" who might have belonged to the high church version (but somehow never connected with the central culture) were never really noticed until the internet came along and then could start to find each other and form little clumps and clusters.
So maybe the most Hufflepuff thing possible would be to somehow encourage a larger internet culture that finds and welcomes these autonomous orphan clusters, while also extending an olive branch to the high church "Heirs of Bacon" who exist in the deep government, and see if there is some way to establish a state of communion between the main tree and all the little saplings :-)