
Project Hufflepuff: Planting the Flag

This is the first in a series of posts about improving group dynamics within the rationality community. (The previous "checking for interest" post is here.)

 

The Berkeley Hufflepuff Unconference is on April 28th. RSVPing on this Facebook Event is helpful, as is filling out this form.



“Clever kids in Ravenclaw, evil kids in Slytherin, wannabe heroes in Gryffindor, and everyone who does the actual work in Hufflepuff.”

- Harry Potter and the Methods of Rationality, Chapter 9


“It is a common misconception that the best rationalists are Sorted into Ravenclaw, leaving none for other Houses. This is not so; being Sorted into Ravenclaw indicates that your strongest virtue is curiosity, wondering and desiring to know the true answer. And this is not the only virtue a rationalist needs. Sometimes you have to work hard on a problem, and stick to it for a while. Sometimes you need a clever plan for finding out. And sometimes what you need more than anything else to see an answer, is the courage to face it…”

- Harry Potter and the Methods of Rationality, Chapter 45

 

I’m a Ravenclaw and Slytherin by nature. I like being clever. I like pursuing ambitious goals. But over the past few years, I’ve been cultivating the skills and attitudes of Hufflepuff, by choice.


I think those skills are woefully under-appreciated in the Rationality Community. The problem cuts across many dimensions:


  • Many people in rationality communities feel lonely (even the geographically tight Berkeley cluster). People want more (and deeper) connections than they currently have.

  • There are lots of small pain points in the community (in person and online) that could be addressed fairly easily, but which people don’t dedicate the time to fix.

  • People are rewarded more for starting individual projects than for helping existing ones succeed, which results in projects typically depending on a small number of people working unsustainably (e.g. a single person running a meetup who feels that if they left, the meetup would crumble apart).

  • Newcomers often find the culture impenetrable and unwelcoming.

  • Not enough “real-time operational competence” - the ability to notice problems in the physical world and solve them.

  • Even at events like EA Global where enormous effort is put into operations and logistics, we scramble to pull things together at the last minute in a way that is very draining.

  • Many people communicate in a way that feels disdainful and dismissive (to many people), which makes both social cohesion and intellectual understanding harder.

  • We have a strong culture of “make sure your own needs are met”, which specifically pushes back against broader societal norms that pressure people to conform. This is good, but I think we’ve pushed too far in the opposite direction. People often make choices that are valuable to them in the immediate term, but which have negative externalities on the people around them.

In a nutshell, the emotional vibe of the community is preventing people from feeling happy and connected, and a swath of skillsets that are essential for group intelligence and ambition to flourish are undersupplied.


If any one of these things were a problem on its own, we might troubleshoot it in an isolated way. But collectively they seem to add up to a cultural problem that I can’t think of any way to express other than “Hufflepuff skills are insufficiently understood and respected.”

There are two things I mean by “insufficiently respected”:

  • Ravenclaw and Slytherin skills come more naturally to many people in the community, and it doesn’t even occur to them that emotional and operational skills are something they should cultivate. Those skills feel like a separate magisterium, best left to specialists. People are also quick to look at social niceties and traditions that seem silly, make a cursory attempt to understand them, and then do away with them without fully understanding their purpose.

  • People who value emotional and operational skills more highly, and who might otherwise join the community, feel that it is not for them, or that they have to work harder to be appreciated.

And while this is difficult to explain, it feels to me that there is a central way of being that encompasses emotional/operational intelligence and deeply integrates it with rationality, which we are missing as a community.


This is the first in a series of posts, attempting to plant a flag down and say “Let’s work together to try and resolve these problems, and if possible, find that central way-of-being.”


I’m decidedly not saying “this is the New Way that rationality Should Be”. The flag is not planted at the summit of a mountain we’re definitively heading towards. It’s planted on a beach where we’re building ships, preparing to embark on some social experiments. We may not all be traveling on the same boat, or in the exact same direction. But the flag is gesturing in a direction that can only be reached by multiple people working together.

A First Step: The Hufflepuff Unconference, and Parallel Projects


I’ll be visiting Berkeley during April, and while I’m there, I’d like to kickstart things with a Hufflepuff Unconference. We’ll be sharing ideas, talking about potential concerns, and brainstorming next actions. (I’d like to avoid settling on a long-term trajectory for the project - I think that’d be premature. But I’d also like to start building some momentum towards some kind of action.)

My hope is to have both attendees who are positively inclined towards the concept of “A Hufflepuff Way”, and people for whom it feels a bit alien. For this to succeed as a long-term cultural project, it needs to have buy-in from many corners of the rationality community. If people have nagging concerns that feel hard to articulate, I’d like to try to tease them out, and address them directly rather than ignoring them.

At the same time, I don’t want to get bogged down in endless debates, or focus so much on criticism that we can’t actually move forward. I don’t expect total consensus, so my goal for the unconference is to get multiple projects and social experiments running in parallel.


Some of those projects might be high-barrier-to-entry, for people who want to hold themselves to a particular standard. Others might be explicitly open to all, with radical inclusiveness part of their approach. Others might be weird experiments nobody had imagined yet.

In a few months, there’ll be a followup event to check in on how those projects are going, evaluate, and see what more things we can try or further refine.

If you’re interested in attending the Hufflepuff Unconference on the 28th, please RSVP on this Facebook Event, and fill out this form.

 

Thanks to Duncan Sabien, Lauren Horne, Ben Hoffman and Davis Kingsley for comments


Comments

Fellow Hufflepuff / startupper / business getting-stuff-done-er / CFAR / Bay-arean here. Can we talk about the elephant in the room?

  • Geeks, MOPs, and sociopaths in subculture evolution <- describes the role of parasites in subculture evolution; specifically, that once group surplus reaches a threshold, it is immediately soaked up by parasites funneling it to agendas of their own
  • There are, by my count, at least 3 such parasites in the Bay community; specifically, they position themselves as the broken stair right at onboarding, making the community feel "impenetrable and unwelcoming". Operationally, this happens when I admit to some level of operational surplus (language skills, software development, business building): from these specific persons I immediately get asks of "Would you like to do free translation for me?" / "Would you like to build $website-idea$ for me?" / "Would you like to donate to $my-cause$?". I also notice that they don't do this as overtly to long-term members.
  • Note, the problem here isn't the ask. We do asks in entrepreneur-topia all the time. The problem is the lack of dealcraft: the asks asymmetrically favour the asker, and only offer vague lip service waving towards nice things in return.
  • The presence of these parasites, and their lack of dealcraft, reached equilibrium at having 'a strong culture of “make sure your own needs are met”, that specifically pushes back against broader societal norms that pressure people to conform', because people who have been value-pumped hard enough cannot sustain themselves in the Bay.

You are attempting to increase the group surplus of the community. This is very cool. My pre-mortem says that any such surplus created by the sweat of your brow will be soaked up by this parasitic behavior, and hence will fail to achieve long-term changes in the admitted competence of the community.

There might be several ways to work around this problem. I want to be upfront about the evaluation criteria for it:

  • not talking about, or taking action on, this problem will not make it go away;
  • the parasites' aim is value-pumping: that is, closing deals in which they get the maximum amount of value with the least amount of work of their own;
  • parasites participate in the culture like everyone else; for this reason, any plan you might come up with must be reflection-complete: that is, it needs to work even if everyone in the community knows that such a plan is in motion.

A few candidate solutions that stick out:

  • Level up dealcraft: cultivate, and enforce a culture of mutually beneficial asks.
  • Level up quantity of dealcraft: elicit from all members their goals / objectives / needs, and focus on coincidence of wants. There's a pretty cool model of this in the book Wishcraft: "Barn-raising"
  • Systematically post-mortem newbs to elicit a list of parasites ("was there someone who made you uncomfortable? describe the exact specifics of the situation"), and systematically intervene in the onboarding process.

Edit note: originally, this post used the word "sociopath" incorrectly (thanks to Viliam's comment below for pointing it out); fixed.

Everyone, could we please stop using the word "sociopath" to mean things other than... you know... sociopathy?

I also like the linked article and I believe it does a great job of describing social dynamics in subcultures. I have shared that article many times. But while it is funny to use exaggerations for shock value, making the exaggerated word the new normal is... I guess in obvious conflict with the goal of rationality and clear communication. Sometimes I don't even know how many people are actually aware that "trying to make a profit from things you don't deeply care about" and "being diagnosed as a sociopath" are actually two different things.

To explain why I care about this, imagine a group that decides that it is cool to refer to "kissing someone for social reasons, not because you actually desire to", as "rape". Because, you know, there are some similarities; both are a kind of an intimate contact, etc. Okay, if you write an article describing the analogies, that's great, and you have a good point. It just becomes idiotic when the whole community decides to use "rape" in this sense, and then they keep talking like this: "Yesterday we visited Grandma. When we entered the house, she raped us, and then we raped her back. I really don't like it when old people keep raping me like this, but I don't want to create conflicts in the family. But maybe I am just making a mountain out of a molehill, and being raped is actually not a big deal." Followed by dozen replies using the same vocabulary.

First, this completely unnecessarily burns your weirdness points. Weird jargon makes communication with outsiders more difficult, and makes it harder for outsiders to join the group, even if they would otherwise agree with the group's values. After this point, the absurdity heuristic works against anything you say. Sometimes there is a good reason for using jargon (it can compress difficult concepts), but I believe in this case the benefits are not proportional to the costs.

More importantly, imagine how difficult it would be to have a serious discussion about actual rape if talking like this became the group norm. Like, anytime someone mentioned being actually raped by a grandparent as a child, there would be a guaranteed reaction from someone: "yeah, yeah, happens to me when we visit Grandma every weekend, not a big deal". Or someone would express concern about possible rape at a community weekend, and people would respond by making stickers "kisses okay" and "don't like kissing", believing they are addressing the issue properly.

I believe it would be really bad if the rationalist community lost the ability to talk about actual sociopathy rationally. Because one day this topic may become an important one, and we may be too busy calling everyone who sells Bayes T-shirts without having read the Sequences a "sociopath". But even if you disagree with me on the importance of this, I hope you can agree that using words like this is stupid. How about just calling it "exploiting"? As in: "some people are only exploiting the rationalist community to get money for their causes, or to get free work from us, without providing anything to our causes in return -- we seriously need to put a stop to this". Could words like this get the message across, too?

Also, if you want to publicly address these people "hey guys, we suspect you are just using us for free resources; how about demonstrating some commitment to our causes first?", it will probably help to keep the discussion friendly, if you don't call them "sociopaths". Similarly, imagine LessWrong having an article saying (a) "vegans as a group benefit from the rationalist community, but don't contribute anything to the art of Bayes in return", or (b) "vegans are sociopaths". Regardless of whether you personally happen to be a vegan or not, this is obviously harmful.

tl;dr -- we are in the rationality business here, not in the clickbait business; talk accordingly

(EDIT: Just to be explicit about this, ignoring the terminology issue, I completely agree with the parent comment.)

Thank you. This was really bothering me, but it didn't occur to me that I should say anything about it.

Agreed. Recommend a non-verbed descriptive noun, and I'll update the post above.

Thank you!

Uhm, I guess "exploiters" or "free riders"? (Or "parasites" if one wants to offend. Or "moochers" when talking to Randians.)

Sorry, not a native English speaker, I may be missing something more fitting.

I think it's important that what the original post is warning about is not people who show up and mooch off the group - it's people who show up and begin to take over the group so thoroughly that they distort what the group is about. I think "exploiter" works pretty well, but "free rider" doesn't really convey it to me.

This, and problems similar to this, are indeed a pretty major issue I foresee Project Hufflepuff needing to resolve. I'd read the Mops/Fanatics/Sociopaths essay but hadn't thought about this particular issue from this angle before, thanks.

I can trace an arc, over the past ten years, of my attitude towards communities:

  • "Yay communities! Let's all share event invites and do everything together and everything will be great!"
  • "Hm, I'm organizing events for people but I'm not really enjoying them, and it doesn't really make me feel fulfilled"
  • "Inviting people to events doesn't seem to cause them to reciprocate by sending me invites back"
  • "I think the people in my community actually are having a lot of events, they're just not inviting me to most of them"
  • "I seem to have more fun interacting with people who aren't in my community. What's up with that?"
  • "Communities are okay but friends are better."

I never found a solution for how to get people to invite me to things. I think the problem is that I personally am really picky about the sorts of events I enjoy (i.e., I don't like drinking or sports), so if I want an event that I will enjoy, I have to make it myself.

But I did find a solution for how to have good events: make sure that all the people that I invite to my event are people who specifically want to do that event. Don't invite people because "they're part of the community" or "I want to make sure they're not lonely"; the risk is that they might show up because it's their only social outlet, and then they might not participate in the way that I wanted.

Nowadays I think of communities as places to meet people who could be my friends.

I think that almost everyone vastly underestimates the importance of friends, and especially the importance of a few close friends. In terms of not being lonely, of having good times and good events, or even of having a good time at the events that the community organizes, a few close friends are the key. I started enjoying group events far more when I realized that there is no need to try and 'make the rounds' of the 20-100 people there - find the handful that interest you tonight, and spend the night with them.

Raemon's response is key too, though. Communities are still super important because they provide anchors around which things can be organized, friends can coordinate and new friends can be found. What you do not want is for smaller groups to be only friendships and withdraw from their communities, or for some outside community to steal the best community members, because then the original community stops drawing in new people (or stop drawing in good new people) and slowly dies.

A great question, and one I hope is asked at the conference, is "how do we encourage more formation of close friendships?"

I endorse this as a healthy transition, with the caveat that what seems to happen, in practice, is that people clump off and form friendships, and then the community-mechanism by which people were able to form those friendships fades, so that future generations are not able to form friendships of their own. (Also, it seems like people end up not forming especially close friendships because people are too busy)

I'll be making sure there are notes from the Berkeley unconference. If you're interested in doing something vaguely-similar in your own neck of the woods, I recommend commenting here to see if others are interested (and/or reaching out through whatever your usual community-channels are).

My past experience is that it hasn't been worth it to arrange skyping in for this sort of event, but I think it'd be worth collaborating on ideas beforehand and sharing notes afterwards between people in different geographic locations.

One of the skills to talk about would be the skill of actively proselytizing and getting people into rationality. I don't mean onboarding people who are already interested, I mean actually going up to people who you wish were rationalists and trying to make them.

Successful communities do this, although the specifics vary widely. EA does it, which I think is why EA is growing while LW isn't. We've been largely coasting on Eliezer's wave.

This is difficult because LW rationality arose in the tech culture of California, i.e. an unusually individualistic culture within an unusually individualistic part of the most individualistic country ever. Only in California could one be called a "cult" for seeking a consensus philosophy. Any active proselytizing would definitely encounter the "cult" charge again.

But proselytizing works. It keeps a movement young - we're already noticeably older on average than we were ten years ago, and we're starting to look like a cohort of tech nerds who were at their impressionable college age when Eliezer wrote the Sequences. And it keeps a movement dynamic - if new people are coming in all the time, you don't have to suffer the ossification it takes to retain people as they get older. LW rationality has no less need of this than other movements.

And there are definitely people who are much better at it than others, so a systematic study of what works is eminently doable. I think this fits squarely into Project Hufflepuff.

Personally, I think cohorts happen automatically, and LW is "yet another cohort". If we want to be part of a movement with inter-generational significance, then maybe we should pause to consider why we think we should be "the first generation" in a movement that lasts forever...

In this vein, I appreciate previous places and groups like:

If I was going to name the entire thing, I think I might call it "Outsider Science" (taking a cue from "Outsider Art" and contrasting it with "Vannevarian Science").

So if you wanted to be so Hufflepuff that you sacrificed the whole group on the altar of being social (rather than just sacrificing yourself for the group) I'd argue that it would be a natural progression to work on reconnecting, resuscitating, archiving, and generally paying attention to these older places and communities, and putting yourself in service to their long term goals.

The hard thing here is that the diagnostic criteria, looking backwards, seem to be having a certain mindset towards physical reality and being a kind of cultural orphan at the same time. The standoffishness and the founding of tiny little institutes are part of what this movement seems to do by default?

Thus, projecting forward, you would predict that new instances of "the outsider science movement" would form on their own, start their own thing, and reject the notion of intellectual parentage, as much as we (the hypothetical intellectual parents) try to bring them into the loose confederation of previous attempts at self organized scientific research aiming at eternal intellectual challenges.

A lot of the future people you'd be trying to bring into the fold might very well prefer to struggle on alone.

Arguably, Vannevarian Science (with government-credentialed universities doing government-funded research) is already doing what you would evolve into anyway, and has succeeded so far and so thoroughly that its "highest mid-level hierarchs" have become members of the deep government of the world? So maybe the right thing to do is just let all the various orphans struggle on by themselves, and go try to get a job at the NSF while retaining fond feelings for the strugglers?

So my guess is that Bacon's Effecting Of All Things Possible has run for a long while now, and maybe "the orphans" who might have belonged to the high church version (but somehow never connected with the central culture) were never really noticed until the internet came along and then could start to find each other and form little clumps and clusters.

So maybe the most Hufflepuff thing possible would be to somehow encourage a larger internet culture that finds and welcomes these autonomous orphan clusters, while also extending an olive branch to the high church "Heirs of Bacon" who exist in the deep government, and see if there is some way to establish a state of communion between the main tree and all the little saplings :-)

I agree that "cohorts happen automatically", and the organisations that prevent this usually care explicitly about the next generations, whether we are talking about the Scout movement, religious groups, or academia. Ignoring this would be detrimental to the rationalist movement in the long term.

Understandably, most of us have negative connotations associated with "spreading the word". It is yet another "motte and bailey" situation, where on some level it's true that increasing the number of people who e.g. read Less Wrong is not our terminal value, that gaining followers is almost orthogonal to being 'less wrong', and that trying to be attractive for too many people could dilute the message; but on the other hand, it can easily become reversed stupidity, something like people refusing to eat food just because Hitler did that.

There are two basic ways the rationality movement could disappear from the world. One is gradual shrinking: people individually deciding that e.g. Pascal's wager actually makes sense, or that making their political faction win is more important than getting statistics and logic right, or otherwise trading rationality for something more appealing. The other is gradually becoming a group of old farts, whose debates are gradually reduced to talking over and over again about the things that happened decades ago. -- Where do we see ourselves, as a group, 50 years from now? (Conditional on the Singularity not happening, humanity not going extinct, etc., of course.)

Of course, if we are not willing to enter a "loose confederation" with the previous generations, we should not expect a different approach from the next generations. Telling them to "read the Sequences" would be like telling us to "read Science and Sanity"; maybe one in a hundred would do it, but nothing would change as a result, anyway.

Seems like two things need to be done, probably in this order:

1) Agree on a larger definition of "confederation of reason", "scions of Bacon", or whatever we decide to call it. Yes, this will be difficult, it goes against our nitpicking instinct, and it is going to rub many people the wrong way.

2) Make a strategic effort to recruit people, a lot of them (not just a few mathematical prodigies), into the "confederation of reason". This could mean joining what other organisations are already doing, instead of reinventing the wheel. This again goes against our instincts.

I expect that many rationalists will not be able to overcome their instincts on these matters, so we should not expect a wide consensus here. Instead, a few people who like this idea should just create a team, and do it. Which is how things generally get done.

I actually think it's important for a given project to have a fairly narrow focus in order to make progress, and I see Project Hufflepuff as related to outreach, but not directly about outreach. (I also don't think proselytizing is the right word - we don't have Good News to share - we have a bunch of ideas and models we're in the process of figuring out.)

Right now, the community has something of a backlog: people who want to get more involved but aren't sure how, people hanging out on the periphery who have value to contribute but whom various things about the culture put off, and people in the community who aren't succeeding/thriving at the things they want to.

Project Hufflepuff is about making internal community infrastructure better. This will hopefully remove bottlenecks that make outreach harder, but isn't the same thing.

Some of this reminds me of a talk by Sumana Harihareswara, a friend of mine in the free software community, where she tries to examine which strange and offputting things are necessary and which are needlessly driving people away: Inessential Weirdnesses in Free Software

I think there are in fact a lot of parallels between issues in free software and the rationalist community--similarly devaluing Hufflepuff skills even when they're necessary to get the full value out of everyone's contributions, similarly having concerns about not watering down the core philosophical commitments while still being open to newcomers and people who are not yet committed to the entire culture.

(FWIW, I am a weakly-connected member of the Bay Area rationalist community--it's not what I think of as my primary community so I'm not particularly present.)

This might be too obvious to mention, but Eliezer's 2009 post "Why our kind can't cooperate" seems quite relevant to this. http://lesswrong.com/lw/3h/why_our_kind_cant_cooperate/

Let me tell you about a specific thing that I saw in a different community, that I thought was a good way to make the community more welcoming.

I was in a meetup community about D&D. There was a guy who did a great thing there: every four or five months, he would create a meetup called "Meet And Greet For Players And DMs". You could show up to the meetup and talk about the specific game you wanted to play in (or run). You could meet other people who wanted to do the same thing, and you could trade contact information, and after the event you could send people messages: "hey, come over to my place and let's play that game that we both want to play!"

This is a great way to forge intercommunity connections of the sort that you're talking about. It's also remarkably low-effort: the guy would say "yeah, we're going to meet in X location", and then he'd show up at X location to proctor the event and make sure everyone got a chance to speak, and that was all he had to do.

I think it'd be interesting to have an online unconference, as well. Maybe put up a post here on the day, and people can write in comments with a time, topic, and google hangout link.

My feedback on these points:

Many people in rationality communities feel lonely (even the geographically tight Berkeley cluster). People want more (and deeper) connections than they currently have.

"I am feeling lonely because I am needing connection."

In the NVC sense this is a very clear and valid statement. Except within NVC no one else is responsible for your feelings other than yourself. Other people cannot make you feel lonely. They can take actions that cause you to feel lonely but they cannot force or guarantee you will feel lonely following the actions they take. In this sense I can drop you in the desert where there is no one around but I cannot force the loneliness out from within you. You can be thousands of miles away from anyone and still not be made to feel alone. You can equally be in the same room as all your friends and they cannot force you to feel "not alone any more".

I found myself in some position of not being happy with my situation. I run my local lesswrong group. I started the slack and can now be found on the discord. I blame no one else for my feelings, nor do I expect anyone else to rectify the problems. If the problem is loneliness, then the solution is not to force people to feel un-lonely. It is to get people to work it out for themselves. (Which may be a valid but different project)

There are lots of small pain points in the community (in person and online) that could be addressed fairly easily, but which people don’t dedicate the time to fix.

This would be easier with concrete, clear examples, particularly because concrete examples make the solutions more obvious. I don't see a systemic problem, other than perhaps with agency or communication. Take the recent Arbital post, for example: many people would have contributed earlier if we had known the project needed help, but the way Alexei talked about it, it did not seem to be in need of help.

People are rewarded for starting individual projects more than helping to make existing ones succeed, which results in projects typically depending on a small number of people working unsustainably. (i.e. a single person running a meetup who feels like if they left, the meetup would crumble apart)

I agree. Part of this is about being proactive in describing what you are doing and working on. To that end, a weekly "this week I am working on" thread seems like a good idea, one I am willing to implement inside the next open thread and, if it takes off, to split out into its own thread.

Some newcomers often find the culture impenetrable and unwelcoming.

Each and every one of us was once a newcomer. Each and every one of us lurked for a time. Our surveys show that our ratios of lurkers to commenters to posters are something like 100:10:1. Perhaps a strategy of:

  1. telling people why they should step up, or
  2. asking why they don't, then removing those barriers

would help.

In addition, more responses on the welcome thread would be good. The current thread (http://lesswrong.com/r/discussion/lw/ogw/welcome_to_less_wrong_11th_thread_january_2017/) can be subscribed to, so that you too can welcome people and engage with what they have to say.

Not enough “real-time operational competence” - the ability to notice problems in the physical world and solve them.

That's a thing. I and others (e.g. lifelonglearner) are working towards more instrumental writing about noticing personal problems and fixing them. It takes time to write. We welcome others' interest in drafting ideas.

Future post topics:

  - evaluating interpersonal relationships - the good, the bad, the defaults, the ones you want to curate
  - time management - 168 hours, and breaking down where they go
  - financial planning - runways, wealth and growth
  - health/fitness management, and having good vices

Even at events like EA Global where enormous effort is put into operations and logistics, we scramble to pull things together at the last minute in a way that is very draining.

I think this is inherent to planning major events. Consider a friend of mine, an engineer involved in big projects like switching on a power plant. In those projects, the "flipping the switch" moment is probably the most calm, boring, ordinary event in the whole process of planning and organising: no one is running around, no sparks are flying. Someone might have a clipboard and tick a box as it happens. As ordinary as drinking a cup of tea. With experience comes the skill to plan a conference the way one might go about drinking a cup of tea. But you start with people who have never done these things, and we have to learn somewhere.

Many people communicate in a way that feels disdainful and dismissive (to many people), which makes both social cohesion as well as intellectual understanding harder.

In recently investigating emotions and vulnerability I came across this talk:

https://www.youtube.com/watch?v=iCvmsMzlF7o

In it she talks about how, when we seek to numb the negative emotions, we also numb all emotion, and this gets in the way of connection. I currently stand on the fence here: numbing emotions is necessary to a certain extent, because for people who have difficulty managing emotions (either too many or not enough), keeping emotion out of communication entirely (very cold communication) makes the content of the communication much easier to see. Yes, this means coming across as cold and unwelcoming at first. But if you look very carefully at that "cold and unwelcoming", it's not unfriendly; it's also not friendly. It just is. I can't say what I will believe in the future (the talk makes a persuasive case for not numbing things). But right now I can explain the coldness by the feeling of safety it permits in communication. There is less misunderstanding of emotions when there is no emotion. There is also less connection.

We have a strong culture of “make sure your own needs are met”, that specifically pushes back against broader societal norms that pressure people to conform. This is good, but I think we’ve pushed too far in the opposite direction. People often make choices that are valuable to them in the immediate term, but which have negative externalities on the people around them.

I disagree. Getting one's own needs met is crucial to being able to function in this world. In addition, EA came about precisely in order to help others.


This was a lot of work to write out, but I want to say that I feel like you are off in many small and nuanced ways. I'm not sure I got that across quite right.

If the problem is loneliness, then the solution is not to force people to feel un-lonely. It is to get people to work it out for themselves. (Which may be a valid but different project)

I'm not sure how much of a disagreement we have regarding the NVC paradigm. I very much didn't mean for the solution to be "force people to be un-lonely", so if that's how it came across, sorry for that miscommunication. (The OP was just a high level summary of what sort of problems I'm trying to address)

I very much agree with NVC that it's important to have an internal locus of control. But you can still look at an overall situation, notice that a lot of people are struggling to make things work, and notice things about a culture that are making it harder to solve a problem.