If funding were available, the Centre for Effective Altruism would consider hiring someone to work closely with Prof Nick Bostrom to provide anything and everything he needs to be more productive. Bostrom is, of course, the Director of the Future of Humanity Institute at Oxford University and the author of Superintelligence, the best guide yet to the possible risks posed by artificial intelligence.
Nobody has yet confirmed they will fund this role, but we are nevertheless interested in getting expressions of interest from suitable candidates.
The list of required characteristics is hefty, and the position would be a challenging one:
- Willing to commit to the role for at least a year, and preferably several
- Able to live and work in Oxford during this time
- Conscientious and discreet
- Trustworthy
- Able to keep flexible hours (some days a lot of work, others not much)
- Highly competent at almost everything in life (for example, organising travel, media appearances, choosing good products, and so on)
- Will not screw up and look bad when dealing with external parties (e.g. media, event organisers, the university)
- Has a good personality 'fit' with Bostrom
- Willing to do some tasks that are not high-status
- Willing to help Bostrom with both his professional and personal life (to free up his attention)
- Can speak English well
- Knowledge of rationality, philosophy, and artificial intelligence would also be helpful, and would allow you to take on additional work as a research assistant.
The research Bostrom can do is unique; to my knowledge, no one else has made such significant strides in clarifying the biggest risks facing humanity as a whole. As a result, helping increase Bostrom's output by, say, 20% would be a major contribution. This person's work would also help the rest of the Future of Humanity Institute run smoothly.
I think this ad makes LW and EA look cultish, because it reads like hero worship and sexual innuendo. I was especially troubled to see it linked on the EA Facebook page, where it could be seen by many potential/new EAs who don't know who Bostrom is, have a lower weirdness tolerance, and have a still-forming understanding of effective altruism.
I showed this to a few smart young people, the type EAs want to reach out to, and they said it sounded "sketchy," "unprofessional," and "kind of like prostitution." Maybe it's totally fine and even attractive for LW, but I think EA leaders trying to recruit really need to be more thoughtful about their language. I think a different description should have been written for that forum.
At the very least, it's very unconventional. Ads for personal assistants usually mention specific duties like "answering emails" and "preparing food," not just all-purpose service, so that people know what they are getting into.
tl;dr: This ad sounds sketchy to me, and I really wish it hadn't been linked on the EA Facebook group, where it could scare off new/potential EAs.
Thank you for voicing your worries. It's important we discuss this aspect as well, and I hadn't taken that into account when I posted the comment about "sidekicks".
The "sexual innuendo" part was surprising to me - I (female, 24y) didn't get that impression from reading the post and neither did any of the smart, young people I showed it to. Maybe we were talking with people in different social circles (my friends are already EAs for the most part). You're right that phrasing it as "sidekick" makes it look more cultish. I'm not sure what the tradeoff is between making a joke/attracting people for whom this sounds appealing and not appearing sketchy/culty.
I would assume it's phrased in such a general way because the specific duties haven't actually been determined yet; they depend on what the candidate is able to do well. Nonetheless, more examples could have been given. Would you have preferred that?
Do you have suggestions to rephrase the quoted parts? I have trouble coming up with something more professional that says the same thing myself.