EA - Taking prioritisation within 'EA' seriously by CEvans

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use text-to-speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Taking prioritisation within 'EA' seriously, published by CEvans on August 19, 2023 on The Effective Altruism Forum.

NB: This post is arguably slightly info-hazardous (for lack of a better term) for anyone in the community who might feel particular anxiety about questioning their own career decisions. Perhaps consider reading this first: You have more than one goal, and that's fine - EA Forum (effectivealtruism.org). That piece is about the importance of having non-impact-focussed goals, and how it is extremely OK to have them. This post is intended to suggest what people should do insofar as they want to have more impact, rather than being a suggestion of what everyone in the EA community should do.

Your decisions matter. The precise career path you pick really matters for your impact. I think many people in the EA community would say they believe this if asked, but haven't really internalised what it means for them. I think many people would get great returns to their impact from thinking more carefully about prioritisation within their career, even within the "EA careers space". Here are some slight caricatures of statements I hear regularly:

"I want an 'impactful' job"
"I am working on a very important problem, so within that I will do what I think is interesting"
"I was already interested in something mentioned on 80,000 Hours, so I will work on that"
"People seem to think this area is important, so I suppose I should work on that"
"I am not a researcher, so I shouldn't work on that problem"

I think these are all major mistakes for those who say them, insofar as impact is their primary career goal. My goals for this post are to make the importance of prioritisation feel more salient to members of the community (the first half of the post), and to help making progress feel, and be, more attainable (the second half, from "What does thinking seriously about prioritisation look like?").

Key Claims

For any given person, their best future 'EA career paths' are at least an order of magnitude more impactful than their median 'EA career path'.

For over 50% of self-identifying effective altruists, in their current situation:
- Thinking more carefully about prioritisation will increase their expected impact by several times.
- There will be good returns to thinking more about the details of prioritising career options for yourself, rather than uncritically deferring to others or doing only very high-level "cause prioritisation".
- They overvalue personal fit and prior experience when determining what to work on.

I think the conclusions of my argument should excite you. Helping people is amazing. This community has enabled many of us to help orders of magnitude more people than we otherwise would have. I am claiming that you might be able to make a similar improvement again with careful thought, and that as a community we might be able to achieve a lot more.

Defining prioritisation in terms of your career

Prioritisation: Determining which particular actions are most likely to result in your career having as much impact as possible. In practice, this looks like a combination of career planning and cause/intervention prioritisation. So, "your prioritisation" means something like "your current best guess of what precisely you should be doing with your career for impact".

Career path: The hyper-specific route which you take through your career, factoring in decisions such as which specific issues/interventions to work on, which organisations to work at, and which specific roles to take. I do not mean anything as broad as "AI alignment researcher" or "EA community builder".

'EA' person/career path: By this I mean a person or choice which is motivated by impact, not necessarily one in a so-called 'EA' organisation or explicitly identifying as part of the community.

For any given person, the best "EA"...