EA - Update on cause area focus working group by Bastian Stern
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Update on cause area focus working group, published by Bastian Stern on August 10, 2023 on The Effective Altruism Forum.

Prompted by the FTX collapse, the rapid progress in AI, and increased mainstream acceptance of AI risk concerns, there has recently been a fair amount of discussion among EAs about whether it would make sense to rebalance the movement's portfolio of outreach/recruitment/movement-building activities away from efforts that use EA/EA-related framings and towards projects that instead focus on the constituent causes. In March 2023, Open Philanthropy's Alexander Berger invited Claire Zabel (Open Phil), James Snowden (Open Phil), Max Dalton (CEA), Nicole Ross (CEA), Niel Bowerman (80k), Will MacAskill (GPI), and myself (Open Phil, staffing the group) to join a working group on this and related questions.

In the end, the group only ended up having two meetings, in part because it proved more difficult than expected to surface key action-relevant disagreements. Prior to the first session, participants circulated relevant memos and their initial thoughts on the topic. The group also did a small amount of evidence-gathering on how the FTX collapse has impacted the perception of EA among key target audiences. At the end of the process, working group members filled in an anonymous survey in which they specified their level of agreement with a list of ideas/hypotheses generated during the two sessions. This included many proposals/questions for which this group/its members aren't the relevant decision-makers, e.g. proposals about actions taken/changes made by various organisations. The idea behind discussing these wasn't for the group to make any direct decisions about them, but rather to get a better sense of what people thought about them in the abstract, in the hope that this might sharpen the discussion about the broader question at issue.

Some points of significant agreement:

Overall, there seems to have been near-consensus that, relative to the status quo, it would be desirable for the movement to invest more heavily in cause-area-specific outreach, at least as an experiment, and less (in proportional terms) in outreach that uses EA/EA-related framings. At the same time, several participants also expressed concern about overshooting by scaling back on forms of outreach with a strong track record and thereby "throwing out the baby with the bathwater", and there seems to have been consensus that a non-trivial fraction of outreach efforts framed in EA terms are still worth supporting.

Consistent with this, when asked in the final survey to what extent the EA movement should rebalance its portfolio of outreach/recruitment/movement-building activities away from efforts that use EA/EA-related framings and towards projects that instead focus on the constituent causes, responses generally ranged from 6 to 8 on a 10-point scale (where 5 = stick with the status quo allocation, 0 = rebalance 100% to outreach using EA framings, and 10 = rebalance 100% to outreach framed in terms of constituent causes), with one respondent selecting 3/10.

There was consensus that it would be good if CEA replaced one of its (currently) three annual conferences with a conference explicitly framed as being focused on x-risk or AI risk. This was the most concrete recommendation to come out of the working group.
My sense from the discussion was that this consensus was mainly driven by people agreeing that there would be value of information to be gained from trying this; I perceived more disagreement about how likely it is that this would prove a good permanent change. In response to a corresponding prompt ("... at least one of the EAGs should get replaced by an x-risk or AI-risk focused conference ..."), answers ranged from 7-9 (mean 7.9), on a scale where 0=ve...