EA - Why EA Community building by Rob Gledhill
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Why EA Community building, published by Rob Gledhill on June 15, 2023 on The Effective Altruism Forum.

I have heard from people who are uncertain about whether EA community building is the right move for them, given the increased prominence of AI Safety. I think that EA community building is the right choice for a significant number of people, and I want to lay out why I believe this.

AI Safety community building seems important

I'm excited to see AI-Safety-specific community building and I hope it continues to grow. This piece is not intended to claim that no one should be working on AIS community building, although CEA's groups team sits within an EA organisation, not an AI Safety organisation. I hope we can collaborate with AI Safety groups, because:

- It would likely benefit both parties to sync on issues like data collection
- I think there are lessons learned from EA community building that would be relevant and valuable to share

The reasons I think the case for AI Safety community building is strong are:

- If we want people to work in AI Safety, talking directly about AI Safety seems the most straightforward way to get them there
- There are talented people who will find the AI Safety framing attractive but would not like the EA framing
- Early AIS community-building efforts have managed to attract significant numbers of talented individuals (although I don't think it's inevitable that these early wins will scale, or that they will successfully avoid causing accidental harm)

EA community building is also important

I think EA community building is still very valuable, for five reasons:

1. EA groups have been successful. In the 2020 EA and Longtermist survey, local groups were mentioned by 42% of respondents.
2. I care about having EA values in decision makers during crunch time. For example, I think people in the EA movement have thought unusually deeply about which catastrophes would and wouldn't lead to the loss of humanity's future potential.
3. Having a compelling answer to the question "how do I do the most good?" or "how do I live a good life?" has historically attracted a lot of talent, including talent that would not necessarily have been attracted by AI Safety (conversely, I expect AI Safety groups to attract people who wouldn't be drawn to discussions of "how do I do the most good?").
4. Specific, talented organisers can be a better fit for either AIS or EA, and I want both options to exist. For both options to exist, both need to have great organisers; if all of the best organisers went for a single option, I think the other option would become irrelevant or cease to exist.
5. It is still possible that the risks from AI don't manifest in the way that EAs widely expect, in which case we'll be glad to have a network of people who care about EA ideas.

I want to see collaboration between EA and AIS community building

Although this isn't the reason I'd like to see AIS community building, there is an extra benefit: I think work on the most pressing problems could go better if EAs did not form a super-majority (but were still a large enough faction that EA principles are strongly weighted in decision making).

Since the FTX crisis there has been increasing discussion about trustingness amongst EAs. Although I think the FTX crisis could have happened in less trusting communities (many VCs also lost money in FTX), I think it is true that there are areas where high trust is harmful. Operating in an environment where EAs aren't a super-majority would improve certain processes that currently rely too heavily on trust. Additionally, I think having EA form a part of your identity can cause in-group effects, where ideas from the out-group aren't taken seriously enough. I suspect this would be lessened if people identifying as EA didn't form a majority.

Based on the above, EA commu...