EA - CEA: still doing CEA things by Ben West
The Nonlinear Library: EA Forum - Ein Podcast von The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: CEA: still doing CEA things, published by Ben West on July 15, 2023 on The Effective Altruism Forum.

This is a linkpost for our new and improved public dashboard, masquerading as a mini midyear update.

It's been a turbulent few months, but amidst losing an Executive Director, gaining an Interim Managing Director, and searching for a CEO, CEA has done lots of cool stuff so far in 2023.

The headline numbers

- 4,336 conference attendees (2,695 EA Global, 1,641 EAGx)
- 133,041 hours of engagement on the Forum, including 60,507 hours of engagement with non-Community posts (60% of total engagement on posts)
- 26 university groups and 33 organizers in UGAP
- 622 participants in Virtual Programs

There's much more, including historical data and a wider range of metrics, in the dashboard!

Updates

The work of our Community Health & Special Projects and Communications teams lends itself less easily to stat-stuffing, but you can read recent updates from both:

- Community Health & Special Projects: Updates and Contacting Us
- How CEA's communications team is thinking about EA communications at the moment

What else is new?

Our staff, like many others in the community (and beyond), have spent more time this year thinking about how we should respond to the rapidly evolving AI landscape.
We expect more of the community's attention and resources to be directed toward AI safety at the margin, and are asking ourselves how best to balance this with principles-first EA community building.

Any major changes to our strategy will have to wait until our new CEO is in place, but we have been looking for opportunities to improve our situational awareness and experiment with new products, including:

- Exploring and potentially organizing a large conference focused on existential risk and/or AI safety
- Learning more about and potentially supporting some AI safety groups
- Supporting AI safety communications efforts

These projects are not yet ready to be announced as commitments, but we thought it worth sharing them at a high level as a guide to the direction of our thinking. If they intersect with your projects or plans, please let us know and we'll be happy to discuss more.

It's worth reiterating that our priorities haven't changed since we wrote about our work in 2022: helping people who have heard about EA to deeply understand the ideas, and to find opportunities for making an impact in important fields. We continue to think that top-of-funnel growth is likely already at or above healthy levels, so rather than aiming to increase the rate any further, we want to make that growth go well.

You can read more about our strategy here, including how we make some of the key decisions we are responsible for, and a list of things we are not focusing on. And it remains the case that we do not think of ourselves as having or wanting control over the EA community. We believe that a wide range of ideas and approaches are consistent with the core principles underpinning EA, and encourage others to identify and experiment with filling gaps left by our work.

Impact stories

And finally, it wouldn't be a CEA update without a few #impact-stories:

Online

Training for Good posted about their EU Tech Policy Fellowship on the EA Forum.
12 of the 100+ applicants they received came from the Forum, and 6 of these 12 successfully made it onto the program, out of 17 total program slots.

Community Health & Special Projects

Following the TIME article about sexual misconduct, people have raised a higher-than-usual number of concerns from the past that they had noticed or experienced in the community but hadn't raised at the time. In many of these cases we've been able to act to reduce risk in the community, such as warning people about inappropriate behavior and removing people from CEA spaces when their past behavior has caused harm.

Communicati...