EA - Open Philanthropy is hiring for multiple roles across our Global Catastrophic Risks teams by Open Philanthropy

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund


Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Open Philanthropy is hiring for multiple roles across our Global Catastrophic Risks teams, published by Open Philanthropy on September 30, 2023 on The Effective Altruism Forum.

It's been another busy year at Open Philanthropy; after nearly doubling the size of our team in 2022, we've added over 30 new team members so far in 2023. Now we're launching a number of open applications for roles in all of our Global Catastrophic Risks (GCR) cause area teams (AI Governance and Policy, Technical AI Safety, Biosecurity & Pandemic Preparedness, GCR Cause Prioritization, and GCR Capacity Building). The application, job descriptions, and general team information are available here. Notably, you can apply to as many of these positions as you'd like with a single application form!

We're hiring because our GCR teams feel pinched and really need more capacity. Program Officers in GCR areas think that growing their teams will lead them to make significantly more grants at or above our current bar. We've had to turn down potentially promising opportunities because we didn't have enough time to investigate them; on the flip side, we're likely currently allocating tens of millions of dollars suboptimally in ways that more hours could reveal and correct.

On the research side, we've had to triage important projects that underpin our grantmaking and inform others' work, such as work on the value of Open Phil's last dollar and deep dives into various technical alignment agendas. And on the operational side, maintaining flexibility in grantmaking at our scale requires significant creative logistical work.
Both last year's reduction in capital available for GCR projects (in the near term) and the uptick in opportunities following the global boom of interest in AI risk make our grantmaking look relatively more important; compared to last year, we're now looking at more opportunities in a space with less total funding.

GCR roles we're now hiring for include:

- Program associates to make grants in technical AI governance mechanisms, US AI policy advocacy, general AI governance, technical AI safety, biosecurity & pandemic preparedness, EA community building, AI safety field building, and EA university groups.
- Researchers to identify and evaluate new areas for GCR grantmaking, conduct research on catastrophic risks beyond our current grantmaking areas, and oversee a range of research efforts in biosecurity. We're also interested in researchers to analyze issues in technical AI safety and (separately) the natural sciences.
- Operations roles embedded within our GCR grantmaking teams: the Biosecurity & Pandemic Preparedness team is looking for an infosec specialist, an ops generalist, and an executive assistant (who may also support some other teams); the GCR Capacity Building team is looking for an ops generalist.

Most of these hires have multiple possible seniority levels; whether you're just starting in your field or have advanced expertise, we encourage you to apply.

If you know someone who would be great for one of these roles, please refer them to us. We welcome external referrals and have found them extremely helpful in the past. We also offer a $5,000 referral bonus; more information here.

How we're approaching these hires

You only need to apply once to opt into consideration for as many of these roles as you're interested in. A checkbox on the application form will ask which roles you'd like to be considered for.
We've also made efforts to streamline work tests and to reuse the same test for multiple roles where possible. However, some roles use different work tests, so you may still need to complete more than one, especially if you're interested in roles spanning a wide array of skillsets (e.g., both research and operations). You may also have interviews with mu...
