EA - Announcing the Longtermism Fund by Michael Townsend




Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Announcing the Longtermism Fund, published by Michael Townsend on August 11, 2022, on The Effective Altruism Forum.

Longview Philanthropy and Giving What We Can would like to announce a new fund for donors looking to support longtermist work: the Longtermism Fund. In this post, we outline the motivation behind the fund, reasons you may (or may not) choose to donate using it, and some questions we expect donors may have.

What work will the Longtermism Fund support?

The fund supports work that:

- Reduces existential and catastrophic risks, such as those coming from misaligned artificial intelligence, pandemics, and nuclear war.
- Promotes, improves, and implements key longtermist ideas.

The Longtermism Fund aims to be a strong donation option for a wide range of donors interested in longtermism. The fund focuses on organisations that:

- Have a compelling and transparent case in favour of their cost-effectiveness that most donors interested in longtermism will understand; and/or
- May benefit from being funded by a large number of donors (rather than one specific organisation or donor) — for example, organisations promoting longtermist ideas to the broader public may be more effective if they have been democratically funded.

There are other funders supporting longtermist work in this space, such as Open Philanthropy and the FTX Future Fund. The Longtermism Fund's grantmaking is managed by Longview Philanthropy, which works closely with these other organisations and is well positioned to coordinate with them to efficiently direct funding to the most cost-effective organisations. The fund will make grants approximately once each quarter.

To give donors a sense of the kind of work within the fund's scope, here are some examples of organisations the fund would likely give grants to if funds were disbursed today:

- The Johns Hopkins Center for Health Security (CHS) — CHS is an independent research organisation working to improve the organisations, systems, and tools used to prevent and respond to public health crises, including pandemics.
- Council on Strategic Risks (CSR) — CSR analyses and addresses core systemic risks to security. They focus on how different risks intersect (for example, how nuclear and climate risks may exacerbate each other) and seek to address them by working with key decision-makers.
- Centre for Human-Compatible Artificial Intelligence (CHAI) — CHAI is a research organisation aiming to shift the development of AI away from potentially dangerous systems we could lose control over, and towards provably safe systems that act in accordance with human interests even as they become increasingly powerful.
- Centre for the Governance of AI (GovAI) — GovAI is a policy research organisation that aims to build "a global research community, dedicated to helping humanity navigate the transition to a world with advanced AI."

The vision behind the Longtermism Fund

We think that longtermism as an idea and movement is likely to become significantly more mainstream — especially with Will MacAskill's soon-to-be-released book, What We Owe The Future, and with popular creators becoming more involved in promoting longtermist ideas. But what's the call to action? For many who want to contribute to longtermism, focusing on their careers (perhaps by pursuing one of 80,000 Hours' high-impact career paths) will be their best option.
But for many others — and perhaps for most people — the most straightforward and accessible way to contribute is through donations. Our aim is for the Longtermism Fund to make it easier for people to support highly effective organisations working to improve the long-term future. Not only do we think the money this fund moves will have a significant impact; we also think the fund will provide another avenue for the broader community to engage with and implement these...
