EA - Monetary and social incentives in longtermist careers by Vaidehi Agarwalla

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund


Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Monetary and social incentives in longtermist careers, published by Vaidehi Agarwalla on September 24, 2023 on The Effective Altruism Forum.

In this post I talk about several strong non-epistemic incentives and issues that can influence EA community members to pursue longtermist career paths (and specifically x-risk reduction careers and AI safety). For what it's worth, I personally am sympathetic to longtermism, and to people who want to create more incentives for longtermist careers, because of the high urgency some assign to AI Safety and the fact that longtermism is a relatively new field. (I am currently running career support pilots to support early-career longtermists.) However, I think it's important to think carefully about career choices, even when it's difficult. I'm worried that these incentives lead people to feel (unconscious & conscious) pressure to pursue (certain) longtermist career paths even if they may not be the right choice for them. I think it's good to be thoughtful about cause prioritization and career choices, especially for people earlier in their careers.

Incentives

Good pay and job security

In general, longtermist careers pay very well compared to standard nonprofit jobs, and early career roles are sometimes competitive with for-profit jobs (
