EA - Why Wasting EA Money is Bad by Jordan Arel
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Why Wasting EA Money is Bad, published by Jordan Arel on September 22, 2022 on The Effective Altruism Forum.

The thought crossed my mind today, "Should I take the BART or an Uber to the airport on the way to EAG DC?" Among other considerations, I thought, "Well, the BART would be much cheaper, but EA will reimburse me for the Uber, so maybe cost shouldn't be much of a consideration." After thinking this, I thought, "Wow, what a sketchy line of logic." Yet I don't think this way of thinking is entirely uncommon among EAs.

Shortly after this, I came across, in the EA UC Berkeley Slack channel, this article about how EA Berkeley is wasting money. I found the article a little confused, it seems to have some factual errors, and some of its claims were made less credible by the fact that the author then proceeded to post somewhat aggressive comments toward people in the Slack. Nonetheless, I find the criticism that EAs waste money to be valid and alarming, and I think it is important to address it before the issue gets out of hand.

Basically, I think this argument has a few levels.

On the first level, you could say that money is really valuable: since something like $200 (please correct me if this number is inaccurate) could save a year of someone's life via GiveWell top charities, we should take this as a real consideration and set a very high bar before spending money in ways that might be wasteful.

Against that, you could argue that we have an enormous amount of money for the size of the movement. If we very roughly have something like $50 billion and 2,000 highly engaged EAs, both of which have been relatively stable over the past few years, and if all of that money were spent by current EAs over a lifetime of ~50 years, that's about $500,000 per person, PER YEAR. That's a lot. So even if spending makes me only a minuscule amount more efficient, if the work I'm doing is valuable enough in contributing to the community, then maybe it's worth it.

But that only makes sense if the work I'm doing is extremely, extremely valuable, because I still have to compare it against the bar of $200 equals ~1 year of life saved. So if a $50 Uber ride saves me half an hour, my half an hour must be more valuable than three months of someone else's life. That's a pretty big claim.

But then, the claims of longtermism are quite big indeed. Bostrom calculates that a one-second delay in colonizing space may be equivalent to something like the loss of 100 trillion human lives, because the galaxies we could potentially colonize are moving away from us in every direction at high speeds. Working on existential risk reduction, rather than speeding up technological progress and space colonization, likely increases this expected value by several orders of magnitude.

So if I am one of the very small number of people who are most obsessed with these ideas and competent/privileged enough to make a difference, and if in expectation people explicitly working to reduce existential risk are the most likely to succeed at doing so, then yes, maybe saving half an hour of my time may actually have, in expectation, an unintuitively massive positive impact.

But then what about the article above and other criticisms?
Couldn't the reputation risk to EA from this way of thinking be very dangerous, both because it attracts people who want to mooch money off the community and because it repels potential collaborators who don't want to be seen as wasteful? Yes, maybe it does repel certain people. But then again, perhaps it attracts the type of people who understand and agree with our logic; and if our logic is in fact correct and good, then perhaps the type of people who really look at our ideas and actions, evaluate them carefully, and then decide they agree are exactly the type of people we are trying to attract. Perhaps w...

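For readers who want to check the back-of-envelope numbers quoted in the post, here is a minimal sketch in Python using only the post's own rough figures (~$50 billion, ~2,000 highly engaged EAs, ~50 years, ~$200 per year of life saved, a $50 Uber ride). None of these inputs are precise data; the script simply reproduces the arithmetic above.

# Back-of-envelope check of the figures quoted in the post above.
# All inputs are the post's own rough estimates, not precise data.

total_funds = 50e9        # ~$50 billion available to EA (post's rough figure)
engaged_eas = 2_000       # ~2,000 highly engaged EAs (post's rough figure)
working_years = 50        # ~50-year lifetime assumed in the post

per_person_per_year = total_funds / engaged_eas / working_years
print(f"Funds per highly engaged EA per year: ${per_person_per_year:,.0f}")
# -> about $500,000 per person, per year, as the post says

cost_per_life_year = 200  # ~$200 per year of life saved via GiveWell top charities (post's figure)
uber_cost = 50            # the hypothetical $50 Uber ride from the example
life_years_forgone = uber_cost / cost_per_life_year
print(f"Life-years forgone by the ${uber_cost} ride: {life_years_forgone:.2f} (~{life_years_forgone * 12:.0f} months)")
# -> 0.25 life-years, i.e. about three months of someone else's life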