EA - My emotional reaction to the current funding situation by Sam Brown

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund



Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: My emotional reaction to the current funding situation, published by Sam Brown on September 11, 2022 on The Effective Altruism Forum.

I’m allowed to spend two days a week at Trajan House, a building in Oxford which houses the Centre for Effective Altruism (CEA), along with a few EA-related bodies. Two days is what I asked for, and what I received. The rest of the time I spend in the Bodleian Library of the University of Oxford (about £30/year, if you can demonstrate an acceptable “research need”), a desk at a coworking space in Ethical Property (which houses Refugees Welcome, among other non-EA bodies, for £200/month), Common Ground (a cafe/co-working space which I’ve recommended to people as a place where the staff explicitly explain, if you ask, that you don’t need to order anything to stay as long as you like), a large family house I’m friends with, and various cafes and restaurants where I can sit for hours while only drinking mint tea.

I’m allowed to use the hot-desk space at Trajan House because I’m a recipient of an EA Long-Term Future Fund grant, to research Alignment. (I call this “AI safety” to most people, and sometimes have to explain that AI stands for Artificial Intelligence.) I judged that 6 months of salary at the level of my previous startup job, with a small expenses budget, came to about £40,000. This is what I asked for, and what I received.

At my previous job I thought I was having a measurable, meaningful impact on climate change. When I started there, I imagined that I’d go on to found my own startup. I promised myself it would be the last time I’d be employed. When I quit that startup job, I spent around a year doing nothing much. I applied to Oxford’s Philosophy BPhil, unsuccessfully. I looked at startup incubators and accelerators.
But mostly, I researched Alignment groups. I visited Conjecture, and talked to people from DeepMind and the Future of Humanity Institute. What I was trying to do was to discern whether Alignment was “real” or not. Certainly, I decided, some of these people were cleverer than me, more hard-working than me, better informed. Some seemed deluded, but not all. At the very least, it’s not just a bunch of netizens from a particular online community, whose friend earned a crypto fortune.

During the year I was unemployed, I lived very cheaply. I’m familiar with the lifestyle, and – if I’m honest – I like it. Whereas for my holidays while employed I’d hire or buy a motorbike and go travelling abroad, or scuba dive, instead my holidays would be spent doing DIY at a friend’s holiday home for free board, or taking a bivi bag to sleep in the fields around Oxford.

The exceptions to this thrift were both EA-related, and both fully funded. In one, for which my nickname of “Huel and hot-tubs” never caught on, I was successfully reassured by someone I found very smart that my proposed Alignment research project was worthwhile. In the other, I and others were flown out to the San Francisco Bay Area for an all-expenses-paid retreat to learn how to better build communities. My hotel room had a nightly price written on the inside of the door: $500. Surely no one ever paid that. Shortly afterwards, I heard that the EA-adjacent community was buying the entire hotel.

While at the first retreat, I submitted my application for funding. While in Berkeley for the second, I discovered my application was successful. (“I should hire a motorbike, while I’m here.” I didn’t have time, between networking opportunities.) I started calling myself an “independent alignment researcher” to anyone who would listen and let me into offices, workshops, or parties. I fit right in.
At one point, people were writing plans on a whiteboard for how we could spend the effectively-infinite amount of money we could ask for. Somehow I couldn’t take it any more, so I ...
