EAG talks are underrated IMO, by Chi

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund



Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: EAG talks are underrated IMO, published by Chi on May 20, 2023 on The Effective Altruism Forum.

Underrated is relative. My position is something like "most people should consider going to >1 EAG talk" and not "most people should spend most of their EAG in talks." This probably most applies to people who are kind of like me. (Been involved for a while, already have a strong network, don't need to do 1-1s for their job.)

There's a meme that 1-1s are clearly the most valuable part of EAG(x) and that you should not really go to talks. (See e.g. this, this, this; they don't say exactly this, but I think they push in the direction of the meme.)

I think EAG talks can be really interesting and are underrated. It's true that most of them are recorded and you could watch them later, but I'm guessing most people don't actually do that. It also takes a while for them to be uploaded.

I still think 1-1s are pretty great, especially if you:
- are new and don't know many people yet (or otherwise mostly want to increase the number of people you know), or
- have a very specific thing you're trying to get out of EAG, and talking to lots of people seems to be the right way to achieve it.

I'm mostly writing this post because I think the meme is really strong in some parts of the EA community. I can imagine that some people in the EA community would feel bad for attending talks because it doesn't feel "optimal." If you feel like you need permission, I want to give you permission to go to talks without feeling bad.

Another motivation is that I recently attended my first set of EAG talks in years (I was doing lots of 1-1s for my job before) and was really surprised by how great they were. (That said, it was a bit hit or miss.) I previously accidentally assumed that talks and other prepared sessions would give me ~nothing.

See also the rule of equal and opposite advice (1, 2), although I haven't actually read the posts I linked. My best guess is that people in EA are more biased towards taking actions that are part of a collectively "optimal" plan for [generic human with willpower and without any other properties] than taking actions that are good given realistic counterfactuals.

Thanks for listening. To help us out with The Nonlinear Library or to learn more, please visit nonlinear.org.
