EA - The community health team’s work on interpersonal harm in the community by Julia Wise
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: The community health team’s work on interpersonal harm in the community, published by Julia Wise on August 19, 2022 on The Effective Altruism Forum.

This post aims to explain the work the community health team at the Centre for Effective Altruism does on particular kinds of community problems. The team does several kinds of work aimed at supporting the EA community and reducing risks to EA’s ability to have positive impact. We spend most of our time on those other kinds of work, but this post focuses only on work on interpersonal harm. We think this is likely the part of our work people have the most questions and confusion about, so we wanted to share more about it.

In short, we try to reduce the risk of harm to members of the community while being fair to people who are accused of wrongdoing. That’s a tricky balance, particularly when the need for confidentiality limits our ability to speak to everyone involved, and we sometimes get parts wrong. This post describes both some general principles and a year’s worth of specific examples. I’m not writing this now because anything in particular is going on. My goal here is to provide some transparency about how these things work in general, rather than commenting on any particular current situation.

What kind of situation is this post about?
It’s about actions people sometimes take that cause harm to others in the community, for example:
- Someone pushes past another person’s boundaries. This ranges from accidental discomfort, to sexual harassment, to deliberate sexual assault.
- Needlessly harsh or mean behavior.
- Erratic behavior that causes disruption or harm for others.
- Deception / dishonesty.
Internally, we call this “risky actor” work. Concrete examples are below.

Responses the community health team might make
- No action
- Talking with the person about how to improve their behavior
- Restricting them from CEA events
- Informing other EA groups / projects / organizations about the problem
- (Very rarely) publicly warning others about the problem
Often it’s very unclear what the best response would be, and people will disagree about whether we handled something well.

Who works on these situations?
The community health team is Nicole Ross (manager), Catherine Low, Chana Messinger, Eve McCormick, and me. I’ve done this kind of work at CEA since 2016. Currently Catherine and I are the main people on the team handling these kinds of situations. More background. Other people, including group organizers, also end up handling such situations.
Difficult trade-offs
Balances where I think both sides are valuable, and I’m not sure if we’ve got the balance right:
- Avoid false negatives: take action if there’s reason to think someone is causing problems vs. Avoid false positives: don’t unfairly harm someone’s reputation / ability to participate in EA
- Keep the community health team’s scope within what we can realistically handle; don’t take on too much vs. Don’t stand idly by while people do harm in the community
- Encourage the sharing of research and other work, even if the people producing it have done bad stuff personally vs. Don’t let people use EA to gain social status that they’ll use to do more bad stuff
- Take the talent bottleneck seriously; don’t hamper hiring / projects too much vs. Take culture seriously; don’t create a culture where people can predictably get away with bad stuff if they’re also producing impact
- Try to improve the gender balance / not make it worse; take strong action on behavior that makes women uncomfortable in EA spaces vs. Don’t crack down too much on spontaneity / dating / socializing; don’t make men feel that a slip-up or distorted accusation will ruin their life
- Let people know we take action against bad behavior and we care about this vs. Don’t create the impression that EA spaces are fully screened and safe - that’s not the case
- Give people a second o...
