EA - Why I Spoke to TIME Magazine, and My Experience as a Female AI Researcher in Silicon Valley with Sexual Harassment/Abuse by Lucretia
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use text-to-speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Why I Spoke to TIME Magazine, and My Experience as a Female AI Researcher in Silicon Valley with Sexual Harassment/Abuse, published by Lucretia on June 11, 2023, on The Effective Altruism Forum.

Crossposted on Medium here.
Twitter: @lucreti_a

Thank you to the supportive EA members who encouraged me to publicly share this difficult experience, to my friends and research collaborators for your kindness, and to the courageous women who helped me in writing this post, who I hope can someday speak publicly.

To those who know me, please call me Lucretia.

This is a megapost. Each section has a distinct purpose and may evolve into its own standalone post. For the full picture, I recommend reading to the end.

0. Overview

Introduction. I was one of the women who spoke to TIME magazine about sexual harassment and abuse in EA last winter. Here is my story without media distortions.

Advice for Female Founders and AI Researchers in the Valley. Silicon Valley can be a brutal place for women. This is what I wish I knew five years ago.

My Case Study. I am an AI researcher. I believe my AI alignment research career was needlessly encumbered by:
- My experience with the sexually abusive red pill and pickup artist sphere, which entwined with a branch of AI safety in Cambridge, MA and Silicon Valley. I describe the unethical core of red pill ideology, including the running of "rape scripts."
- The recent retaliation by a Silicon Valley AI community to my report of harm. This community's aggressive reaction showed many gender biases latent in AI culture.

Systemic Sexual Violence in Silicon Valley. I believe the male-dominated environment, nepotistic connections to investor money, extreme power disparities between wealthy AI researchers and aspiring young women in the AI and startup sectors, hacker house party culture, psychedelics misused as date rape drugs, cults of personality, a substantial population of low-empathy, risk-seeking, and/or narcissistic men, and a lack of functional policing mechanisms make sexual violence a systemic problem in a critical X-risk industry.

Why I Spoke to TIME. I address some misconceptions about the original TIME article on sexual harassment, and why I spoke to TIME in the first place.

Helpful Books and Movies. I share what I have learned about sexual harassment and abuse after ~15 months of focusing on the problem, including my favorite books and movies about sexual harassment/abuse to flesh out more conceptual space. For all the seriousness of this post, these books and movies are entertaining, gorgeous, and healing!

Future Sequences? Depending on the reactions to this post, I would love to write a Sequence on sexual harassment and abuse from first principles.

Call to Action: Recovery and Litigation Funds. AGI should neither be built nor aligned in environments of deceit. We propose a call to action for a Recovery Fund and a Sociological AI Alignment Fund / Litigation Fund to counteract the sexual predation Moloch in Silicon Valley, which is a sociological AI safety problem.

Appendix
- Excerpts from red pill literature
- Rape vs Consent Culture

1. Introduction

Some recent posts on the EA forum have thoughtfully and earnestly addressed sexual harassment and abuse.
Thank you to the EA community for your insightful posts and comments, and for genuinely trying to address the problem; that made my distressing experience of speaking to the TIME magazine journalist more worth it.

Given the mix of emotions generated by the TIME magazine piece published last winter, I want to clarify some points, avoid tribalism, and focus on first principles, mechanisms, and my own experience. This post isn't meant to be EA-hating. On the contrary, I deeply love parts of EA and AI alignment research, which has made these experiences somewhat heartbreaking.

I believe my AI research career has been needl...