“The day Elon Musk’s AI became a Nazi (and what it means for AI safety) | New video from AI in Context” by ChanaMessinger, Aric Floyd
EA Forum Podcast (All audio) - A podcast by the EA Forum Team

If you just want a link to the video, watch it here!

What's AI in Context? (Skip if you already know)

AI in Context is 80,000 Hours' new(ish) YouTube channel, hosted by Aric Floyd. We're trying to do high-production storytelling that also informs people about transformative AI and its risks (but there are a lot of paths our future strategy could take). We talk about our launch more here.

The MechaHitler video

EA Forum readers are probably disproportionately likely to know what MechaHitler is, but not everyone is terminally online, so here's a summary: earlier this year, Elon Musk's AI model, Grok, which can interact with users and post directly to Twitter, suddenly turned from a fairly neutral commentator on events into a sexually harassing, Nazi-minded troll calling itself 'MechaHitler'. Our new video is about that incident and how it happened, which means talking about what specifically happened (an accidental system [...]

---

Outline:
(00:20) What's AI in Context? (Skip if you already know)
(00:45) The MechaHitler video
(01:29) Why this?
(03:29) Logistics (only read if you're interested)
(04:24) Strategy and future of the video program
(05:29) Subscribing and sharing
(05:53) Request for feedback

---

First published: October 2nd, 2025

Source: https://forum.effectivealtruism.org/posts/trh4Km9KRedYSn3K3/the-day-elon-musk-s-ai-became-a-nazi-and-what-it-means-for

---

Narrated by TYPE III AUDIO.