EA - Australians call for AI safety to be taken seriously by AlexanderSaeri

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Australians call for AI safety to be taken seriously, published by AlexanderSaeri on July 21, 2023 on The Effective Altruism Forum.

The Australian Government is considering how to regulate AI in Australia. It has published a discussion paper ("Safe and Responsible AI") and invited feedback by 26 July 2023: "We want your views on how the Australian Government can mitigate any potential risks of AI and support safe and responsible AI practices."

Good Ancestors Policy (goodancestors.org.au/policy), with the support of EA and AI Safety community organisers in Australia, have coordinated Australians' submissions to the feedback process.

Today, the website Australians for AI Safety launched with a co-signed letter (media release). The letter called on the relevant Australian Federal Minister, Ed Husic, to take AI safety seriously by:

- recognising the catastrophic and existential risks
- addressing uncertain but catastrophic risks alongside other known risks
- working with the global community on international governance
- supporting research into AI safety

Good Ancestors Policy have also held community workshops across Australia (e.g., Brisbane, Perth) to help members of the EA and AI Safety community understand the feedback process and prepare submissions, including access to some of the best evidence and arguments for acknowledging and addressing risks from AI. Policy ideas are drawn from the Global Catastrophic Risk Policy database (), the AI Policy Ideas database (aipolicyideas.com), and expert community input.

So far, about 50 members of the community have attended a workshop. The feedback we've received is that the workshops have been very helpful: the majority (~75% of people) are likely or very likely (>80% likelihood) to make a submission, and most (~70% of people) would be unlikely or very unlikely (