Built-In Bias: Existing Real-World Inequality in AI and Other Technology

In this episode of Phishy Business, we tackle several important discussions around AI, including the concerning issue of built-in bias and stereotypes. Imagine an AI assuming that every doctor is male and every nurse is female. Well, according to ChatGPT, that is exactly the case. Our special guest is Ivana Bartoletti, Global Privacy Officer at Wipro. Ivana has a human rights background and is an internationally recognized thought leader in privacy, data protection, and responsible technology. She is a fellow at Virginia Tech, a published author, and the founder of the Women Leading in AI Network. Ivana describes her work as sitting at the intersection of technology and law, with a focus on privacy advocacy and on how data is collected and used in technologies such as AI.

In ‘Built-In Bias: Existing Real-World Inequality in AI and Other Technology’, we discuss:
- How Ivana’s book came about, the themes it covers, and how much has changed in this space since it was written
- Built-in bias in data and AI technology
- Protecting democracy and human rights when it comes to data collection, digital privacy, and AI
- Having legislation in place for the safe adoption of AI
- The hype around the dangers of AI
- The European Union’s proposed AI regulation and businesses speaking out against the Act
- Cybersecurity considerations when it comes to AI
- The Women Leading in AI Network – why it was started and its purpose

About the Podcast

Ready to change how you think about cybersecurity? Every other week, Mimecast’s Brian Pinnock and Alice Jeffrey are joined by a special guest for tales of risk, reward, and just a dash of ridiculousness. Whether it’s a tech expert who is not your average CIO or an expert from a field you wouldn’t expect, we’ll be exploring the lesser-seen side of cybersecurity – to learn how we can all improve in the fight to stay safe.