AI for legal departments: Managing legal and regulatory risks within Copilot

Tech Law Talks - A podcast by Reed Smith

Anthony Diana and Samantha Walsh are joined by Lighthouse's Chris Baird as part of our series on what legal teams need to know about Microsoft 365's AI-driven productivity tool, Copilot. This episode presents an overview of the risks relating to Copilot's access to and use of privileged and sensitive data and how businesses can mitigate these risks, including by using Microsoft 365's access control tools and user training. In particular, the episode provides in-depth information about Microsoft 365's sensitivity labels and how they can be used to refine a business's approach to managing risk associated with privileged and sensitive data stored in Microsoft 365.

Transcript:

Intro: Hello, and welcome to Tech Law Talks, a podcast brought to you by Reed Smith's Emerging Technologies Group. In each episode of this podcast, we will discuss cutting-edge issues on technology, data, and the law. We will provide practical observations on a wide variety of technology and data topics to give you quick and actionable tips to address the issues you are dealing with every day.

Anthony: Hello, this is Anthony Diana, a partner here in Reed Smith's Emerging Technologies Group, and welcome to Tech Law Talks and our podcast series on AI for legal departments, with a focus on managing legal and regulatory risks with Microsoft Copilot, which Reed Smith is presenting with Lighthouse. With me today are Sam Walsh from Reed Smith's Emerging Technologies Group and Chris Baird from Lighthouse. Welcome, guys. Just to level set, Copilot is the AI tool that Microsoft launched relatively recently to improve productivity within the Microsoft environment. There are a number of risks, which we went through in a previous podcast, that you have to consider, particularly as a legal department, when you're launching Copilot within your organization. And let me just start to level set with Chris, if you could give a little bit of a technical background on how Copilot works.

Chris: Absolutely, Anthony. Thanks for having me. So I guess a couple of key points, because as we go through this conversation, things are going to come up around how Copilot is used. And you touched on it there: the key objective is to improve data quality and increase productivity. So we want really good data in; we want to maximize the data that we've got at our disposal, make the most of that data, and make it available to Copilot. But we want to do so in a way that we're not oversharing data, we're not getting bad legacy data in, you know, stale data, and we're not getting data from departments that maybe we shouldn't have pulled it in from, right? So that's one of the key things. We all know what Copilot does. In terms of its architecture, think about it: you're in your canvas, whatever your favorite canvas is. It's Microsoft Word, it's Teams, it's PowerPoint. You're going to ask Copilot to give you some information to help you with a task, right? And the first piece of the architecture is you're going to make that request. Copilot is going to send a request into your Microsoft 365 tenant, where your data is. It's going to use APIs. It's going to hit the Graph API. There's a whole semantic layer around that. And it's going to say, hey, I've got this guy, Chris. He wants to get access to this data. He's asking me this question. Have you got his data? And the first thing, really, there's this important term Microsoft uses. They call it grounding.
When you make your request into Copilot, whatever you request, you're going to get data back that's grounded to you. So you're not going to get data back from an open AI model or from Bing AI. You're only going to get data that's available to you. The issue with that is, if you've got access to data you didn't know you had, you know, through poor governance, maybe somebody shared a link with you two years ago, that data is going to be available to you as well. But what's going to happen, a few clever things happen from...
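To make the grounding concept Chris describes more concrete, here is a minimal sketch of a permission-trimmed query against the Microsoft Graph search API. It is illustrative only: Copilot's internal retrieval is not exposed as a public call, and the token acquisition, query string, and result handling below are assumptions made for the example.

```python
# Hedged sketch: a delegated (user-context) call to the Microsoft Graph search
# API. Like Copilot's grounding, results are limited to content the signed-in
# user can already access. The token and query string are placeholders.
import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"
user_token = "<delegated-access-token>"  # assumed: acquired via MSAL on behalf of the user

payload = {
    "requests": [
        {
            "entityTypes": ["driveItem"],  # SharePoint/OneDrive content
            "query": {"queryString": "board meeting minutes"},
        }
    ]
}

resp = requests.post(
    GRAPH_SEARCH_URL,
    headers={"Authorization": f"Bearer {user_token}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Only items the signed-in user is permitted to see come back, including items
# shared with them long ago, which is why stale sharing links widen what a
# grounded tool like Copilot can draw on.
for hit in resp.json()["value"][0]["hitsContainers"][0]["hits"]:
    print(hit["resource"].get("name"), hit["resource"].get("webUrl"))
```

Because the call runs with the user's delegated token, the results are trimmed to what that user can already open, which is why tightening access controls, cleaning up stale sharing links, and applying sensitivity labels narrows what Copilot can surface.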
