Privacy & EdTech for K-12 - 2023 Updates


This session explores the unique privacy challenges for K-12 institutions and EdTech. Join us for an interactive workshop session to learn about the impact of recent changes to US privacy laws, FTC activity, and best practices for vetting EdTech providers.

Video Transcript
Wonderful. Thank you all for joining us at five o'clock on conference day number one. I know everyone's always tired, but we're super excited that you all wanted to come visit with us. This is a topic that is near and dear to my heart. For those of you who don't know me, my name is Daisy Bennett.

I'm the data protection officer here at Instructure. I spend a lot of time talking to our customers, helping them understand the really complex web of laws that we have to comply with and they have to comply with, how we can help them comply, and the different things we can do to help protect our students' and our teachers' and our administrators' data, while providing a wonderful service that you and our community can use. But what I'm super excited about is that we have two amazing people with us. I'll give you the quick two-second lowdown: these lovely women have been outside counsel for Instructure for over eight years.

They are fabulous privacy experts. They know a ton, so they've got a lot of great information to share with you. Go ahead, Kelly. Alright. Yeah.

I'm Kelly Bastide. I'm a co-chair of the privacy and data security group at Venable. We're a national law firm based in Washington, DC. And Daisy stole my thunder, but we've been outside counsel to Instructure now for close to a decade.

Hi folks. I'm Dana Homstrand. I'm an associate in Venable's privacy group. And I understand that we are the last thing between you and the reception, the hackathon, and getting back to the expo centers. So why don't we start talking privacy.

That's awesome. Thank you. Well, this is the agenda we're gonna loosely cover today. One thing we really wanted to flag is that privacy is such a big topic; we literally could have a three-day conference about just privacy and EdTech. So we're hitting as many highlights as we can of things that have happened this year and things that are gonna happen. We already did our higher ed sessions.

So we're focusing heavily on K-12 right now, right? US K-12. We also really would love for this to be interactive. We have a few folks that were in our previous sessions, so if you have questions, raise your hand and we'll grab them right in the middle of the session. We ran out of time last time, so we really will try to make sure we have time at the end. But if you have a question, please feel free to raise your hand.

So I'm gonna spend two seconds talking about the ecosystem and why it's so complex, and then we're gonna focus on some hot topics at the intersection of privacy and K-12, and then take a little peek around the corner: we're gonna try to prognosticate a little bit and make some predictions on what we think is happening in the regulatory space here in the US with respect to privacy. And if we get to it, I've got some advice on best practices for vetting EdTech vendors, and then we have a bunch of different resources; we'll throw QR codes on the screens for you all to get access to some really cool privacy resources from Venable. I've got some Instructure stuff, but we also have some other great external resources for laws, regulations, white papers, all that good stuff. So, for those of you who aren't familiar with how complicated the EdTech ecosystem is:

It's obviously very complex, especially in the K-12 space. When we were talking with the higher ed folks, we said it's complicated in higher ed, but I think K-12 is more complicated, because K-12 is so resource constrained that teachers, instructors, principals, and administrators are looking for whatever tools they can get their hands on to provide the services they need, especially in our COVID world. Right? We want to provide services and all the different resources for our students as fast as we can, and they don't always have the resources to vet all those vendors, to look at those third parties, and to understand what data they process, what data they collect, whether they're running cookies. We're gonna talk about some FTC enforcement on folks that were collecting data they shouldn't have, in ways they shouldn't have been collecting it. So it's really complicated, especially for our K-12 users, because those are children under eighteen using these tools, right? And they have absolutely no control over what tools their schools use. A third grader can't say, "I don't trust this tool, so I don't wanna use it, Miss Smith. I would like to opt out of using this LMS." It would break the entire instructional cycle and the culture of the classroom.

So students just don't have that ability, and the privacy harms that K-12 students can experience are so much more heightened compared to an adult. Right? A middle-aged woman like me: if my stuff gets out there, whatever, I've got thirty more years of life, the harms aren't that bad. But if you are a seven-year-old and your Social Security number and all your information is stolen, you're gonna spend the next sixty years trying to keep your data safe. And that is part of why, in my heart as a privacy professional, the K-12 space is such a special, unique place.

I could talk about the other two topics for the next two hours, but I do not want to run out of time, because I want Kelly to tell you some really amazing stuff. So: activity in the states. Yeah. So we're gonna start with some hot topics. We're gonna do some quick hits, but as Daisy said, if there's anywhere you want us to dig a layer deeper, just raise your hand and ask questions.

We're giving out free legal advice today. So we're gonna start by talking about the states, because there is a lot of interest in the privacy of children and minors happening in the states. The first bullet up there is state student privacy laws. That's not an area where there's been a lot of recent legislative interest: the forty or so states that have student privacy laws have stayed fairly constant over the current legislative cycle. Instead, the focus of state legislators has been on waves of omnibus privacy laws

that in some cases address children and minors, plus state social media laws and some other things we're gonna talk about. So I think we wanted to start with the state comprehensive, or omnibus, privacy laws. By the end of this year, we will have five of them in this country. This started with California, with a law called the CCPA, the California Consumer Privacy Act, which was passed in 2018 and went into effect in 2020. It was amended, and the amended version went into effect January 1, along with a privacy law in Virginia.

And then as of July 1, Colorado and Connecticut have come into effect, and Utah will be at the end of the year. So at the end of this year, we will have five omnibus privacy laws in effect, but fourteen such laws have been passed in total. So look for another nine or so to come online over the next two-year cycle. Legislatures are quiet right now, but it wouldn't be surprising if we had another three or five of them passed in the fall when they come back. And this wave is gonna continue.

So we're really facing a state-by-state patchwork of privacy regulation. These laws are largely notice-and-consent laws. They require a privacy policy that has to make certain specific disclosures. They give individuals rights over their data: a right to request a copy, a right of access, deletion, correction, and the right to opt out of certain data sales or data disclosures for advertising purposes. In some states, like California, these rights are heightened when it comes to minors.

In California, that's kids under sixteen. So, in places where adults have the right to opt out of certain kinds of sales or disclosures, for minors it's a right to opt in, which means that those kinds of advertising use cases have to be turned off for any users who are known to be under sixteen. An important thing to note is that most nonprofits are mostly exempt from these laws, but not in their entirety. Notably, Colorado, where we're sitting right now, is the first of these laws that fully includes nonprofits. Oregon comes into effect in 2024 and will apply to nonprofits as of 2025.
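To make that opt-in-versus-opt-out point concrete, here is a minimal sketch of how a platform might default ad personalization off for known minors. This is an illustration only, assuming a hypothetical platform that records age and consent; it is not statutory language.

```python
# Sketch: opt-out for adults vs. opt-in for known minors (California-style).
# All names are hypothetical; real compliance logic is far more involved.
from dataclasses import dataclass

@dataclass
class User:
    age: int                # age the platform actually knows for this user
    ad_opt_in: bool = False # explicit opt-in on file (a parent's for under-13)

def ads_personalization_allowed(user: User) -> bool:
    """Known under-16 users: off unless someone opted in.
    Adults: on by default (their opt-out is honored elsewhere)."""
    if user.age < 16:
        return user.ad_opt_in
    return True

assert not ads_personalization_allowed(User(age=12))  # minor, no opt-in
assert ads_personalization_allowed(User(age=34))      # adult default
```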

So this is a place where, even in the space where most of you reside professionally, where you haven't been affected directly by omnibus privacy laws, the patchwork is gonna sort of encroach on your space. And, of course, your for-profit vendors and the third parties you work with will be directly subject to these laws, to the extent that they meet the thresholds. They do have thresholds: usually you have to be processing a certain data volume, like a hundred thousand state residents' data.

But in some states that threshold is lower, and California, always unique, includes HR data and business contact information, and also includes any entity that has revenue of over twenty-five million dollars, so it brings in a much wider swath of businesses. I think most importantly for how you all interact with vendors that are subject to these laws: they necessitate contract changes and amendments. So if you have seen waves of DPAs (data protection agreements, data processing agreements; that acronym gets used in a lot of different ways), that's in large part due to the requirements of these laws, where, to be a vendor or service provider, contracts have to say very specific things. And that wave, unfortunately, is not gonna abate anytime soon.

These laws are close enough to one another that they are relatively uniform, with state-specific nuances, but there's a lot of experimentation in some of the pending legislation, so that might not hold forever. So, an area where privacy is really heavily regulated right now. Also indicating the states' interest in the privacy of children and minors, there has been a smaller wave of social media laws.

State legislators have introduced more than a hundred bills in the past year that would govern social media use by minors. Four have been passed, in Utah, Texas, Arkansas, and Montana. Just to give you a flavor of this legislation, this is from Utah; it'll be in effect March 1 of 2024. Social media companies have to:

verify the age of a Utah resident seeking to maintain or open a social media account; get the consent of a parent or guardian for any user under age eighteen (so for those of you who think of the age of the internet as thirteen-plus because of the federal law COPPA, that's eroding in these state bills, and they're really looking at under-eighteen very broadly); give parents or guardians full access to their child's account; create default curfew settings that block overnight access to minor accounts; protect minor accounts from direct messaging; block minor accounts from search results; and, for minor accounts, they can't collect data, can't target them for advertising, and can't have certain quote-unquote addictive designs or features. How does this affect you, you might be wondering? Well, "social media platform" is defined as an online forum that makes it possible for an account holder to create a profile, upload posts, view the posts of other account holders, and interact with other account holders or users. So we all know it's gonna affect, of course, TikTok and Instagram, but any kind of online platform that has these kinds of interactive features is potentially subject to these types of social media bills. So, an area where, with a hundred bills proposed, it's probably likely that we're again gonna see a state patchwork of this kind of legislation slowly creep over the map.
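As a rough illustration of what "safe by default" for a known minor account could look like in practice, here is a hypothetical settings sketch. The field names and the curfew window are illustrative, not drawn from any real platform's API or from the statute's text.

```python
# Hypothetical "minor account" defaults in the spirit of the Utah law.
from dataclasses import dataclass
import datetime

@dataclass
class MinorAccountDefaults:
    parental_consent_verified: bool = False    # needed before the account opens
    curfew_start: datetime.time = datetime.time(22, 30)  # overnight block begins
    curfew_end: datetime.time = datetime.time(6, 30)     # overnight block ends
    direct_messaging_enabled: bool = False     # protected from DMs by default
    visible_in_search: bool = False            # hidden from search results
    targeted_ads_enabled: bool = False         # no ad targeting of minors
    engagement_features_enabled: bool = False  # no "addictive" design features

def in_curfew_window(cfg: MinorAccountDefaults, now: datetime.time) -> bool:
    """True when `now` falls inside the blocked overnight window
    (the window crosses midnight, hence the `or`)."""
    return now >= cfg.curfew_start or now < cfg.curfew_end

cfg = MinorAccountDefaults()
print(in_curfew_window(cfg, datetime.time(23, 15)))  # True: access blocked
```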

And this really is an extension of the trend of changing the age of a minor online from thirteen to eighteen. There's a lot of interest in expanding these protections to anyone under eighteen, especially with all the news there's been about the harms of social media and online usage. Something else to highlight coming out of the states is the California Age-Appropriate Design Code. This is another development along the same lines, out of this theme of promoting online safety for child users. It's a code that would dictate certain design features for properties that are targeted at minors.

It defines the kind of property that would fall in scope as one that's subject to COPPA or is routinely accessed by a significant number of children. So, again, a sort of undefined standard, but one that could bring in a large swath of properties that don't think of themselves as directed to children under a stricter standard. For example, your home pages might be visited by a large number of children in terms of empirical evidence. And there are a number of factors you would have to look at to make this determination, but it will look at the design elements and whether children constitute a significant quantity of the audience as determined by company research. At its heart, the California Age-Appropriate Design Code would require online sites to bake privacy by design into their products and features.

And they're really looking for a granular analysis. The analysis is supposed to be different if you're targeted at zero to five years of age, what they call "preliterate and early literacy," versus six to nine, the core primary school years; ten to twelve, the quote-unquote transition years; thirteen to fifteen, the early teens; and sixteen to seventeen, what they call "approaching adulthood." So it's not one-size-fits-all. It's really a sliding scale.

It would require disclosures; for example, the language has to be tailored to that age group. And you could imagine, for a preliterate age group, if you have to make a disclosure, what is that? Maybe it has to be spoken, right? Maybe it can't be read. So, a lot of creative thinking about what would work for different age groups.
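Here is a tiny sketch of what that sliding scale could look like as a lookup. The age bands are the ones named in the code; the disclosure formats on the right are just illustrative guesses at what "tailored" might mean.

```python
# Sketch: AADC age bands mapped to an illustrative disclosure format.
AGE_BANDS = [
    ((0, 5),   "preliterate / early literacy", "spoken or pictorial notice"),
    ((6, 9),   "core primary school years",    "very short, simple sentences"),
    ((10, 12), "transition years",             "plain-language summary"),
    ((13, 15), "early teens",                  "plain-language policy"),
    ((16, 17), "approaching adulthood",        "full policy in plain language"),
]

def disclosure_format(age: int) -> str:
    for (lo, hi), _label, fmt in AGE_BANDS:
        if lo <= age <= hi:
            return fmt
    return "standard adult privacy policy"

print(disclosure_format(4))   # "spoken or pictorial notice"
print(disclosure_format(11))  # "plain-language summary"
```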

It would require design features with a quote-unquote "high level of privacy"; that term is undefined in the Age-Appropriate Design Code. So it's a law that requires a lot of documentation and impact assessments: having to create records of online products, services, or features, how they use children's personal information, whether those features could harm children or permit children to witness or be subject to harmful content. Do they allow children to be exploited? Do they use algorithms that could potentially be harmful to children? Do they use targeted advertising that could be harmful to children? Do they use certain kinds of incentives that are intended to increase the child's engagement with the platform, those kinds of addictive features? And then there has to be mitigation built in: you have to look for those harms, decide how they're gonna be addressed, and create records, and those records would all be producible to the California Attorney General at that office's request. It's modeled on similar codes in the UK and Ireland that have been in effect for a couple of years.

The interesting thing: it was supposed to go into effect July 1 of 2024, but it's been subject to litigation in California, a suit over the Age-Appropriate Design Code, and there's actually a hearing on that tomorrow. So its legal status is currently in a little bit of jeopardy. Okay. I think that's the end of this.

Well, what's really interesting about the Age-Appropriate Design Code is, as Kelly said, the UK has something similar and Ireland has something similar, and this idea of protecting children at more discrete ages is becoming very common. So you're gonna see, if this goes through, because as they say, as California goes, so goes the nation when it comes to laws and privacy, because they are definitely our most aggressive, if that's the right word, enthusiastic privacy state, and we'll say country. I can make fun of California;

I grew up there. But you're gonna start seeing it in the tools that schools purchase. If your school is purchasing software tools, you might see something that looks very different: a kindergartner is gonna see a very different type of tool than even a third grader will see. And that's something some of our schools will have to adjust to, because you might want that same functionality for everybody, right? You might have some very specific type of functionality that the vendor has determined isn't appropriate under the age code. So I hate to say this, but it's getting even more complicated than it already is with respect to privacy: how do we provide services to children in a way that is safe, but that still allows (this is a very touchy subject, but I think it's very important) our schools to do the right kind of analytics they need to do, or interventions,

or the work they need to do with respect to keeping track of what their students are doing. So there's this fine line: schools want to do what's best for the students, but we also need to do what the law says. And so it gets, I hate to say it, complicated and even more complicated. And now Kelly's gonna talk about our friends at the Federal Trade Commission. Yeah.

So let's talk about what's happening right now in the federal government on children's privacy, and EdTech generally. The FTC is the nation's primary federal regulator for privacy. They act in two ways. They have powers under what's called Section 5 of the FTC Act to regulate unfair or deceptive practices, and they've long used this in privacy and data security cases. They bring enforcement actions, and companies can consent and enter into a twenty-year consent order.

They also are the enforcer of certain privacy laws, notably COPPA, the Children's Online Privacy Protection Act. Nonprofits do not fall within the FTC's power generally, so they can't enforce against schools, but they can enforce against your for-profit vendors. And EdTech has been a top enforcement priority of the FTC now for over a year.

So in 2022 they issued a policy statement around their enforcement priorities, declaring that the enforcement of COPPA against EdTech vendors was one of their number-one priorities. And they have continued to reinforce this with various public statements. What they said, specifically, is that in investigating potential COPPA violations by EdTech providers, the FTC intends to scrutinize compliance with the full breadth of the substantive prohibitions of COPPA. They were looking notably at a couple of areas. COPPA has a prohibition against mandatory data collection:

you can't condition a child's participation in a service on collecting more data than the absolute minimum needed to provide the service. That's an area of focus, rarely enforced under COPPA historically, but now a top priority. Then there are use prohibitions. COPPA limits, in the education context, how data can be used. It can only be used to provide education services; that's what allows the school to provide consent on behalf of the student population, so the vendor doesn't have to go out and get verifiable parental consent on a one-to-one basis.

The trade-off is that the data can only be used for education purposes, strictly; any purpose outside of that violates COPPA. COPPA also has data retention provisions. Yes? [Audience question, partially inaudible.] Sure.

Yes. So the baseline requirement of COPPA is that if you collect personal data of a child under age thirteen, you have to get verifiable parental consent. There are a couple of exceptions that aren't important here.

However, schools can stand in the shoes of the parent and provide a single consent on behalf of all student users, provided that the data is only used for educational purposes. I can even give you a quote from the enforcement policy statement: "In this context, EdTech companies are prohibited from using such information for any commercial purpose, including marketing, advertising, or other commercial purposes unrelated to the provision of the school-requested online service."

If that breaks in any way, then you can't rely on the school to provide that parental consent. There's also the notice piece: the vendor has to provide the parental notice to the school for this all to work. And we'll talk about that, because it comes up in one of the enforcement actions.
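One way to think about that education-only limit is as a policy-as-code gate. This is a minimal sketch under our own assumptions; the purpose names are hypothetical and do not come from the COPPA Rule itself.

```python
# Sketch: gate data uses when relying on school-provided COPPA consent.
ALLOWED_PURPOSES = {"instruction", "grading", "school_requested_service"}
PROHIBITED_PURPOSES = {"marketing", "advertising", "ad_profiling", "resale"}

def check_use(purpose: str, relies_on_school_consent: bool) -> None:
    """Raise if a proposed use of student data breaks the education-only
    condition that school consent depends on."""
    if not relies_on_school_consent:
        return  # a different consent basis would need its own analysis
    if purpose in PROHIBITED_PURPOSES or purpose not in ALLOWED_PURPOSES:
        raise PermissionError(
            f"'{purpose}' is not an educational purpose; school consent "
            "cannot cover it, so verifiable parental consent would be needed."
        )

check_use("grading", relies_on_school_consent=True)        # fine
# check_use("advertising", relies_on_school_consent=True)  # raises
```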

And then there are data security requirements under COPPA too. That's their fourth enforcement priority. I saw a hand over here. [Audience] Does that include feedback to, like, a third-party vendor through Canvas or something, that's using it for improving their own products? Daisy?

So I'd love to take that one. No, no, that's great. Here's a good example; we get asked that question all the time. And I'm standing up so I can actually see you. Oh, sorry, I'm supposed to repeat the question.

[Audience] Does this apply to feedback, or data collected for product improvement, that's used to improve their own product? Yes. That's a gray area, and I know I'm gonna get a little wishy-washy here: it depends on what product you're talking about and what that product is collecting.

Right? So here's a great example, and I'll use Canvas, because a lot of y'all are here because we have a lot of Canvas, among our other products, and Canvas is the one used the most by that age group. Canvas does not collect feedback from children. It doesn't ask; it doesn't send surveys to children. It doesn't say, "Hey, child,

do you like this screenshot? Do you like the color of this banner?" to improve our products. We do that with teachers and instructors all the time, because it's super important: it's gotta be usable for y'all, and if it's not, then how are we gonna make something that's appropriate? Canvas does collect something called usage data. That is: how many times somebody clicks on a screen, how long things take. So, for example, in Canvas for Elementary, is there a feature where we see users, not any individual user, but users in aggregate, taking two hours to upload an assignment, or taking a long time to run a video?

That's the kind of usage data we collect to improve our products. We do collect it for children, but none of it is personally identifiable. We're not collecting names. We have no Canvas user IDs. We're not collecting IP addresses.

We just know that over the course of two weeks, thirty-five users, maybe in this region, had a hard time uploading videos. And when we talk about how to talk to vendors and what questions to ask, that's one of the most important things from my perspective: ask them what kind of data they're collecting as people use the tool. What kind of metadata? What kind of usage data? When you say "usage data," what are you actually talking about? Right: if the data is de-identified, it's gonna fall outside of COPPA. There's also an express exemption in COPPA called "support for internal operations," where certain internal uses are not subject to the parental consent requirements.
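As an illustration of the de-identification idea (this is a sketch, not Instructure's actual telemetry pipeline), usage events can be recorded with no identifiers at all, only coarse buckets that support aggregate questions:

```python
# Sketch: de-identified usage telemetry, deliberately free of user IDs,
# names, and IP addresses, so no record can be tied to an individual.
import time
from collections import Counter

events: Counter = Counter()

def log_usage(feature: str, duration_s: float) -> None:
    """Record only the feature name, a coarse duration bucket,
    and a coarse week bucket. Nothing here identifies a user."""
    week = int(time.time() // (7 * 24 * 3600))        # which week, roughly
    bucket = "slow" if duration_s > 60 else "normal"  # coarse duration
    events[(feature, bucket, week)] += 1

log_usage("video_upload", duration_s=7200)
# Supports answers like "N users had slow video uploads this week"
# without any record pointing back to a particular student.
```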

Yes? [Audience] I teach both college and high school students. In terms of parental access to students' results and so on, if I'm teaching a college class, are the parents subject to high school rules or college rules? Well, we're talking here about COPPA, which applies to the collection of data from a child under thirteen. [Audience] Privacy rules in general. I know you can't be exact about my class, but in general, what does the government, big G, say? Yeah.

[Audience] Is a high school student going to college treated as a college student for privacy purposes? No. They still fall under it; it's an age thing. Mhmm.

It doesn't matter if they're in higher ed. We have a lot of high school students that take college courses. They're still protected under whatever laws apply to them, because, keep in mind, we've been talking about federal laws; there are, what, a hundred and forty-seven state student privacy laws in the US. So, yes, in general.

Yeah. So they fall under whatever applies to them at their age. Right? So if it's a sixteen-year-old, any law that applies to a sixteen-year-old applies. [Audience] Parental access to the student's records is much broader for a high school student than for a college student, where it's very restrictive; basically the student controls it. Oh, yeah. That's because of FERPA. Right. Yeah.

That's probably because of the FERPA rules. So when you become, what's the term? I always say "independent student," but that's not what it is, when you become, like, matriculated. No. No. No.

There's a word for it in FERPA, and I apologize, y'all, I can't think of it. It's very specific: from that point forward, the student owns the record, not the parent, and can control who has access to that record. Right? [Audience] When you say "from that point forward," from what age? It's from age eighteen and above. Thank you. Yeah.

[Audience] I have a follow-up question on that, though. We have students who are concurrent; they go to the local college. Is my school district responsible for ensuring that that's followed, or is the college? So I was just having this conversation at the end of the last session, because we have a lot of consortiums that cross-populate, with data going back and forth, and we're like, when does this school govern it? When does that school govern it? And this is the most lawyer answer you're ever gonna hear:

it depends. It totally depends on how that is set up, the way the structure is set up. Most students who are taking classes at the college level matriculate into the college, so they're dual-registered, dual students. That way, their data under the college is handled by the college, and their data under the high school is handled by the high school, and never the twain shall meet.

But we have a lot of systems now with, like I said, a lot of unique consortiums and other types of situations where that very clear line is blurred. And I hate to say it, but it really depends on how it's built out. [Audience] Just connecting with that college. Yep.

It just depends on the way that program has been built. Thank you, Donna. Sir? Okay. So, I think we were talking about the COPPA policy statement and the FTC's enforcement. That was May of 2022, and it's very consistent with their pattern:

they announce an enforcement priority, then they bring a bunch of non-public investigations. So it makes sense that this year, in 2023, we're starting to see the fruits of those investigations. And this year, there were two enforcement actions announced against EdTech vendors. The first, in January of this year, was against Chegg, which I understand is a scholarship search service. And it was premised on lax data security practices.

Specifically, the FTC, in the complaint and then the order that comes with it, cited lax data security practices such as storing personal data in plain text and using outdated and weak encryption standards around passwords. The collective harm of these lax practices led to four separate data breaches, which exposed the data of forty-million-plus users and employees of Chegg: data which included date of birth, sexual orientation, and disabilities, plus financial and medical information about their own employees. This is a very traditional Section 5 data security case. The FTC will find a statement like, "We use top-of-the-line, best-in-practice security," and then they say, "No, you didn't."

That's a deceptive statement, and then they're able to bind Chegg to a twenty-year order that imposes affirmative data security requirements. And then in May, they announced a second enforcement action, against Edmodo, for violations of COPPA. Edmodo offered an online platform and mobile app with virtual class spaces to host discussions, share materials, and provide online resources, via both free and subscription-based services. The FTC alleged that Edmodo failed to provide COPPA notices to teachers at sign-up. Part of the value proposition of allowing a school to provide that consent on behalf of the parents is that the notice to parents, which talks about the vendor's COPPA practices, has to be provided to the school.

In theory, the school then disseminates it to the parents through a handbook or something like that. That's how it's supposed to work under COPPA. Instead, they used a generic terms of service. The terms of service were wrong and didn't contain the requirements of the COPPA Rule. Also, Edmodo used personal information

it collected from students to serve targeted advertising, which is a non-educational commercial purpose; we talked about how that's not allowed. And they retained personal information indefinitely, rather than only for defined periods of time. The punishment: a six-million-dollar monetary penalty, suspended. The company is prohibited from conditioning a child's participation in an activity on data disclosures, and prohibited from using children's data for non-educational purposes.

They banned them from using schools as intermediaries in the parental consent process going forward. They have to impose data retention schedules. The company actually had to suspend US operations during the pendency of the investigation. If they meet the other requirements of the order, they can come back into the US market. I confess I don't know what the status of that company is now.

[Audience] Did they shut down? Yeah. Well, that happens sometimes when you get a six-million-dollar penalty from the federal government. So I think we can expect more enforcement coming out of the FTC around EdTech. It's an area that they've been interested in since COVID; there was a wave of sweeps in 2020 against EdTech vendors as everybody went online that year.

So, definitely an area to watch going forward. Okay. And then, just looking forward, two areas where we can look for legal reforms. First, FERPA reform. In May, the Biden administration announced certain actions to protect child mental health, safety, and privacy online, and one of the actions they announced is FERPA reform.

The Department of Education was expected to put out a proposed rule in April to amend FERPA. That didn't happen; I understand the new timing is now November. We'll see if that timing holds. We don't know exactly what they're gonna tackle within FERPA, but they've announced that they're gonna update and clarify FERPA by addressing outstanding policy issues and clarifying the definition of "education record," so that might answer some of the questions we've got in the room as to who owns what record at what time,

and clarifying provisions regarding disclosures to comply with judicial orders or subpoenas, which I know is always a fraught area if your school receives that kind of legal process. So FERPA reform is definitely on the horizon. No language has been kicked around yet, so we don't really know what's gonna be in there. And then COPPA reform, interestingly, is also a hot topic.

The federal government is also interested in the privacy of kids and minors. There are two bills pending in the Senate right now. One is the Kids Online Safety Act, similar to those state social media bills: it would set certain privacy-by-design and safe-by-default requirements for social media, and it would apply to minors.

So again, anyone under eighteen; it would be a federal law that largely follows those kinds of strictures. And then a bill called the Children and Teens' Online Privacy Protection Act would expand COPPA to anyone under eighteen, essentially. So, again, further erosion of thirteen as the age of majority online. Those are set for a vote in the Senate tomorrow, and President Biden actually called on the Senate, I think yesterday, to pass the bills out of committee. It looks like they will.

They both passed out of committee last year, but, you know, Congress in our hometown is paralyzed, and they don't pass much. Still, it's definitely an area where there's strong bipartisan support. So if federal privacy moves in any direction, it'll be FERPA reform, COPPA reform, or both. Okay. This is wonderful.

And one other topic we wanted to touch on, as a potential area of regulation and interest, is of course artificial intelligence. We'd be remiss if we didn't at least talk about it, specifically generative AI, or GAI. I'll note that there are other, more in-depth panels here at this conference that are going to be talking about artificial intelligence and education; you should absolutely go to those. We felt it was important to talk about it as a topic in education, particularly where it relates to personal information.

So, generative AI, which is things like ChatGPT, is AI that can generate content: text, images, that kind of thing. And it presents great use cases for education. There are lots of time-saving opportunities: you can do course design, document editing, data analysis, digital marketing, among all kinds of other tasks.

So it's great for all these kinds of labor-saving activities. And to the extent that you are using it, your school will likely procure its own tools and set its own rules. Kelly and I have been advising a lot recently on the use of GAI across a bunch of different organizations in a lot of sectors. As companies start to develop guardrails and rules around their employees' use of this technology, we're seeing a couple of common themes come to light, so I wanted to address those with the group.

The first thing is information input restrictions. We've seen organizations prohibiting employees from inputting any personal information, confidential information, or business or trade secrets into a GAI tool. These would be things like student information, or maybe proprietary materials at your school. The models behind GAI tools are typically controlled and operated by third parties, and information input into GAI tools may be used to further train the models and may inadvertently be disclosed as future outputs by the model. So you wanna be careful about what you're putting in, because you don't want it to become a potential output for somebody else.
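A minimal sketch of what an input restriction could look like in practice follows. Real deployments would use proper data-loss-prevention tooling; these patterns are illustrative only, and every name here is hypothetical.

```python
# Sketch: screen prompts for obvious personal data before they reach a
# third-party GAI tool. Regexes are illustrative, not production-grade.
import re

PII_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def screen_prompt(prompt: str) -> str:
    """Raise if the prompt appears to contain personal information;
    otherwise pass it through unchanged."""
    hits = [name for name, pat in PII_PATTERNS.items() if pat.search(prompt)]
    if hits:
        raise ValueError(f"Prompt blocked: possible {', '.join(hits)} found")
    return prompt

screen_prompt("Draft a rubric for a 4th-grade persuasive essay")  # passes
# screen_prompt("Email jane.doe@school.org her grade")            # blocked
```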

To that end, we've also seen a lot of restrictions around output evaluation. We recommend providing guidance to folks in your district and other employees on the use of GAI outputs, especially requiring users to review each output for accuracy, and requiring independent verification of information contained in the output. So, for example, if the model cites a particular study, you should make sure that that study actually exists. This is a big thing in the legal profession; we've seen some fake cases that people have started to cite. So, yeah, just check the work. And we also recommend prohibiting the use of GAI for tasks that require accuracy,

so that'll be things like legal documents or responses to regulatory material, that kind of thing. We've also seen some requirements around use restrictions. To simplify and promote compliance for employees, we generally recommend outlining permissible uses of GAI, letting employees know ahead of time, "Here's what you can use it for," so that there aren't questions on the back end.

We generally recommend prohibiting the use of GAI for decision-making, particularly decisions with legally significant impacts, things like hiring or admissions, especially because of the potential for bias in these underlying models. There are a lot of really interesting opportunities for GAI, especially in education, but there are risks that we should all be attentive to. And then I'm gonna turn it back to Daisy to talk about how you can evaluate your EdTech vendors. Well, I... oh, yeah.

Question? Sorry. Yeah, I think we have a lot of questions. Oh, yeah. No, no, we didn't, because that is the can of worms. No. No.

And I get it. So, on that topic, I'm sure you all have seen, I forget who did it, but there was the TikTok video of the Chrome plugin, a ChatGPT-based plugin, that allowed students to guess answers for quizzes in Canvas, for the quiz modules. And no one knew about it until somebody forwarded me the TikTok.

And I was like, are you kidding me? Students will be students, and I appreciate the enthusiasm of whoever wrote that, but we had to go to, basically, the Chrome store, the Google Play Store, and the Apple Store and say, you need to take this plugin down; this is something that is basically sitting on top of our app and allowing students to cheat. That is one hundred percent true; there are blog posts we've written about this, because it was such a big deal. This is my own personal opinion, not Instructure's opinion, but I think we're at a really unique time right now, because there's so much promise in generative AI tools, so much promise, but there's also so much risk of harm and so many challenges with it. And, I can't... so, I taught briefly.

I've seen papers written by old AI, which was terrible, but it's really good now. I can't tell you how many times I've played with it and been like, "Write me a cover letter," and out came a beautiful cover letter; "write me an essay on this," or "write me an essay on that." And I would just say: we got a question from one of our school clients that was like, "Oh, our third-grade teacher wants everybody to use ChatGPT. She has a really fun project."

"Is that fine?" It seems fine, yeah, but it's no different from a legal perspective: you still have to follow COPPA and that whole structure. And, I don't know if this has changed, but at that time ChatGPT did not have a COPPA process; they don't have that back-and-forth.

[Audience] Not the paid one? Yeah. Not the free one. Right. So you can't use the tools in the same way; you still have to dot the i's and cross the t's like with any other software platform.

Yeah. No problem. And I think the gentleman in the yellow shirt had a question first. [Audience] Yeah. I appreciate the framework on AI policy and recommendations.

Do you have any kind of template, or has someone done this particularly well, that you can point us towards? I don't know of any examples that are public. Yeah. Maybe? Well, here's a great example, and I can tout this because I literally just wrote Instructure's AI governance policy:

we built it off of a lot of recommended policy requirements. The Biden administration published their AI Bill of Rights; we based it on that. We took a lot of those and distilled them down to five or six elements, right: transparency, fairness, no bias, privacy and security. You know, no personal data will ever go into the AI tool, no student data will go into it, and we set that up in a way that fits for us. We even say we highly recommend that schools quickly review what they're gonna do with any AI tool we use inside of our products. We're just beginning to scratch the surface; there are some demos this weekend on it, which is super exciting. But we only use licensed software that also goes through our vendor risk management program, because we're a company and we have the tools: we can review the privacy, the security, the adjacent intellectual property, all the different things that go into an AI tool. Whereas schools that don't have a lot of resources can't do that.

And then there's the third-grade teacher who's like, "This is a really amazing tool; I'm just gonna have all my students log in to ChatGPT because it's free," right? Or they're gonna use Midjourney to do art. They're gonna do these things because it's really cool, and it is this neat learning experience. But unfortunately it's such a new area that there are so many risks we're still struggling with. So I would just say: look at some of those policy recommendations as they come out, and then you can build your own, and of course it'll improve as laws develop and as we figure out more. I think this gentleman had a question here.

[Audience] The reality is that you can't prosecute a computer program. Nope. [Partially inaudible.] The point is that the chains of logic we've handed over to AI would take a hundred years to trace. Oh, you can in some... Yeah.

[Audience] And if the AI decided to discriminate against a student, there won't be something clear-cut. Nope. There'll be a whole series of little pieces. So how do you deal with the fact that you can't go after a person? You can't kill a program.

You can prosecute a company. The FTC is actually investigating ChatGPT right now. Yeah, they issued a CID last week that was leaked to the Washington Post. So, yeah.

So we'll see. Yeah. Oh, yeah. No, it is a super important question. And this has been going on even since we started developing neural networks: you get to a point where you actually can't trace it. Now keep in mind, y'all, I'm not a computer programmer; I've just been computer-adjacent for a long time.

But we don't even know. There are a lot of complex machine learning models out there, and other AI models, where we actually don't know how they got to their output. You can't backtrack through it. You can't. [Audience] So you can't trust anybody.

Well, that's why there's always a human at the end of it. Absolutely. It spits it out, but then someone has to apply it to someone in the real world. And that, I guess, is where the liability would attach. Thank you.

Yeah. And that, before we run out of time, actually ties in to... oh, one more question. [Audience] Grammarly pushed GrammarlyGO out

the week of [inaudible], and everybody was having a cow. And so I am not sure what to do about that, because I can't separate regular Grammarly from GrammarlyGO. And I was just curious if you all have put anything down on that, or if they just added something to the extension. Like, I don't remember getting a notification as the administrator.

It just showed up that we could enable it; that's what I mean. I don't know. I don't know about Grammarly specifically, so we can figure it out. Yeah.

Yeah. Definitely. Definitely. Because, so what's funny is, I've never used Grammarly, but I know a lot of my company loves to use it. I wish I had it when I took the bar.

That would have been super helpful, but they disable all that. Huge. It is. It's a model. Yeah.

Yeah. I would definitely look into that. And definitely have somebody... oh, I would say a hundred percent, especially if it hasn't been vetted by your IT team or any of your... No. No. So that's why, even though I am an EdTech vendor, I tell everybody: I've worked on the outside, I've been a privacy advocate for about twenty years now, and what we tell everybody is, for every single piece of technology, even if you don't have the resources to do the really intense deep dives,

there are certain high-level things that any administrator, any IT director can do to quickly appraise a vendor. Extensions are weird. Yeah, and extensions are really hard, because they just get plugged in, right? So there's no obvious place to look: where is their privacy notice? Well, you might be able to find it in the Google Play Store, and then, oh, what is that extension collecting from its primary website or tool?

That's the other challenge. But for the big tools that you use, the primary ones, even the free ones, I tell everybody: please read the privacy policy. When you're getting a free tool, don't just download it. Because as much as I am all in support of scrappy young software companies that are building amazing tools, having worked for scrappy young software companies, privacy and security is usually the thing that comes once they start maturing, not when they're first being built, because unfortunately compliance is resource intensive. So I tell everybody: read the privacy policy. Do they even have a privacy policy? I vet vendors all the time for my company, and probably twenty percent of the time there is not even a privacy notice that I can get to and read.

That tells me, right off the bat, that's a no. Nope. They don't get to move forward.

We are not moving forward; they don't even have a policy. Next: do they disclose who they share their data with? That's what I mean by third-party processors. Do they sign data processing agreements? Do they have one online you can look at? Those are the contractual promises they're making to you. Now, for most of you who are K-12 here in the US, you're gonna be signing the student data privacy agreements;

you're gonna have state-specific agreements that all these vendors have to sign, so you're not really gonna use their online contract, because that's just the way the school system works. Do they have a verifiable privacy program? Any type of audit or certification? There are a bunch of them out there. Most vendors have security audits, but not always privacy audits.

And then, a key one for K-12: do they restrict advertising to the students? Unless you're, I don't know, unless you're Facebook, and I don't even think Facebook should be doing it, the tools you buy to help learning, to help students, should not also be taking that data to serve advertising to them. A great example: our community site. For those of you who are familiar, we have the Canvas Community. We don't have anyone under thirteen using it, but we do have people between thirteen and eighteen using it; lots of students use it for resources. We had an outside marketing company that wanted us to put Facebook cookies and Instagram cookies on it to collect data, so they could show us all the cool advertising they could do.

And we were like, absolutely not. This is a user site. This is not an advertising site. This is not so we can generate revenue; this is so we can help our users use our product. No. And they were like, oh, okay.

But they, as our advertising agency, really wanted us to do this, because that's how they generate their own revenue. So you see how that downstream influence happens with your vendors. And then, of course, there's security. What I've found is that security is easier to prove, right? Because we have a lot of really good third-party audits out there: you've got the ISO certification; the SOC 2 Type II, for those of you who are familiar; the HECVAT, which is mostly used in higher ed, but I have a lot of K-12 schools that use the HECVAT model too, and it's from EDUCAUSE; that's basically the big checklist. There are also the SIG controls. There are tons of different security questionnaires, and a vendor should be able to hand you something that explains what all their controls are.

Right? If they don't have a third-party audit, they should have something they can share with you. And as an institution, you should also have the right to audit them. Software companies love to push back on that, because they're terrified someone's gonna find something they haven't found themselves, but because of the environment that our institutions work in, you should always be able to audit that third-party vendor if they are processing student data, full stop. And if they don't let you do that, you should probably step back. This is Daisy's opinion, not Instructure's opinion. At Instructure, we let everybody audit us; because of the vast amounts of data that we process, we are super transparent as part of our program.
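Pulling the questions from this discussion together (plus the data-management point that comes next), here is a rough vendor-vetting scorecard sketch. The fields, the hard-fail rules, and the scoring are all illustrative, not a formal standard.

```python
# Sketch: a lightweight EdTech vendor privacy scorecard.
from dataclasses import dataclass

@dataclass
class VendorCheck:
    has_privacy_policy: bool             # can you even find and read one?
    discloses_third_parties: bool        # who do they share data with?
    signs_dpa: bool                      # data processing agreement available?
    has_third_party_audit: bool          # e.g., SOC 2 Type II, ISO 27001
    allows_customer_audit: bool          # will they let your institution audit?
    restricts_student_advertising: bool  # no ads served off student data
    deletes_data_on_request: bool        # including from backups

    def hard_fail(self) -> bool:
        # Per the talk: no privacy policy, or ads to students, ends the review.
        return (not self.has_privacy_policy
                or not self.restricts_student_advertising)

    def score(self) -> int:
        return sum(vars(self).values())  # booleans count as 0 or 1

v = VendorCheck(True, True, True, False, True, True, False)
print("reject" if v.hard_fail() else "proceed", f"{v.score()}/7")
```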

And then this one is probably the biggest one that I tell everybody who buys software to look at: data management practices, because it's the place where most companies have the least amount of maturity, unless they're, like, Adobe. Do they delete the data? Do they actually delete it? Are they deleting it off their backups? Is all of the data encrypted? Kelly was talking about Edmodo, I think it was, or was it Chegg, where they were storing data unencrypted. Now, I'm gonna get a little bit technical here, but if you don't have a security person you can talk to about it, talk to some friends; there are a lot of great resources out on the internet. PTAC, the Privacy Technical Assistance Center at the US Department of Education, has a ton of information on different types of encryption. Because vendors will tell you your data is encrypted. Well, is that database encryption? Is it object encryption? What are you actually encrypting? And does it provide the actual protection we need for the data, or could it still be accessed by a bad actor? Like I said, this is just high level. For those of you in the space who are members of what was formerly IMS, now 1EdTech, they have a great privacy and security vetting tool, TrustEd Apps. I'm super passionate about it because I actually helped them develop some of the rubrics, and those rubrics are a great place to start.

So if you're a member... what was it called? I'm gonna say their name wrong because they just changed it. Yeah. 1EdTech. Yeah. 1EdTech.

So what they do is they have what they call the TrustEd (trust, capital E-D) Apps program. Let me see if I've got it in the resource list. I don't have it on here; I'm so sorry, y'all. But right now they have a privacy seal. It's a great place to start, because you can download the rubric and see what these third-party vendors have actually attested to.

And 1EdTech actually reviews everything. They're working on a security rubric as well. So if you don't have the resources to go review a SOC 2, you know, an eighty-five-page SOC 2 Type II, because it's incredibly technical, there are some places like this that are a great starting point. Speaking of resources, because we're about out of time:

take a picture of this, please. This is Venable's resource site. They have an amazing amount of white papers, documents, all sorts of great stuff.

As someone who worked for small companies when I was first starting out in my career, I didn't have access to really cool law firms, but because really cool law firms like Venable did stuff like this, I was able to teach myself a lot of really interesting things. So there are a lot of really great resources out there if you wanna nerd out and deep dive into this stuff; it's all super helpful. I've lost my train of thought there, because it's the end of the day, y'all. And then, if you didn't already know, these are all the Instructure privacy resources. We have a privacy FAQ page on our main site.

We also have a privacy hub on the community site. So if you need information about any of this, please feel free to email me: either privacy@instructure.com or daisy.bennett@instructure.com. Any questions, reach out; I'm here for all y'all. And there's a lot here.

These are long websites, and I believe you all are gonna get access to these slides as well. So snap a picture, and if you don't get them, email me and I will send these to you; I will send the deck to everybody that wants it. These are what I call general privacy resources, right? We were talking about what to do if you don't know the law and you don't really have a lawyer to go talk to. There are a number of great resources here, like the Future of Privacy Forum, which has tons of white papers and opinion papers. They're a nonpartisan think tank.

They do lots of stuff in the privacy space, especially around education and privacy. Obviously, the US Department of Education has a lot. And then this one's really good. I have a pointer... I forgot. Where's the pointer? I won't be able to find it this fast.

Right there. This is a list that keeps being updated of all the student privacy laws in the US states, all the state privacy laws. It gives you an idea of scope. You can look up your own state and think about other states if you're interested, but it gives you an idea of the breadth of laws. And then, of course, there are some other trackers here that are really good. And I think that's it.