Privacy & EdTech for HIED - 2023 Updates
This session explores the unique privacy challenges for higher education institutions in the US and globally. Join us for an interactive workshop session to learn about the impact of multi-jurisdictional privacy laws, best practices for vetting education technology, and how recent changes in privacy laws affect HIED student privacy.
Welcome, everybody. Thanks so much for dealing with our room-shifting excitement. That's always how it is at conferences, but I love that we have more space here. We're also all thrilled that you're here for what some folks think is a not-very-exciting topic, but we love it and are super passionate about it. We're gonna talk about privacy updates. We're doing our higher ed session.
Now, we're also doing the K twelve session at five, so if you're interested in updates related to K twelve, please come to that session as well. For those of you who don't know me, I'm Daisy Bennett. I'm Associate General Counsel and Data Protection Officer here at Instructure. And for the last couple of virtual InstructureCons, I've had the opportunity to do some really fun presentations about privacy.
This is even better, because now we're in person, and now we get to do a really interesting session where we can actually talk to each other, ask questions, be interactive, and actually have some communication going on. But before we dive in, I would love to introduce you to our lovely panelists. Kelly, do you wanna start first? Sure. Hi. My name is Kelly Bastide.
I am the co-chair of the privacy and data security group at Venable. We are a national law firm, but our privacy group is based in Washington, DC, and we are a long-time outside counsel to Instructure on privacy and data security. Hi folks. I'm Dana Homstrand. I'm an associate in Venable's privacy practice, and we are thrilled to welcome you to what we think is your first breakout session of InstructureCon twenty twenty three. Thank you.
Dana, she is on it. She got us into this room. She's got us all organized. I'm running around like a chicken with my head cut off. Thank goodness.
She's here. But now we are super excited to have them here. I joined Instructure over three years ago, and I've been working with them ever since; our previous privacy lawyer and data protection officer worked with them too. So they have deep, deep, deep knowledge in education and privacy, and that's why we're so excited that they're here.
To talk to us, answer questions, and hopefully have a really great session. And I'm gonna try not to breathe super hard into this. I am not used to microphones, y'all, so please bear with me. These are the topics we're gonna touch on today. We'll talk a little bit about why privacy is unique in the higher ed ecosystem.
We're gonna hit some hot topics and do a little of what they call peeking around the corner. Kelly and Dana are gonna give us some great prognostication on what we might be looking at in the future. I'm also gonna talk very briefly about best practices for vetting ed tech vendors. And then we're gonna go over some resources, because we believe in giving people lots of tools and lots of opportunity to go research and find out more information.
As I mentioned before, we really want the session to be interactive. So if you have a question while we're talking about a topic, please raise your hand. We're gonna try to save time for questions at the end. We wanna give everyone the opportunity to jump in. We want this to be conversational.
There is a lot going on in the privacy world, y'all. I can't even begin to describe how amazing, complicated, and crazy it's been for the past couple of years. I joined the privacy universe, and I'm gonna date myself right now, in two thousand and three, when being a privacy professional really wasn't a thing, and really all we cared about here in the US was HIPAA, health information, and the HITECH Act; we didn't even think about education privacy in the way that we do now. And as technology has grown, as we've developed these amazing tools that we use in our schools, our ecosystems have gotten so incredibly complex.
That even as a privacy attorney, I have a hard time sifting through the data flows and understanding what data goes to what tool. When a student logs into their LMS and their administrator has also connected twenty two other LTI tools, well, what data are those LTI tools getting from the LMS? Most of our Canvas administrators can't tell. Most of our IT people can't tell. The system is so incredibly complex right now because of the explosion of technology, which of course COVID brought on as well.
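As a very rough starting point for that kind of audit, an administrator could at least enumerate which LTI tools are installed and what level of user data each one is configured to receive. The sketch below assumes the Canvas REST API's external-tools listing endpoint and its `privacy_level` field, and a hypothetical API token; verify both against your Canvas version's API documentation before relying on this.

```python
import json
import urllib.request

def fetch_external_tools(base_url, account_id, token):
    """Fetch the external (LTI) tools installed in a Canvas account.

    Assumes the documented endpoint
    GET /api/v1/accounts/:account_id/external_tools; check your instance.
    """
    req = urllib.request.Request(
        f"{base_url}/api/v1/accounts/{account_id}/external_tools?per_page=100",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def summarize_privacy_levels(tools):
    """Group tool names by the user data they are configured to receive.

    Canvas tool records typically carry a `privacy_level` such as
    "anonymous", "name_only", or "public"; tools without one are flagged.
    """
    summary = {}
    for tool in tools:
        level = tool.get("privacy_level", "unknown")
        summary.setdefault(level, []).append(tool.get("name", "unnamed"))
    return summary
```

A summary like this won't tell you everything a tool does with the data, but it's a cheap first pass for spotting tools that receive full identity data when an anonymous launch would do.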
And as I said, there's very little transparency. Many of our schools, and I don't see this as much in higher ed, but we do see it a lot in K twelve, don't have resources for oversight. Right? You don't have a dedicated team to vet all your vendors. You've got teachers that are super excited, and they say, I have this amazing tool, it's free, load it on your Chromebooks, kids, we're gonna go ahead and use this tutoring tool, we're gonna use this cool AI tool, we're gonna use all these neat fun things, without really understanding what data they collect, what they do. Do they even comply with FERPA? Do they comply with the GDPR? Do they need to comply with the GDPR? Do you need to even think about those things? Unfortunately, because public education is so underfunded, it's really hard to get these types of tools actually reviewed in a way that is meaningful. And of course, we have issues of data equity. I always bring this up in my privacy presentations, because students don't get a choice about what tools their schools use.
They have to use what their schools tell them. Oh, we're gonna talk all about it. That is the General Data Protection Regulation. For those of you who operate internationally, whose schools operate internationally, or who deal with students internationally, that is the privacy law of Europe. No problem. Like I said, feel free to ask questions.
So to go back to students, right? Students don't have a lot of choices. Institutions sometimes don't have a choice either as to what type of tools their schools are using and what data is being collected by those tools. So this is a really, really sensitive issue, especially with AI, and as privacy laws rush to try to catch up with the technology, which, as most of us know, they never can: technology moves so fast that the law just can't keep up, but it's doing its best. And then, of course, we've got issues of data sovereignty, and we're gonna talk a lot about data transfers and localization in the context of international laws. Kelly, go ahead.
Let's hear about hot topics. Hot topics. So we really did intend this to be kind of a "what we're looking at in our world" hot topics session. Each of these is probably worthy of its own forty five minute session, so we're gonna keep it surface level, but as Daisy said, raise your hand if you want us to go a level deeper; just let us know. So we're actually gonna start in an area that is privacy adjacent, at the Department of Education, around something called the Dear Colleague Letter, or the TPS, third party servicer, letter that was released in February.
For those of you who followed this development, it is a winding saga. This was released by the Department of Education in the form of an open letter in February that addressed third party servicers. It would have implemented a new compliance regime that had to be in place as of September one. It was a significant expansion over prior guidance, and it really did remain unclear as to the scope of third parties to which the guidance was supposed to apply.
And if an entity fell under the guidance, it would be regulated under the Higher Education Act. So the TPS letter set forth a highly fact-specific test. It would turn on the specific services that a third party provided to an institution of higher education. It attempted to clarify the types of activities that made an entity providing services to an institution a TPS, and thus regulated under the law. The guidance would have revoked prior third party servicer guidance that the industry has been relying on forever.
And it would have required specific service terms in contracts. So folks were really ramping up for a very quick contract change and rollout of a new compliance regime at the institution level. The letter was open for comment through March seventeenth, and they received over a thousand comments. And in April, they released a blog post saying that they had changed their mind: they had received so many comments that they realized the guidance had been overbroad and unclear, and they were looking to issue further guidance at a future time.
They did, in this blog post, identify several activities that they said did not trigger TPS rules. Those included study abroad programs, recruitment of foreign students, clinical or externship opportunities, course-sharing consortia, dual or concurrent enrollment programs, and police departments helping to compile crime statistics. The fact that they had to carve out all of these activities just shows how expansive the guidance potentially was in the first place. They're looking to issue revised guidance that would identify which specific services fall under the category as they continue to review the thousand-plus comments they received, but we do not have a firm timeline on that now. So, continuing on the theme of their interest in third party servicers: in, I think it was April, they also announced that they were gonna do a notice of proposed rulemaking. So this would be a formal rule.
Under the Administrative Procedure Act, that would again codify a compliance regime for third party servicers. Again, no timeline, no text; they are presumably in their internal process to work on the rule, but it's something that shows how much interest they have in this topic. It's definitely a place to keep your eye on the ball, because they're gonna release something at some point, perhaps with a quick timeline as they had originally envisioned. Okay. So the next topic is also privacy adjacent, and then we're really gonna get into pure privacy.
Just yesterday, and this is in the hot topic, breaking news category, the Department of Justice issued a press release saying that they sent a rule to the Federal Register for a notice of proposed rulemaking under the Americans with Disabilities Act that would aim to improve web and mobile application access for people with disabilities and would clarify how public entities can meet existing ADA obligations in terms of online activities. They sent it over yesterday. It'll be published in the Federal Register shortly, and then it'll be open to a sixty day comment period. That text has also not been released publicly, but it's something to pay attention to if you are involved in website design or web properties.
Okay. So that was the privacy adjacent part; now to the part that is near and dear to my heart, what's happening in privacy these days. And we're gonna move to the states, because that's really where a lot of the activity is happening here. The privacy of minors is a very hot topic, top of mind for the states. It has not been a big year for state student privacy legislation; that's an area where we're really in equilibrium with the laws as they exist now.
Instead, we're seeing a lot of laws that tackle the online activity of minors, and general omnibus privacy laws. So let's talk about omnibus privacy laws. As of the end of this year, five states will have omnibus privacy laws: laws that cover all forms of data collection from any identified or identifiable person. California and Virginia as of January one (California's law has been in effect since twenty twenty, but was amended at the first of the year), Colorado and Connecticut as of July one, and Utah will come into effect at the end of the year.
So we'll have five omnibus privacy laws at the end of this year. There have been fourteen passed in total, so there's another nine or so that will come into effect over the next two years. These laws largely exempt nonprofit organizations, but not all of them do, most notably Colorado, which is in effect now. It's actually the state we're sitting in.
Their law applies fully to nonprofits. So your institutions may have to comply with Colorado if you meet the thresholds under the law. And then Oregon has provisions that will apply to nonprofits as of twenty twenty five; the law itself comes into effect a year earlier. These are largely notice-and-choice laws. So they require provisions in privacy policies and an expansion of rights for individuals.
These include rights of access, correction, deletion, and the ability to opt out of certain data, quote unquote, sales and disclosures. They're the reason why, when you go to your favorite e-commerce site, you might see a link at the bottom that says do not sell my personal information. That's from the California law. It's the kind of compliance regime that's now spreading across the nation in a patchwork fashion. Even though your institutions are largely nonprofit entities, your vendors, of course, are not.
They are for-profit entities who may have to comply with these laws if they meet the specified thresholds. I think where you'll see the direct impact of that is in contracts, because all of these laws require things to be placed in vendor contracts and agreements. It's the reason why you may be drowning in DPAs, why you keep getting a new wave of contracts over and over again. It's because every time there's a new wave of these laws, those contracts have to be looked at again, and we really don't anticipate an abatement in that cadence anytime soon.
Legislative sessions in the states are largely in recess right now. But they'll be back in the fall, and it wouldn't be surprising if another three to five laws passed this year, and another, you know, ten or so next year. So we really are reaching a place where this patchwork is expanding and creeping. The laws are similar enough now that uniform compliance is achievable, but it only takes one, really, to kind of upset that apple cart and create a strained patchwork across the country. Okay.
Just to highlight a couple of other developments in the states when it comes to minors: there's been a smaller wave of social media laws. State legislators have introduced more than a hundred bills in the past year, and we have social media laws in Utah, Texas, Arkansas, and Montana. Just to give you an example of what these laws cover: in Utah, starting in March of twenty twenty four, social media companies must verify the age of Utah adults seeking to maintain or open social media accounts, and get the consent of parents or guardians for users under eighteen. So I know we've long considered the age of majority on the internet to be thirteen-plus because of federal law.
It's now slowly turning to eighteen online. The law would also allow parents or guardians full access to children's accounts, create default curfew settings that block overnight access to minor accounts, protect minor accounts from unapproved direct messaging, and block minor accounts from search results. In addition, companies cannot collect minors' data, target them for advertising, or use certain addictive social designs or features. You might say, well, we don't operate a social media network; why is this interesting? The definition of a social media platform in the Utah law is any online forum that's made available to create a profile, upload posts, view posts of other account holders, and interact with other account holders or users.
So it's gonna sweep far more broadly across a lot of sites that have interactive features. I think we're gonna just see a slow sort of creep here: look for a lot of age gates on online properties. And this is an area where the law continues to slowly move forward. Yes? In the state laws? Yeah, so it's a good question. The thresholds in the state laws, again, vary by state.
Typically, you have to be processing the personal data of a fixed number of individuals. It's a hundred thousand in most states. In some states it's less; it's like fifty thousand in a small number of states, like Montana, that don't have that many people. But, yeah.
And then, or, you derive a certain percentage of your revenue from data sales. California is unique. Anytime people talk about the states, lots of people are gonna say California is unique, because it also captures any business that has twenty five million dollars in revenue annually, full stop. So you can hit that revenue threshold and process the data of, like, three California residents, and in theory the law would apply.
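The applicability logic Kelly is describing is essentially "meet any one prong and you're in scope." Here is a toy sketch using the California figures as stated above (twenty five million in revenue, a hundred thousand consumers, or half of revenue from data sales); the exact numbers and definitions should be confirmed against the current statutory text, and other states use different thresholds entirely.

```python
def california_law_applies(annual_revenue: float,
                           num_ca_consumers: int,
                           sales_revenue_share: float) -> bool:
    """Toy threshold test: meeting ANY one prong pulls a business into scope.

    Figures follow the talk's description of California; they are
    illustrative, not legal advice.
    """
    return (
        annual_revenue > 25_000_000       # revenue prong, full stop
        or num_ca_consumers >= 100_000    # volume-of-data prong
        or sales_revenue_share >= 0.5     # revenue-from-"sales" prong
    )
```

So, per the example in the talk, a thirty-million-dollar business that processes the data of only three California residents would still be in scope under the revenue prong alone.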
Sure. Nothing else? Okay. The other thing to highlight is the California Age-Appropriate Design Code. So this is another development along the same lines.
It follows similar codes in the UK and in Ireland. What it requires are certain safeguards and a certain assessment of features for any properties that are directed to children. And again, this would be anyone under age eighteen, so not under age thirteen. It would look at different factors to determine if you're a property that falls under the code. Like, do you have advertisements that are marketed to children? Do you know that a significant number of children visit your site based on empirical audience composition? Again, something to think about, potentially, if you have sites that are targeted at recruiting high school students.
Right? That's the kind of site that potentially has, by empirical numbers, a significant number of minors visiting it. At a high level, what the code would require is for you all to bake privacy by design into the development of that web property. And it's supposed to be categorized by age. So, you know, zero to five is one level of privacy by design. Thirteen to fifteen is another.
Sixteen to seventeen is what they call approaching adulthood, so yet another. So it's really supposed to be a granular examination of the features of a web property that is targeted to children. And you have to design products with "a high level of privacy." That's a quote. If you ask me what that means, it is undefined in California. It will require certain assessments around the purpose of the online product, service, or feature: written assessments about how they use children's personal information, and a pretty long laundry list of things like how it could harm children, how it could lead children to experience harmful content, etcetera, etcetera.
And it's a lot of formal documentation that would be required around those features. It's been subject to litigation: the state of California was sued in connection with the promulgation of the Age-Appropriate Design Code. I checked today, and the hearing on that is actually tomorrow. So it remains to be seen whether it does go into effect, but its effective date is about a year away, July first, twenty twenty four.
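Stepping back, the age-band framing described above amounts to a simple lookup. The bands named in the talk (zero to five, thirteen to fifteen, and sixteen to seventeen "approaching adulthood") match the UK Children's Code; the intermediate bands and their labels below are taken from that UK code and are an assumption for California until final guidance lands.

```python
def aadc_age_band(age: int) -> str:
    """Map an age to an age-appropriate-design band.

    Band boundaries and labels follow the UK Children's Code and are
    assumed, not confirmed, for the California code.
    """
    if age <= 5:
        return "0-5: pre-literacy and early literacy"
    if age <= 9:
        return "6-9: core primary school years"
    if age <= 12:
        return "10-12: transition years"
    if age <= 15:
        return "13-15: early teens"
    if age <= 17:
        return "16-17: approaching adulthood"
    return "18+: adult (outside the code)"
```

The point of the banding is that "a high level of privacy" is supposed to mean something different for each band, so a product's default settings and assessments would be keyed off a lookup like this rather than a single under-eighteen switch.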
So it's another place to really look for ongoing developments. Okay. States. You wanna do the FTC? Is there another slide? Yeah, we're gonna move forward. I do wanna say something about the Age-Appropriate Design Code and some of these types of codes that are coming out, from our perspective as a third party service provider to schools.
You design products and tools not really realizing how they can be used. So here's a great example. We have a product called Portfolium. The whole purpose of it is so students can build a lifelong portfolio that they can start in childhood and carry all the way through college, even into adulthood and into their workplaces. And it gets spammed constantly by people building bots to sell porn.
So we have an entire team that does nothing but go through those accounts. In one audit we did, I think I pulled up forty eight thousand of them, and that was just in a three month period. They were literally like, go buy this film, click here. It's heartbreaking, but when you're developing this tool that you think is gonna be amazing for students, you don't realize how many bad actors there are out there. And that's what these codes are meant to try to capture, right? Because if you have somebody, you know, promoting porn on a free portfolio site, and then you have five year olds using it, they can get to those pages easily or friend those accounts. And this is not social media. These are education products, right? It can do a lot of harm to kids, and that's what these codes are all about. So I just had to give you guys a little context there. It's very important for the law, but it is in practice a real thing that we face as a third party service provider, because we wanna protect our students and our schools that are using these products, so they can actually use them in a way that's meaningful and not wind up, you know, getting a lot of bad content or even just spam content, right? It doesn't have to necessarily be intentionally harmful. It's, you know, free downloaded movies that somebody recorded on their iPhone.
It's just all sorts of junk out there. So, Kelly. We're talking about international hot topics now. Dana's gonna talk to us about international hot topics. So this is where we're gonna talk about the GDPR and all the international laws. And this is what's unique; I'm gonna jump in real quick with why we're talking about this.
So whereas our K-twelve students and schools are just operating here in the US, our higher ed institutions, because of COVID and, you know, the explosion of online education, are offering classes worldwide. And so our higher ed institutions, and by virtue of providing services to them, their vendors, have to comply with laws not just like FERPA, not COPPA, not just all the US laws; now we have to look at all the laws around the world, and of course the big privacy hammer in the world, I don't know what else you'd call it, is the GDPR. And so Dana's gonna tell us a little bit about it. Yes. So, like Daisy explained, the General Data Protection Regulation, or, as we're gonna call it today, the GDPR, is probably the first time we really started thinking about consumer privacy at scale in the digital age.
So this comes out in twenty eighteen, and why should you, as a US institution, potentially care about the GDPR? Well, it has an extraterritoriality provision, meaning that even if you're not located in the EU, if you're targeting services to individuals who are in the EU, you might potentially fall under provisions of the GDPR. So compliance is something that should potentially be top of mind, especially if you have, maybe, exchange programs or students that are coming from other countries. I just wanted to level-set a little bit on some of the terms we're gonna be using, particularly where we're talking about EU privacy, which are controllers and processors. This is obviously not new for some of you who have been following privacy for the last five or ten years, but when we're talking about controllers and processors, it's important to consider what your relationship is to vendors or service providers, or even other institutions that you might be working with, because whether you're a controller or a processor will define what your rights and responsibilities are to the data subjects for whom you're collecting personal information.
So what's a controller? This is the party who decides key elements of the data processing. They decide on the manner of processing: are we collecting information? Are we doing data analysis? Is this storage? That kind of thing. And your processor is the party that actually processes the personal data on behalf of the controller. So you can think of this as your vendors or service providers. Right? They're only storing information because you ask them to do that.
They're only doing data analysis because you ask them to do that. So it is possible that you might be a controller in some instances and a processor in others, and it is possible for a data transfer to involve two controllers or two processors. It's kind of a fact-specific inquiry. An instance where you might be a controller sharing information with another controller is, for example, if you're renting out a prospect list or sharing, like, research subjects with another institution. So again, why does it matter what role you have in the processing relationship? It defines your rights and responsibilities.
So if you're the controller, you get to decide what happens with the data, right, within the limits of the law. Your vendors who are acting as processors do not have the right to use the data however they like, but they also have no independent obligations to the data subjects: they're honoring rights requests because you have to honor rights requests. Another reason it's important to understand what your role is, is liability shifting in contracts: who is responsible for what actions at any given time with respect to personal data. Finally, it helps to outline the flow of data: who is sending data to whom? So if you understand you're a controller, you know, okay, I'm the one who's sharing information with these other parties.
So we often see these processing roles come up, and where there have been some really interesting developments, in the realm of data transfers, particularly cross-border data transfers, and we work a lot with Daisy on these. One of the GDPR's fundamental principles is data localization. So data physically has to stay in the EU, or if it leaves the EU, it has to be subject to certain safeguards. Transfers are permitted without additional safeguards, which we'll talk about in a moment, if a country outside the EEA receives an adequacy decision from the European Commission. This is basically a seal of approval that says, yes, this country has adequate protections for personal data.
So for transfers to non-adequate jurisdictions, which until very recently included the United States, companies must implement appropriate safeguards before transferring data. These are things like the standard contractual clauses: when you've executed data processing addenda or other agreements with your vendors, you might have seen the standard contractual clauses, or SCCs, referenced. These are like template DPA terms. We also might see binding corporate rules. You could get consent from data subjects, but that sounds onerous.
And prior to Schrems II, we saw the EU-US Privacy Shield. So what's on the horizon? The EU-US Data Privacy Framework. What is that? Since the nineteen ninety five Data Protection Directive, there have been various agreements between the EU and US that allowed transfers to the US without the additional safeguards I discussed, as long as the companies receiving the data in the US subscribed to certain principles. You guys might remember the EU-US Safe Harbor, and then the EU-US Privacy Shield, which were struck down in two decisions, Schrems I and Schrems II. So we aren't giving up yet. In striking down the EU-US Privacy Shield in Schrems II, the EU Court of Justice cited concerns about the US government's ability to access EU personal information when it was located in the US, and about data subjects' rights to an effective remedy: two things that are fairly important parts of the GDPR. So in March twenty twenty two, the EU and US reached an agreement in principle to replace the Privacy Shield.
So our government and the EU government kind of worked together to figure out what could be a reasonable alternative to the Privacy Shield. And then in October twenty twenty two, President Biden released an executive order implementing the terms of that agreement. What does this executive order say? It establishes a Data Protection Review Court, which allows Europeans to contest and bring objections to what they believe to be personal information that's been collected improperly by American intelligence activities. So that addresses one of the concerns brought up in Schrems II. It also limits access to EU data by US intelligence services to what is necessary and proportionate in service of certain national security objectives, yet another way to address concerns brought up in Schrems II. Finally, it requires the intelligence community to update policies and procedures to account for these changes.
So earlier this month, the European Commission finalized its approval of the Data Privacy Framework, granting an adequacy decision to the US. Hooray. But it's not quite the death of the SCCs and other safeguards. Personal data can now flow from the EU to US organizations that self-certify to the Data Privacy Framework without additional safeguards. Note that only organizations subject to the Federal Trade Commission's and the Department of Transportation's jurisdiction are permitted to self-certify.
Yeah, meaning that nonprofits cannot participate. But we bring this up because you might see it in contracts with your vendors who are able to self-certify to the Data Privacy Framework. And this might be important, especially as you are importing data from the EU and have to account for the practices of your vendors to your EU counterparts. So if you remember Privacy Shield, the requirements for certification under the Data Privacy Framework will look very similar. As Kelly has pointed out, they basically took the same website and changed the logo.
Organizations must provide information to individuals about data processing, provide free and accessible dispute resolution, ensure accountability for data transferred to third parties, and ensure commitments are kept in place as long as the organization holds the data. Naturally, our friend Schrems has vowed to bring Schrems III to the European Court of Justice, so we are keeping an eye out. Don't lose the SCCs yet. So what's really interesting about the new framework is you have vendors like Instructure who are providing services to your higher education institution, and you are providing distance learning, for example. Right? Maybe you have an adjunct professor in France, and then you're offering this class around the world.
So what does that mean? That means, well, your vendor, who's providing your LMS, who's providing that service, has to comply with the GDPR. They also have to comply with any other law that happens to apply to that data subject. So when Dana was talking about data subjects, we're not just talking students here. The GDPR, for those of you who aren't familiar, applies to everybody: teachers, administrators. Whereas here in the US, FERPA applies to students, the GDPR applies to everybody.
So if you're offering distance learning, if you're involved in a distance learning program and you're bringing on vendors or third party providers, you gotta make sure that they can meet all of these requirements, meet the GDPR requirements. The new data transfer framework is really great because it's actually helped us do a lot fewer contracts. For some of you who've had to do any of the new data processing agreements we've had to do over the last few years: the checklists, the audits, it gets really burdensome, and it's really hard for some of our smaller institutions to actually be able to manage it, as we mentioned earlier. So it gets overly complicated. Dana, we've got the EU focus on children's design codes; we talked about the California one, and now we're gonna talk about the EU version of the same.
Yes. And unsurprisingly, it is very similar. So earlier this month, the European Commission, you know, has been really hot for privacy apparently in the month of July, because also this month, they kicked off the special group on a code of conduct for age-appropriate design as part of their Better Internet for Kids strategy. This is following efforts out of the UK as well as Ireland with their children's codes. The group was convened in part in response to the Digital Services Act, which requires all online platforms accessible to minors to ensure, as we've heard before, a high level of privacy, safety, and security for minors on their services. You'll be unsurprised to find there is no definition.
The group will be responsible for drafting a comprehensive code of conduct on age-appropriate design that industry can then sign up to. So this one is voluntary. The code will build upon and support the implementation of the Digital Services Act by specifically emphasizing provisions dedicated to safeguarding minors. It kind of does what it says on the tin. So, turning to our friends in South America, I also wanted to touch a little bit on what's going on in Brazil.
So Brazil has its own comprehensive consumer privacy law. The full name is in Portuguese, so I will just use the acronym LGPD. It regulates the treatment of personal data of people in Brazil, including by granting individuals certain rights over data, requiring a legal basis for processing, and setting forth processing principles. This is relevant to you because the LGPD does not fully apply to the processing of personal data that is carried out for exclusively academic purposes, provided that the processing is supported by another legal basis. However, the law does not define an exclusively academic purpose.
So recently, the Brazilian data protection authority, the ANPD, published a guide intending to clarify what it means to process personal data for exclusively academic purposes. The ANPD has stated that the limited exception applies when the processing of personal data is strictly limited to freedom of expression in an academic environment. So if the processing is for other purposes, think administrative or commercial, so things like enrollment, attendance, evaluation, the LGPD fully applies. So that's what's going on with our friends abroad. And now I'm going to turn it over to Kelly to talk briefly about what's going on next.
Anything else I can do for you? Go ahead. This is very micro, so I apologize for that. Most of my courses are online, and I regularly get students who are overseas.
Some are in Germany, where we've got a military base; some are in the Middle East. In one case, I was pretty sure the students were on the other side of the world. Am I bound by that? It's random, but it happens. How do they find you, or find your courses? We're fairly well known in many places. Or they have a relative in San Antonio, and so... Yeah.
Yeah. So, you know, as Dana said, these laws have extraterritorial provisions. Even if you don't have a physical presence in these countries, they apply when you offer services there. And that's really a fact-specific test.
Yeah. Or if there's, you know, marketing. But if they find you inadvertently because you have a global reputation and you have a website that's accessible from anywhere in the world, then no, it's not gonna apply. There's your free legal advice for the day. Yes. Oh.
So the question was, how does it apply to military bases? I forgot that I have to repeat the questions so our recording can capture them. Yes. This is a really interesting question. I have researched this one.
I know this one. It depends on the particular base and the status of forces agreement that the United States has entered into with that country, which will set forth which laws apply: you know, when they can use the local law and, for example, prosecute soldiers if they get in fights and stuff. Right? Sometimes they can be prosecuted, sometimes they can't. It depends on the base and the agreement with the home country as to whether the law applies there.
You can actually look up the status of forces agreement; it's public information. But it's a complicated question, though. We'll be here after if you wanna ask some more questions about it. We've got some slides with some great resources that their firm has, which is wonderful. Yeah. Anything else before we come back to the US? Sounds like a question.
You should turn them over to the school? Yeah. Definitely, if you have a privacy officer and attorneys in house, please, please, please reach out to them on these complicated matters. As someone who did privacy as a non-lawyer for a long time, I would try to figure some of these very complicated things out. And sometimes I was right, but sometimes I was wrong. And so I always say bring in experts to help.
And especially if your school has the resources, it's great. And if not, there are a lot of external resources out there to help, especially in the education space. Okay. Oh, another question.
You mentioned one of these terms... Oh, yeah. That's definitely a hot-button topic: proctoring exams and the data collection associated with that. The question was, in the absence of certain definitions and certainty, you know, how do you make these decisions? It's gonna be a risk-based decision. Even though these terms are undefined, you'll see states build on federal law, and federal law builds on, you know, things that are borrowed from other places.
So sometimes you can sort of suss out a reasonable definition in the absence of an official one and use that to make a risk-based determination. And that's kind of the best that you can do in a lot of cases. But, you know, you wanna look at the practices of your peers and really try to take an informed position. Yeah. So what is going to be happening, a little peek around the corner? Let's talk a little bit about the Federal Trade Commission.
They are the United States' top federal regulator for privacy. They regulate privacy under what they call their Section 5 authority, which is their authority to bring actions against unfair or deceptive trade practices that affect commerce. And then they have jurisdiction to enforce specific statutes, most importantly COPPA, the Children's Online Privacy Protection Act. They announced in twenty twenty two that one of their top enforcement priorities was going to be edtech. They were looking very carefully at that sector.
And consistent with their kind of pattern of operations, they made this announcement in May of twenty twenty two. This year, we started to see publicly announced enforcement actions against companies. So they likely launched a wave of investigations around the time that they made that statement of priority. And, you know, it took about a year for those investigations to kind of percolate. So there have been two investigations that have come to enforcement against edtech vendors this year.
One was against a vendor called Chegg. Didn't expect that. It was a pretty standard Section 5 data security enforcement action. I didn't know them before the FTC action, but I guess some of you did. They ran a scholarship search site.
The FTC cited lax data security practices that had led to four separate data breaches. Some of the practices cited included storing personal data in plain text and using outdated and weak encryption around passwords. And the cumulative effect of these four breaches: the data of forty-million-plus users and employees was exposed, including date of birth, sexual orientation, and disabilities, which they were collecting for, you know, eligibility for certain kinds of scholarships, plus financial and medical information about their own employees. They're now under an FTC enforcement action, and all FTC enforcement actions subject you to a twenty-year consent order where you have to put certain safeguards in place going forward, and you have compliance obligations. So that came out in January of this year.
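As a technical aside for anyone building or vetting systems: the "weak encryption around passwords" finding is about exactly this kind of practice. Passwords shouldn't be stored in plain text or merely encrypted at all; they should be hashed with a salted, memory-hard function. Here is a minimal sketch using only Python's standard library; the cost parameters are illustrative, not a compliance recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with scrypt and a random per-user salt."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)  # illustrative cost parameters
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Because each user gets a fresh random salt, identical passwords produce different digests, and `hmac.compare_digest` avoids timing side channels during verification.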
And then the second one was against a company called Edmodo, for violations of COPPA. I know most of you don't deal with true child users, but, you know, it's interesting to show their focus on the edtech area. Edmodo had offered an online platform and mobile app with virtual class spaces to host discussions and share resources online, via free and subscription-based services. I won't go into the specifics of the order, but during the investigation, Edmodo had to suspend their operations in the United States. They were hit with a six-million-dollar penalty, which they were not able to pay.
It was suspended because they are out of business. But by entering into a consent decree, they are allowed to start business again in the United States, and they have a whole host of compliance obligations going forward. The primary offense, from the FTC's perspective: they were using the data for advertising, which is a really big no-no under COPPA if you're an edtech vendor. So that's really what got them, I think, in really bad trouble with the FTC. And then, just looking forward a little bit on FERPA reform.
So in May of this year, the Biden administration announced a series of actions that they were gonna undertake to protect child mental health, safety, and privacy online. And one of the actions that they mentioned was FERPA reform. I think they had hoped that they would release amendments to FERPA publicly by April of twenty twenty three; that didn't happen. So now the timeline looks like it's going to be November. We'll see if that holds.
These amendments are supposed to update and clarify FERPA by addressing policy issues such as clarifying the definition of "education record" and clarifying provisions regarding disclosures to comply with judicial orders or subpoenas. So we'll see what happens there. COPPA reform is also on the horizon, too. I think just yesterday, President Biden called on Congress to pass COPPA reform. So that's another place where Congress is gonna be active.
And then, I think we're technically at time, because we got started a little bit late with the room shift. But if you wanna hang out, we'll keep going. We've only got another slide after this, and then we've got some great resources to share with you also. So we'd be pretty remiss if we didn't at least touch on the use of artificial intelligence.
Specifically, generative AI, or GAI. So these are tools that can generate their own content, things like words, text, images, that kind of thing. I'll note that there are other in-depth sessions specifically talking about AI in education here at this conference. Feel free to check them out.
We just wanted to provide a little bit of what we felt was important to discuss. So, generative AI obviously presents some really interesting, very cool use cases and time-saving opportunities for educators and educational institutions. When used properly, it can assist with course design, document editing, data analysis, you know, digital marketing, among other tasks. And to the extent that you're using it, your school will likely procure its own tools and set its own rules. But Kelly and I have advised a number of organizations, each with a different use case for AI and specifically, yep.
Okay. GAI, great. We're getting kicked out, so we're gonna go really fast. So, guardrails to consider. First, information input restrictions, particularly if you're using third-party software.
We generally recommend that organizations prohibit people from inputting personal information, confidential information, or business or trade secrets into a GAI tool. As you're likely aware, the models for GAI tools are controlled and operated by third parties. Information input into GAI tools is used to further train the model and may inadvertently be disclosed in the future. So you don't want an instance where you've been inputting student information into a GAI tool to write, let's say, an evaluation, and then have it come out on the other end in someone else's results. We also recommend output evaluation requirements, making sure that every output is actually validated by a human being, and specifically try not to use it for the production of things that really do require accuracy, things like legal documents or regulatory materials. Finally, we recommend some use restrictions.
So generally, we recommend not using GAI for decision making, particularly where the decisions have legally significant impacts such as hiring decisions or admissions decisions, in part because of the possibility of bias in underlying models. So finally, turning it back to Daisy to talk to us about Instructure's privacy resources. Oh, yeah. No. No.
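A quick technical aside on the input-restriction guardrail above: one common engineering backstop is to screen prompts for obvious personal information before they ever reach a third-party GAI tool. The sketch below is purely illustrative; the `redact` helper and its regex patterns are our own invention, and real PII detection needs far more than a few regular expressions:

```python
import re

# Illustrative patterns only; real PII detection needs much more than regexes.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(prompt: str) -> str:
    """Replace likely PII with placeholders before sending text to a GAI tool."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

print(redact("Write an evaluation for jane.doe@example.edu, SSN 123-45-6789."))
```

A screen like this pairs naturally with the human-review guardrail: the redacted prompt still goes past a person before submission, and anything the patterns miss is caught there.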
No. We're gonna skip that. If you're interested in Instructure privacy resources, email me: privacy at instructure dot com. If you have questions about us, if you have questions about privacy, reach out. I talk to our customers.
I talk to our tech community all the time. I love talking about privacy, if you can't tell. But here are some industry resources. These are great with respect to looking up laws; some of you were asking about how to look up laws.
FPF, the Future of Privacy Forum, has a great US state law tracker. There's also the IAPP, the International Association of Privacy Professionals; they have great global privacy law resources. And then, hopefully y'all got to take a quick snap of this: this is Venable's.
They have amazing, amazing white papers and resources. We use them; I use them all the time. So take pictures, use the resources, and feel free to reach out if you have questions. Our bios and contact information are at the end of the deck, which y'all should have access to via the app. I don't even remember what it's called.
The Instructure event app, I think, is what it's called. Thank you all so much for bearing with the move. Alright. Thank you. And we hope you all have a fabulous rest of your time.
She's here. But now, we are super excited to have them here. I joined Instructure over three years ago, and I've been working with them ever since; our previous privacy lawyer and data protection officer worked with them too. So they have deep, deep, deep knowledge in education and privacy, and that's why we're so excited that they're here.
To talk to us, answer questions, and hopefully have a really great session. And I'm gonna try not to breathe super hard into this; I am not used to microphones, y'all, so please bear with me. These are the topics we're gonna touch on today. We'll talk a little bit about why privacy is unique to the higher ed ecosystem.
We're gonna hit some hot topics, do a little, what do they call it, peek around the corner. Kelly and Dana are gonna give us some great prognostication on what we might be looking at in the future. I'm also gonna talk very briefly about best practices for vetting ed tech vendors. And then we're gonna go over some resources, because we believe in giving people lots of tools and lots of opportunities to go research and find out more information.
As I mentioned before, we really want the session to be interactive. So if you have a question while we're talking about something, please raise your hand. We're gonna try to save time for questions at the end. We wanna give everyone the opportunity to jump in. We want this to be conversational.
There is a lot going on in the privacy world, y'all. I can't even begin to talk about how amazing, complicated, and crazy it's been for the past couple years. I joined the privacy universe, and I'm gonna date myself right now, in two thousand and three, when being a privacy professional really wasn't a thing, and really all we cared about here in the US was HIPAA, and health information, and the HITECH Act, and some of that stuff. We didn't even think about education privacy in the way that we do now. And as technology has grown, as we've developed these amazing tools that we use in our schools, our ecosystems have gotten so incredibly complex.
So complex that even as a privacy attorney, I have a hard time fishing through the data flows, understanding what data goes to what tool. When a student logs into their LMS and their administrator has also connected twenty-two other LTI tools, well, what data are those LTI tools getting from the LMS? Most of our Canvas administrators can't tell. Most of our IT people can't tell. The system is so incredibly complex right now because of the explosion of technology, which of course COVID brought on as well.
And as I said, there's very little transparency. In many of our K-12 schools (I don't see this as much in higher ed, but we do see it a lot in K-12) there are not resources for oversight. Right? You don't have a dedicated team to vet all your vendors. You've got teachers that are super excited, and they say, I have this amazing tool, it's free, load it on your Chromebooks.
You know, load it on your Chromebooks, kids; we're gonna go ahead and use this tutoring tool; we're gonna use this cool AI tool; we're gonna use all these neat, fun things, without really understanding what data they collect, what they do. Do they even comply with FERPA? Do they comply with the GDPR? Do they need to comply with the GDPR? Do you need to even think about those things? So unfortunately, because public education is so underfunded, it's really hard to get these types of tools and resources actually reviewed in a way that is meaningful. And of course, we have issues of data equity. I always bring this up in my privacy presentations, because students don't get a choice about what tools their schools use.
They have to use what their schools tell them. Oh, we're gonna talk all about it. That is the General Data Protection Regulation, for those of you who operate internationally, or whose schools operate internationally, or who deal with students internationally; that is the privacy law of Europe. No problem; like I said, feel free to ask questions.
So to go back to students, right? Students don't have a lot of choices. Instructors sometimes don't have a choice either as to what type of tools their schools are using and what data is being collected by those tools. So this is a really, really sensitive issue, especially with AI, as privacy laws rush to try to catch up with the technology. Which, as most of us know, they never can; technology moves so fast that the law just can't keep up, but it's doing its best. And then, of course, we've got issues of data sovereignty, and we're gonna talk a lot about data transfers and localization in the context of international laws. Kelly, go ahead.
Let's hear about hot topics. Hot topics. So we really did intend this to be kind of a "what we're looking at in our world" hot topics session. Each of these is probably worthy of its own forty-five-minute session, but as Daisy said, raise your hand: we're gonna keep it surface level, but if you want us to go a level deeper, just let us know. So we're actually gonna start in an area that is privacy adjacent, at the Department of Education, and around something called the Dear Colleague Letter, or the TPS (third-party servicer) letter that was released in February.
For those of you who have followed this development, and it is a winding saga: this was released by the Department of Education in the form of an open letter in February that addressed third-party servicers. It would have implemented a new compliance regime that had to be in place, as of the letter, by September one. It was a significant expansion over prior guidance, and it really did remain unclear as to the scope of third parties to which the guidance was supposed to apply.
And if an entity fell under the guidance, it would be regulated under the Higher Education Act. So the TPS letter set forth a highly fact-specific test. It would turn on the specific services that a third party provided to an institution of higher education. It attempted to clarify the types of activities that made an entity providing services to an institution a TPS, and thus regulated under the law. The guidance would have revoked prior third-party servicer guidance that the industry has been relying on forever.
And it would have required specific service terms in contracts. So really, folks were sort of ramping up for a very quick contract change and the rollout of a new compliance regime at the institution level. The letter was open for comment through March seventeenth; they received over a thousand comments. And in April, they released a blog post saying that they had changed their mind: that they had received so many comments that they realized the guidance had been overbroad and unclear, and that they were looking to issue further guidance at a future time.
In this blog post, they did identify several activities that they said did not trigger the TPS rules. Those included study abroad programs, recruitment of foreign students, clinical or externship opportunities, course-sharing consortia, dual or concurrent enrollment programs, and police departments helping to compile crime statistics. The fact that they had to carve out all of these activities just shows how expansive the guidance potentially was in the first place. They're looking to issue revised guidance, and we do not have a firm timeline on that now, but they said they would identify the specific services that fall under the category as they continue to review the thousand-plus comments they received. So, continuing on the theme of their interest in third-party servicers: in, I think, April, they also announced that they were gonna do a notice of proposed rulemaking. So this would be a formal rule.
It would be issued under the Administrative Procedure Act and would again codify a compliance regime for third-party servicers. Again, no timeline, no text; they are presumably in their internal process to work on the rule, but it's something that shows how much interest they have in this topic. It's definitely a place to keep your eye on the ball, because they're gonna release something at some point, perhaps with a quick timeline, as they had originally envisioned. Okay. The next one is also privacy adjacent, and then we're really gonna get into pure privacy.
Just yesterday (this is in the hot topic, breaking news category) the Department of Justice issued a press release saying that they sent a rule to the Federal Register for a notice of proposed rulemaking under the Americans with Disabilities Act that would aim to improve web and mobile application access for people with disabilities and would clarify how public entities can meet existing ADA obligations in terms of online activities. They sent it over yesterday. It'll be published in the Federal Register shortly, and then it'll be open to a sixty-day comment period. That text has also not been released publicly, but it's something to pay attention to if you are involved in website design or web properties.
Okay. So that was the privacy-adjacent part. Now to the part that is near and dear to my heart: what's happening in privacy these days. And we're gonna move to the states, because that's really where a lot of the activity is happening here. The privacy of minors is a very hot topic, top of mind for the states. It has not been a big year for state student privacy legislation; it's an area where we're really in equilibrium with the laws as they exist now.
Instead, we're seeing a lot of laws that tackle the online activity of minors, and general omnibus privacy laws. So let's talk about omnibus privacy laws. As of the end of this year, five states will have omnibus privacy laws: laws that cover all forms of data collection from any identified or identifiable person. California and Virginia as of January one (California's been in effect since twenty twenty, but was amended the first of the year), Colorado and Connecticut as of July one, and Utah will come into effect at the end of the year.
So we'll have five omnibus privacy laws at the end of this year. Fourteen have passed in total, so there's another nine or so that will come into effect over the next two years. These laws largely exempt nonprofit organizations, but not all of them do, most notably Colorado, which is in effect now. It's the state we're sitting in, actually.
Their law applies fully to nonprofits. So, for your institutions, you may have to comply with Colorado if you meet the thresholds under the law. And then Oregon has provisions that will apply to nonprofits as of twenty twenty five; the rest of the law comes into effect a year earlier. These laws are largely notice-and-choice laws. So they require provisions in privacy policies and an expansion of rights to individuals.
These include rights of access, correction, deletion, and the opt-out of certain data, quote unquote, "sales" and disclosures. They're the reason why, when you go to your favorite e-commerce site, you might see a link at the bottom that says "do not sell my personal information." That's from the California law. It's the kind of compliance regime that's now spreading across the nation in a patchwork fashion. Even though your institutions are largely nonprofit entities, your vendors, of course, are not.
They are for-profit entities who may have to comply with these laws if they meet the specified thresholds. I think where you'll see the direct impact of that is in contracts, because all of these laws require things to be placed in vendor contracts and agreements. It's the reason why you may be drowning in DPAs, why you keep getting a new wave of contracts over and over again. It's because every time there's a new wave of these laws, those contracts have to be looked at again, and we really don't anticipate an abatement in that cadence anytime soon.
Legislative sessions in the states are largely in recess right now. But they'll be back in the fall, and it wouldn't be surprising if another three to five laws passed this year, and another, you know, ten or so next year. So we really are reaching a place where this patchwork is expanding and creeping. The laws are similar enough now that uniform compliance is achievable, but it only takes one, really, to kinda upset that apple cart and create a strained patchwork across the country. Okay.
Just to highlight a couple of other developments in the states when it comes to minors: there's been a smaller wave of social media laws. State legislators have introduced more than a hundred bills in the past year, and we have social media laws in Utah, Texas, Arkansas, and Montana. Just to give you an example of what these laws talk about: in Utah, starting in March of twenty twenty four, social media companies must verify the age of Utah adults seeking to maintain or open social media accounts, and get the consent of parents or guardians for users under eighteen. So I know we've long considered the age of majority on the internet to be thirteen-plus because of federal law.
It's now slowly turning to eighteen online. The law would allow parents or guardians full access to children's accounts, create default curfew settings that block overnight access to minor accounts, protect minor accounts from unapproved direct messaging, and block minor accounts from search results. In addition, companies cannot collect minors' data, target them for advertising, or use certain addictive designs or features. You might say, well, we don't operate a social media network; why is this interesting? The definition of a social media platform in the Utah law is any online forum that's made available to create a profile, upload posts, view the posts of other account holders, and interact with other account holders or users.
So it's gonna sweep far more broadly across a lot of sites that have interactive features. We're gonna see, I think, a slow sort of creep; look for a lot of age gates on online properties. And this is an area where the law continues to slowly move forward. Yes, in the state laws? Yeah, so it's a good question. The thresholds in the state laws, again, vary by state.
Typically, you have to be processing the personal data of a fixed number of individuals. It's a hundred thousand in most states; in some states it's less, like in Montana, it's fifty thousand, because there just aren't that many people there.
Or you derive a certain percentage of your revenue from data sales. California is unique. Anytime people talk about the states, people are gonna say California is unique, because it also captures any business that has twenty-five million dollars in annual revenue, full stop. So you can meet that revenue threshold and process the data of, like, three California residents, and in theory, the law would apply.
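To make the threshold logic concrete, here's a toy sketch of how a vendor might triage applicability. The numbers are the simplified ones from this discussion (a hundred thousand residents in most states, a lower Montana threshold, California's twenty-five-million-dollar revenue test); real statutes have more criteria and exemptions, so treat this as an illustration, not legal analysis:

```python
# Toy applicability triage based on the simplified thresholds discussed above.
# Real statutes add revenue-from-sales percentages, entity exemptions, etc.

def omnibus_law_may_apply(state: str, residents_processed: int,
                          annual_revenue_usd: float) -> bool:
    """Rough first-pass check on whether a state omnibus law might apply."""
    thresholds = {
        "california": 100_000,
        "colorado": 100_000,
        "montana": 50_000,  # lower threshold for a smaller-population state
    }
    if residents_processed >= thresholds.get(state, 100_000):
        return True
    # California also captures any business over $25M in annual revenue,
    # even if it processes the data of only a handful of residents.
    if state == "california" and annual_revenue_usd >= 25_000_000:
        return True
    return False

assert omnibus_law_may_apply("california", 3, 30_000_000)    # revenue test alone
assert not omnibus_law_may_apply("colorado", 3, 30_000_000)  # no revenue-only test
assert omnibus_law_may_apply("montana", 60_000, 0)           # lower headcount bar
```

The point of the sketch is the shape of the analysis: a headcount test that varies by state, plus California's revenue test sitting on top of it.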
Sure. Nothing else? Okay. The other thing to highlight is the California Age-Appropriate Design Code. So this is another development along the same lines.
It follows similar codes in the UK and in Ireland. What it requires are certain safeguards, and then a certain assessment of features, for any properties that are directed to children; and again, this would be anyone under age eighteen, so not just under age thirteen. So it would look at different factors to determine if you're a property that falls under the code. Like, do you have advertisements that are marketed to children? Do you know that a significant number of children visit your site, based on empirical audience composition? Again, something to think about, potentially, if you have sites that are targeted for recruiting high school students.
Right? That's the kind of site that potentially has, by empirical numbers, a significant number of minors visiting it. At a high level, what it would require is for you all to bake privacy by design into the development of that web property. And it's supposed to be categorized by age: so, you know, zero to five is one level of privacy by design, thirteen to fifteen is another.
Sixteen to seventeen is what they call "approaching adulthood," so yet another. So it's really supposed to be a granular examination of the features of a web property that is targeted to children. And you have to design products with a "high level of privacy"; that's a quote. If you ask me what that means, it is undefined in California. It will require certain assessments around the purpose of the online product, service, or feature; written assessments about how they use children's personal information; and (it's a long laundry list) how it could harm children, how it could lead children to experience harmful content, etcetera, etcetera.
And it's a lot of formal documentation that would be required around those features. It's been subject to litigation. So the state of California was sued in connection with the promulgation of the Age-Appropriate Design Code. I checked today, and the hearing on that is actually tomorrow. So it remains to be seen whether it does go into effect, but its effective date is about a year away, July first, twenty twenty four.
So another place to really look for ongoing developments. Okay. States. You wanna do FTC? Is there another slide? Yeah, we're gonna move forward. I do wanna say something about the Age-Appropriate Design Code and some of these types of codes that are coming out, as a third party service provider to schools.
You design products and tools not really realizing how they can be used. So here's a great example. We have a product called Portfolium. The whole purpose of it is so students can build a lifelong portfolio that they can start in childhood and carry all the way through college, even into adulthood and into their workplaces. And it gets spammed constantly by people building bots to sell porn.
So we have an entire team that does nothing but go through those accounts. In the last audit we did, I think I pulled up forty eight thousand of them, and that was just in a three month period. They were literally like, go buy this film, click here. And it's heartbreaking, but when you're developing this tool for a student, a tool you think is amazing, you don't realize how many bad actors are out there. And that's what these codes are meant to try to capture, right? Because if you have somebody promoting porn on a free portfolio site, and then you have five year olds using it that can get to those pages easily or friend those accounts. And this is not social media. These are education products, right? It can do a lot of harm to kids, and that's what these are all about. So I just had to give you guys a little context there. It's very important for the law, but it is in practice actually a real thing that we face as a third party service provider, because we wanna protect our students and our schools that are using these products, so they can actually use them in a way that's meaningful and not wind up, you know, getting a lot of bad content or even just spam content, right? It doesn't have to necessarily be intentionally harmful. It's free, you know, free downloaded movies that somebody recorded on their iPhone.
It's just all sorts of junk out there. So, Kelly, we're talking about international hot topics now. Dana's gonna talk to us about international hot topics. So this is where we're gonna talk about the GDPR and all the international laws. So this is what's unique. I'm gonna jump in real quick to intro why we're talking about this.
So whereas our K-twelve students and schools are just operating here in the US, our higher ed institutions, because of COVID and the, you know, the explosion of online education, are offering classes worldwide. And so our higher ed institutions, and by virtue of providing services to them, their vendors, have to comply with not just FERPA, not just COPPA, not just all the US laws, but now we have to look at all the laws around the world. And of course, the big privacy, I don't know what you call it, hammer in the world is the GDPR. And so Dana's gonna tell us a little bit about it. Yes. So like Daisy explained, the General Data Protection Regulation, or, as we're gonna call it today, the GDPR, is probably the first time we kind of start thinking about consumer privacy in the modern age.
So this comes out in twenty eighteen, and why should you, potentially as a US institution, care about the GDPR? Well, it has an extraterritoriality provision, meaning that even if you're not located in the EU, if you're targeting services to individuals in the EU, you might potentially fall under provisions of the GDPR. So compliance is something that should potentially be top of mind, especially if you have maybe exchange programs or students that are coming from other countries. So I just wanted to level set a little bit on some of the terms that we're gonna be using, particularly where we're talking about EU privacy, which are controllers and processors. This is obviously not new for some of you who have been following privacy the last five or ten years, but when we're talking about controllers and processors, it's important to consider what your relationship is to vendors or service providers, even other institutions that you might be working with, because whether you're a controller or a processor will define what your rights and responsibilities are to the data subjects for whom you're collecting personal information.
So what's a controller? This is the party who decides key elements of data processing. So they decide on the manner of processing. Are we collecting information? Are we doing data analysis? Is this storage? That kind of thing. And your processor is the party that actually processes personal data on behalf of the controller. So you can think of this as your vendors or service providers. Right? They're only storing information because you ask them to do that.
They're only doing data analysis because you ask them to do that. So it is possible that you might be a controller in some instances and a processor in others, and it is possible for a data transfer to involve two controllers or two processors. It's kind of a fact specific inquiry. An instance where you might be a controller sharing information with another controller is, for example, if you're, like, renting out a prospect list or sharing research subjects with another institution. So again, why does it matter what role you have in the processing relationship? It defines your rights and responsibilities.
So if you're the controller, you get to decide what happens with the data, right, within the limits of the law. Your vendors who are acting as processors do not have the right to use the data however they like, but they also have fewer corresponding obligations to the data subject: they're honoring rights requests because you, the controller, have to honor rights requests. Another reason that it's important to understand what your role is is liability shifting and contracting. So who is responsible for what actions at any given time with respect to personal data? Finally, it helps to outline the flow of data. So who is sending data to whom? If you understand you're a controller, you know, okay, I'm the one who's sharing information with these other parties.
So often we see these processing roles really come up, and where there's been some really interesting developments is in the realm of data transfers, particularly cross border data transfers, and we work a lot with Daisy on these. One of the GDPR's fundamental principles is data localization. So data physically has to stay in the EU, or if it leaves the EU, it has to be subject to certain safeguards. Transfers are permitted without additional safeguards, which we'll talk about in a moment, if a country outside the EEA receives an adequacy decision from the European Commission. This is basically a seal of approval that says, yes, this country has adequate protections for personal data.
So for transfers to non-adequate jurisdictions, which until very recently included the United States, companies must implement appropriate safeguards before transferring data. These are things like the standard contractual clauses. When you've executed data processing addendums or other agreements with your vendors, you might have seen the standard contractual clauses, or SCCs, referenced. These are like template DPA terms. We also might see binding corporate rules. You could get consent from data subjects, but that sounds onerous.
And prior to Schrems II, we saw the EU-US Privacy Shield. So what's on the horizon? The EU-US Data Privacy Framework. So what is that? Since the nineteen ninety five Data Protection Directive, there have been various agreements between the EU and US that allowed transfers to the US without these additional safeguards that I discussed, as long as the companies that were receiving the data in the US adhered to certain principles. So you guys might remember the EU-US Safe Harbor, and then the EU-US Privacy Shield, which were struck down in two decisions.
Schrems I and Schrems II. So we aren't giving up yet. In striking down the EU-US Privacy Shield in Schrems II, the EU Court of Justice cited concerns about the US government's ability to access EU personal information when it was located in the US, and about data subjects' right to an effective remedy. Two things that are fairly important parts of the GDPR. So in March twenty twenty two, the EU and US reached an agreement in principle to replace the Privacy Shield.
So our government and the EU government kind of worked together to figure out what could be a reasonable alternative to the Privacy Shield. And then in October twenty twenty two, President Biden released an executive order implementing the terms of that agreement. What does this executive order say? It establishes a Data Protection Review Court, which gives Europeans the ability to contest and bring objections to what they believe to be personal information that's been collected improperly by American intelligence activities. So that addresses one of the concerns that was brought up in Schrems II. It also limits access to EU data by US intelligence services to what is necessary and proportionate in service of certain national security objectives, yet another way to address concerns brought up in Schrems II. Finally, it requires the intelligence community to update policies and procedures to account for these changes.
So earlier this month, the European Commission finalized its approval of the Data Privacy Framework, granting an adequacy decision to the US. Hooray. But it's not quite the death of the SCCs and other safeguards. Personal data can now flow from the EU to US organizations that self-certify to the Data Privacy Framework without additional safeguards. Note that only organizations subject to the Federal Trade Commission's and the Department of Transportation's jurisdiction are permitted to self-certify.
Yeah. Meaning that nonprofits cannot participate. But we bring this up because you might see it in contracts with your vendors who are able to self-certify to the Data Privacy Framework. And this might be important, especially as you are importing data from the EU and have to account for the practices of your vendors to your EU counterparts. So if you remember Privacy Shield, the requirements for certification under the Data Privacy Framework will look very similar. As Kelly has pointed out, they basically took the same website and changed the logo.
Organizations must provide information to individuals about data processing, provide free and accessible dispute resolution, ensure accountability for data transferred to third parties, and ensure commitments are kept in place as long as the organization holds the data. Naturally, our friend Schrems has vowed to bring Schrems III to the European Court of Justice, so we are keeping an eye out. Don't lose the SCCs yet. So what's really interesting about the new framework is you have vendors like Instructure who are providing services to your higher education institution, and you are providing distance learning, for example. Right? You've got maybe an adjunct professor in France, and then you're offering this class around the world.
So what does that mean? That means, well, your vendor, your LMS, who's providing that service, has to comply with the GDPR. They also have to comply with any other law that happens to apply to that data subject. So when Dana was talking about data subjects, we're not just talking students here. The GDPR, for those of you who aren't familiar, applies to everybody: teachers, administrators. Whereas here in the US, FERPA applies to students, the GDPR applies to everybody.
So if you're offering distance learning, if you're involved in a distance learning program and you're bringing on vendors or third party providers, you gotta make sure that they can meet all of these requirements, meet the GDPR requirements. The new data transfer framework's really great because it's actually helped us do a lot fewer contracts. For some of you who've had to do any of the new data processing agreements we've had to do over the last few years, the checklists, the audits, it gets really burdensome, and it's really hard for some of our smaller institutions to actually be able to manage it, as we mentioned earlier. So it gets overly complicated. Dana, we've got the EU focus on the children's design code. We talked about the California one; now we're gonna talk about the EU version of the same.
Yes. And unsurprisingly, it is very similar. So the European Commission, you know, has apparently been really hot for privacy in the month of July, because also this month, they kicked off the special group on a code of conduct for age-appropriate design as part of their Better Internet for Kids strategy. This is following efforts out of the UK, as well as Ireland with their Children's Code. The group was convened in part in response to the Digital Services Act, which requires all online platforms accessible to minors to ensure, as we've heard before, a high level of privacy, safety, and security for minors on their services. You'll be unsurprised to find there is no definition.
The group will be responsible for drafting a comprehensive code of conduct on age-appropriate design that industry can then sign up to. So this one is voluntary. The code will build upon and support the implementation of the Digital Services Act by specifically emphasizing provisions dedicated to safeguarding minors. It kind of is what it says on the tin. So turning to our friends in South America, I also wanted to touch a little bit on what's going on in Brazil.
So Brazil has its own comprehensive consumer privacy law. It is in Portuguese, so I will just use the acronym LGPD. It regulates the treatment of personal data of people in Brazil, including by granting individuals certain rights over data, requiring a legal basis for processing, and setting forth processing principles. So this is relevant to you because the LGPD does not fully apply to the processing of personal data that is carried out for exclusively academic purposes, provided that the processing is supported by another legal basis. However, the law does not define an exclusively academic purpose.
So recently, the Brazilian data protection authority, the ANPD, published a guide intending to clarify what it means to process personal data for exclusively academic purposes. The ANPD has stated that the limited exception applies when the processing of personal data is strictly limited to freedom of expression in an academic environment. So if the processing is for other purposes, think administrative or commercial, so things like enrollment, attendance, evaluation, the LGPD fully applies. So that's what's going on with our friends abroad. And so I'm going to turn it over to Kelly to talk briefly about what's going on next.
Anything else I can do for you? Go ahead. This is very micro, so I apologize for that. I regularly get students, most of my courses are well enrolled, and I regularly get students who are overseas. Some are in Germany, where we've got a military following, some in the Middle East. In one case, I was sure the students were on the other side of the world. Am I bound by that? It's random, I'm not targeting them, but it happens. How do they find you? Or find your courses? We're fairly well known in many places. Or if they have a relative in San Antonio, and so on. Yeah.
Yeah. So, you know, as Dana said, these laws have extraterritorial provisions. Even if you don't have a physical presence in these countries, they apply when you offer services there. And that's really a fact specific test.
Yeah. Or if there's, you know, marketing. But if they find you inadvertently because you have a global reputation and a website that's accessible from anywhere in the world, then no, it's not gonna apply. But there's your free legal advice for the day. Yes. Oh.
So the question was, how does it apply to military bases? I forgot that I have to repeat the questions so our recording can capture them. Yes. This is a really interesting question. I have researched this one.
I know this one. It depends on the particular base and the status of forces agreement that the United States has entered into with that country, which will set forth which laws apply, you know, when they can use the local law and, for example, prosecute soldiers if they get in fights and stuff. Right? Sometimes they can be prosecuted, sometimes they can't. It depends on the base and the agreement with the home country as to whether the law applies there.
You can actually look up the status of forces agreement, it's public information, but it's a complicated question. We'll be here after if you wanna ask some more questions about it. And we've got some slides with some great resources that their firm has, which is wonderful. Yeah. Anything else before we come back to the US? Sounds like a question.
You should turn it over to the school. Yeah. Definitely, if you have a privacy officer and attorneys in house, please, please, please reach out to them on these complicated matters. As someone who did privacy as a non-lawyer for a long time, I would try to figure some of these very complicated things out, and sometimes I was right, but sometimes I was wrong. And so I always say bring in experts to help.
And especially if your school has the resources, it's great. And if not, there's a lot of external resources that are out there to help, especially in the education space. Okay. Oh, another question.
You mentioned one of these terms. Oh, yeah. That's definitely a hot button topic, proctoring exams and data collection associated with that. The question was, in the absence of certain definitions and certainty, you know, how do you make these decisions? It's gonna be a risk based decision. Even though these terms are undefined, you'll see states build on federal law, and federal law builds on, you know, things that are borrowed from other places.
So sometimes you can sort of suss out a reasonable definition in the absence of an official one and use that to make a risk based determination. And that's kind of the best that you can do in a lot of cases. But, you know, you wanna look at the practices of your peers and really try to take an informed position. Yeah. So, what is going to be happening? A little peek around the corner. Let's talk a little bit about the Federal Trade Commission.
They are the United States' top federal regulator for privacy. They regulate privacy under what they call their Section Five authority, which is their authority to bring actions against unfair or deceptive trade practices that affect commerce. And then they have jurisdiction to enforce specific statutes, most importantly COPPA, the Children's Online Privacy Protection Act. They announced in twenty twenty two that one of their top enforcement priorities was going to be EdTech. They were looking very carefully at that sector.
And consistent with their kind of pattern of operations, they made this announcement in May of twenty twenty two. This year, we started to see publicly announced enforcement actions against companies. So they likely launched a wave of investigations around the time that they stated that priority, and, you know, it took about a year for those investigations to kind of percolate. So there have been two investigations that have come to enforcement against EdTech vendors this year.
One was against a vendor called Chegg. Didn't expect that. It was a pretty standard Section Five data security enforcement action. So, Chegg, I didn't know them before the FTC action, but I guess some of you did. It was a scholarship search site.
They cited lax data security practices that had led to four separate data breaches. Some of the practices cited included storing personal data in plain text and using outdated and weak encryption around passwords. And the cumulative effect of these four breaches was that the data of forty million plus users and employees was exposed, including date of birth, sexual orientation, and disabilities, which they were collecting for, you know, eligibility for certain kinds of scholarships, and financial and medical information about their own employees. So they're now subject to an FTC enforcement action. All FTC enforcement actions subject you to a twenty year consent order where you have to put certain safeguards in place going forward, and you have compliance obligations. So that came out in January of this year.
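[Editor's note: to make the password finding concrete, here is a minimal sketch, not taken from the FTC order, of the kind of practice regulators expect instead of plaintext storage or weak encryption: keeping only a salted, deliberately slow hash of each password, using nothing beyond the Python standard library. Function names and the iteration count are illustrative choices, not a compliance standard.]

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> str:
    """Return a salted PBKDF2-SHA256 hash, with its parameters encoded alongside."""
    salt = os.urandom(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    _scheme, iters, salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), bytes.fromhex(salt_hex), int(iters)
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```

Because only the salted hash is stored, a database breach exposes no reusable credentials, which is exactly the gap the FTC called out.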
And then the second one was against a company called Edmodo, for violations of COPPA. I know most of you don't deal with true child users, but, you know, it's interesting to show their focus on the EdTech area. Edmodo had offered an online platform and mobile app with virtual class spaces to host discussions and share online resources via free and subscription based services. I won't go into the specifics of the order. But during the investigation, Edmodo had to suspend their operations in the United States. They were hit with a six million dollar penalty, which they were not able to pay.
It was suspended because they are out of business. But due to entering into a consent decree, they are allowed to start business again in the United States. They have a whole host of compliance obligations going forward. The primary offense, from the FTC's perspective, was that they were using the data for advertising, which is a really big no no under COPPA if you're an EdTech vendor. So that's really what got them, I think, in really bad trouble with the FTC. And then just looking forward a little bit on FERPA reform.
So in May of this year, the Biden administration announced a series of actions that they were gonna undertake to protect child mental health, safety, and privacy online. And one of the actions that they mentioned was FERPA reform. I think they had hoped that they would release amendments to FERPA publicly by April of twenty twenty three. That didn't happen. So now the timeline looks like it's going to be November. We'll see if that holds.
These amendments are supposed to update and clarify FERPA by addressing policy issues such as clarifying the definition of education record and clarifying provisions regarding disclosures to comply with judicial orders or subpoenas. So we'll see what happens there. COPPA reform is also on the horizon. I think just yesterday, President Biden called on Congress to pass COPPA reform. So, another place where Congress is gonna be active.
And then, I think we're, you know, we're technically at time. But that's because we got started a little bit late. So if you wanna hang out, we'll keep going. We've only got another slide after this, and then we've got some great resources to share with you also. So we'd be pretty remiss if we didn't at least touch on the use of artificial intelligence.
Specifically, generative AI, or GAI. So these are things that can generate their own content, things like words, different text, images, that kind of thing. I'll note that there are other in-depth sessions specifically talking about AI and education here at this conference. Feel free to check them out.
We just wanted to provide a little bit of what we felt was important to discuss. So, generative AI obviously presents some really interesting, very cool use cases and time saving opportunities for educators and educational institutions. When it's used properly, it can assist with course design, document editing, data analysis, you know, digital marketing, among other tasks. And to the extent that you're using it, your school will likely procure its own tools and set its own rules. But Kelly and I have advised a number of organizations, each with a different use case for AI and specifically, yep.
Okay, great. We're getting kicked out, so we're gonna go really fast. Guardrails to consider. First, information input restrictions, particularly if you're using third party software.
We generally recommend that organizations prohibit people from inputting personal information, confidential information, or business or trade secrets into a GAI tool. As you're likely aware, the models for GAI tools are controlled and operated by third parties. Information input into GAI tools is used to further train the model and may inadvertently be disclosed in the future. So you don't want an instance where you've been inputting student information into a GAI tool to write, let's say, an evaluation, and then have it come out on the other end in someone else's results. We also recommend output evaluation requirements, so making sure that every output is actually validated by a human being, and specifically, try not to use it for the production of things that really do require accuracy. So things like legal documents or regulatory materials. Finally, we recommend some use restrictions.
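[Editor's note: one way an institution might operationalize the input-restriction guardrail described above is a redaction pass before any text leaves for a GAI tool. This is our illustrative sketch, not something the speakers described; the pattern names are hypothetical, and a real deployment would use a vetted PII-detection library and institutional review rather than a handful of regexes.]

```python
import re

# Hypothetical patterns for illustration only; real PII detection needs far
# more coverage (names, addresses, student IDs, etc.).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact_prompt(text: str) -> str:
    """Replace obvious personal identifiers before text is sent to a GAI tool."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text
```

For example, `redact_prompt("Email jane.doe@example.edu about 555-123-4567")` returns the same sentence with the email address and phone number replaced by placeholder tokens, so the identifying details never reach the third party model.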
So generally, we recommend not using GAI for decision making, particularly where the decisions have legally significant impacts, such as hiring decisions or admissions decisions, in part because of the possibility of bias in underlying models. So finally, turning it back to Daisy to talk to us about Instructure's privacy resources. Oh, yeah. No. No.
No. We're gonna skip that. If you're interested in Instructure privacy resources, email me at privacy at instructure dot com. If you have questions about us, if you have questions about privacy, reach out. I talk to our customers.
I talk to our tech community all the time. Love talking about privacy, if you can't tell. But here are some industry resources. These are great with respect to looking up laws. Some of you guys were asking about how to look up laws.
FPF, the Future of Privacy Forum, has a great US state law resource. There's also the IAPP, which is the International Association of Privacy Professionals. They have a great global privacy law resource. And then, hopefully y'all got to take a quick snap of this. This is Venable's.
They have amazing, amazing white papers and resources. We use them. I use them all the time. So take pictures, use the resources, and feel free to reach out if you have questions. Our bios and contact information are at the end of the deck, which y'all should have access to via the app. I don't even remember what it's called.
The Instructure event app, I think, is what it's called. Thank you all so much for bearing with the move. Alright. Thank you. And we hope you all have a fabulous rest of your time.