When’s the last time you read a privacy policy end-to-end? I hear ya, but when it comes to your kid’s school, it may be worth doing. It’s August, which means the kids go back to school in about a month. Just in time, Common Sense Media has released their annual State of EdTech Privacy Report, which has both encouraging and sobering news about student privacy in the classroom.

Our findings in 2019 still indicate a widespread lack of transparency and inconsistent privacy and security practices for products intended for children and students. – State of EdTech Privacy Report, Common Sense Media

Source: 2019 State of EdTech Privacy Report, Common Sense Media

Strong privacy protection for students using education technology seems like common sense. But, like most things in technology, it’s actually nuanced and subject to unintended consequences. Why? Privacy, like teaching, should be about building a relationship of trust with students. And that’s more complicated than simply complying with regulations.

The 2019 EdTech Privacy Report reviewed the privacy policies of 150 of the most popular EdTech apps and services. The report was underwritten with support from the foundations of tech titans: namely, the Michael and Susan Dell Foundation, the Chan Zuckerberg Initiative, and the Bill & Melinda Gates Foundation. Common Sense found a 7-point year-over-year improvement in median scores, from 45% to 52%, across categories ranging from data collection and sharing to data rights and parental consent. That improvement’s good news.

We are, however, still hovering at around the 50% mark for EdTech privacy policy scores. This tech benefits and surveils one of our more vulnerable populations – children and teens, who often lack the capacity to consent or critically evaluate online information. And that’s key, regardless of whether you generally stand with strong privacy advocates or with those who believe that some balance between personal privacy and data collection is necessary to advance human progress. I’ve been in tech for over 25 years and tend to fall in the latter camp. I have seen major corporations hesitate to implement artificial intelligence (“AI”) programs that would have a material impact on revenues, due to concerns about privacy backlash. I’ve watched China dash ahead of North America, by a factor of 5X, in implementing AI in corporations. To me, it’s all about the balance between privacy and the benefits of data sharing. Respected tech commentator Amber Mac said it well in her Jan 2019 blog:

On the topic of privacy, we do have to consider the benefits of sharing data. This is something many experts don’t honestly address. If we’re looking at examples of how new technologies such as artificial intelligence can improve healthcare, they can only be effective with large and healthy data sets. – “Privacy Backlash,” Amber MacArthur, AmberMac blog, Jan 2019

I agree with Amber. However, I do think it is incumbent on those who capture and use individual data, especially that of minors, to: (1) gain explicit consent; (2) provide a clear, simple privacy policy; (3) align technology usage with the interests of the end user (I wrote about alignment in the context of AI here); (4) restrict data use to its intended purpose; and (5) monitor and report the results of technology use. Lawyers Nadya Tymochenko and Gillian Tuck Kutarna of Miller Thomson discuss this in their excellent book, An Educator’s Guide to Internet and Social Media Law in the Classroom. In chapter 12, they provide a clear framework for school boards built principally around consent, the right to an alternative method of instruction, teacher training, and auditing/reporting.

Tymochenko and Kutarna have this to say about consent:

It is important to clarify what is meant by consent. The law requires that for consent to be valid it must be informed consent. Details of what information will be disclosed, to whom, and how the information can be used, should be articulated as part of the consent process. School boards should request parent signatures as well as the signatures of students, since it is the student’s information that will be disclosed. – An Educator’s Guide to Internet and Social Media Law in the Classroom, Nadya Tymochenko and Gillian Tuck Kutarna, 2018

The degree of consent required is context-dependent, which makes it nuanced. (Professor Ari Ezra Waldman discusses the role of context in information privacy in Georgian Capital’s podcast, Information Privacy for an Information Age.) The Office of the Privacy Commissioner of Canada provides guidelines on the contexts in which explicit consent (as opposed to umbrella consent under, for example, a broad-brush Responsible/Appropriate Use of Technology policy) is required:

Obtain explicit consent for collections, uses or disclosures which generally: (i) involve sensitive information; (ii) are outside the reasonable expectations of the individual; and/or (iii) create a meaningful residual risk of significant harm. – Guidelines for obtaining meaningful consent, Office of the Privacy Commissioner of Canada, May 2018

Genuine consent’s important, as is having a process in place if a parent or student withdraws consent. And to me, the most important element of the framework in the Educator’s Guide is the right to an alternative method of instruction (i.e., read a book, review paper materials, etc.). Without that, there can be no true consent: if you must use software X to complete your schoolwork, you have to consent in order to pass the class, even if you object to that software’s surveillance. That’s not genuine consent. I think the percentage of students who will actually opt out of classroom tech and seek alternative methods of instruction is in the low single digits, but providing the choice to do so is vital. Anything else is inauthentic and erodes trust.

Let’s move to the final two elements, teacher training and auditing/reporting. Tymochenko and Kutarna suggest that school boards consider running pilots on new software before rollout, and training teachers on new tools and their privacy policies. Pilot programs and training would reduce the risk of unintended consequences and of teachers using the software for purposes other than what the school policy intended. In terms of auditing and reporting, Tymochenko and Kutarna suggest surveying teachers and students on an ongoing basis about the impact of technology in the classroom, and appointing a “system leader” responsible for identifying opportunities to improve:

Identifying a system leader is important to monitor for compliance, address deficits and promote a culture of continuous learning as it applies to the use of information technology. Policies and procedures are not static, particularly in a dynamic domain where innovation and change are constant. – An Educator’s Guide to Internet and Social Media Law in the Classroom, Nadya Tymochenko and Gillian Tuck Kutarna, 2018

Innovation and rapid change are indeed constants in high tech, which can create the uncertainty and unintended consequences I referred to earlier. The framework in the Educator’s Guide is, however, a very good place to start for school boards looking to protect student privacy, and themselves, as they increasingly roll out new technology and, soon enough, AI.

What’s next in the K-12 EdTech conversation? Something that has nothing to do with compliance or legal regulations. Technology in the classroom changes the relationship between teacher and student. Technology inserts a screen where a quiet conversation and guidance used to be. This is far more important than any policy or regulation ever will be. Leading schools will spend as much time researching this relational impact as they do developing policies, in order to truly leverage technology to create a better learning environment and outcomes for students.
