Our findings in 2019 still indicate a widespread lack of transparency and inconsistent privacy and security practices for products intended for children and students. – State of EdTech Privacy Report, Common Sense Media
Strong privacy protection for students using education technology seems like common sense. But, like most things in technology, it’s actually nuanced and subject to unintended consequences. Why? Privacy, like teaching, should be about building a relationship of trust with students. And that’s more complicated than simply complying with regulations.
The 2019 EdTech Privacy Report reviewed the privacy policies of 150 of the most popular EdTech apps and services. The report was underwritten with support from the foundations of tech titans: namely, the Michael and Susan Dell Foundation, the Chan Zuckerberg Initiative, and the Bill & Melinda Gates Foundation. Common Sense found a 7-point year-over-year improvement in median scores, from 45% to 52%, across categories ranging from data collection and sharing to data rights and parental consent. That improvement’s good news.
On the topic of privacy, we do have to consider the benefits of sharing data. This is something many experts don’t honestly address. If we’re looking at examples of how new technologies such as artificial intelligence can improve healthcare, they can only be effective with large and healthy data sets. – “Privacy Backlash,” Amber MacArthur, AmberMac blog, Jan 2019
Tymochenko and Kutarna have this to say about consent:
It is important to clarify what is meant by consent. The law requires that for consent to be valid it must be informed consent. Details of what information will be disclosed, to whom, and how the information can be used, should be articulated as part of the consent process. School boards should request parent signatures as well as the signatures of students, since it is the student’s information that will be disclosed. – An Educator’s Guide to Internet and Social Media Law in the Classroom, Nadya Tymochenko and Gillian Tuck Kutarna, 2018
The degree of consent required is context-dependent, which makes it nuanced. (Professor Ari Ezra Waldman discusses the role of context in information privacy in Georgian Capital’s podcast, Information Privacy for an Information Age.) The Office of the Privacy Commissioner of Canada provides guidelines on when explicit consent (as opposed to umbrella consent under, for example, a broad-brush Responsible/Appropriate Use of Technology policy) is required:
Obtain explicit consent for collections, uses or disclosures which generally: (i) involve sensitive information; (ii) are outside the reasonable expectations of the individual; and/or (iii) create a meaningful residual risk of significant harm. – Guidelines for obtaining meaningful consent, Office of the Privacy Commissioner of Canada, May 2018
Genuine consent is important, as is having a process in place if a parent or student withdraws consent. And to me, the most important element of the framework in the Educator’s Guide is the right to an alternative method of instruction (e.g., reading a book or reviewing paper materials). Without that, there can be no true consent: if you must use X software at school to complete school work, then even if you object to X software’s surveillance, you have to consent in order to pass the class. That’s not genuine consent. I suspect the percentage of students who will actually opt out of using tech in the classroom and seek alternative methods of instruction is in the low single digits, but providing the choice to do so is vital. Anything else is inauthentic and erodes trust.
Let’s move to the final two elements, teacher training and auditing/reporting. Tymochenko and Kutarna suggest that school boards consider running pilots of new software before rollout, and training teachers on new tools and their privacy policies. Pilot programs and training would reduce the risk of unintended consequences and of teachers using the software for purposes other than what school policy intended. In terms of auditing and reporting, Tymochenko and Kutarna suggest surveying teachers and students on an ongoing basis about the impact of technology in the classroom, and appointing a “system leader” to identify opportunities for improvement:
Identifying a system leader is important to monitor for compliance, address deficits and promote a culture of continuous learning as it applies to the use of information technology. Policies and procedures are not static, particularly in a dynamic domain where innovation and change are constant. – An Educator’s Guide to Internet and Social Media Law in the Classroom, Nadya Tymochenko and Gillian Tuck Kutarna, 2018
Innovation and rapid change are indeed constants in high tech, which can create the uncertainty and unintended consequences I referred to earlier. The framework in the Educator’s Guide is, however, a very good place to start for school boards looking to protect student privacy, and themselves, as they increasingly roll out new technology and, soon enough, AI.
What’s next in the K-12 EdTech conversation? Something that has nothing to do with compliance or legal regulations. Technology in the classroom changes the relationship between teacher and student: it inserts a screen where a quiet conversation and guidance used to be. That shift matters far more than any policy or regulation ever will. Leading schools will spend as much time researching this relational impact as they do developing policies, in order to truly leverage technology to create better learning environments and outcomes for students.