Protecting Student Privacy in the Digital Age

Experts recommend guidelines for improving student privacy in educational technology platforms.

The surge in online learning triggered by the COVID-19 pandemic has created a seller’s market for educational technology companies. But long before that upswing, a cultural shift in education had been driving the industry’s digitization. Today, education officials weighing the collection and use of student data must interpret, understand, and comply with the data protection regulations designed to safeguard sensitive student information.

The Family Educational Rights and Privacy Act (FERPA) of 1974 prohibits educational institutions from disclosing “personally identifiable information” about students without parental consent. FERPA applies to all education providers that receive federal funding and is designed to protect students’ printed and digital education records. In 1978, federal lawmakers expanded the protections afforded under FERPA by passing the Protection of Pupil Rights Amendment (PPRA), which gives parents and students the right to opt out of federally funded surveys or evaluations that touch on a range of protected topics.

The regulations issued under FERPA and the PPRA are administered by the U.S. Department of Education and mainly govern the obligations of schools; they do not apply to educational technology, or “ed-tech,” companies. The Children’s Online Privacy Protection Act (COPPA), enacted in 1998 and enforced by the Federal Trade Commission (FTC), reaches further: it prohibits commercial websites, online services, and apps directed at children from collecting personal information from children under the age of 13 without parental consent.

As the use of ed-tech has continued to grow in recent years, industry leaders have looked beyond these three federal privacy laws and turned to self-regulation as a means of protecting student data. In 2014, the Software & Information Industry Association and the Future of Privacy Forum developed the Student Privacy Pledge, a public commitment through which ed-tech companies make accountable statements about their student privacy practices.

Although signing the pledge is voluntary, the FTC can rely on these public commitments to bring civil enforcement actions against any of the pledge’s 400-plus signatories that fail to protect student data. Critics point out, however, that no such action has yet been taken. Some advocates call for traditional student privacy protections to be strengthened to fit the growing digital education landscape.

In this week’s Saturday Seminar, scholars examine gaps in the regulation of student data privacy and suggest ways to better protect student information.

  • Current debates about student privacy overlook increasingly popular online learning platforms that offer learning experiences directly to users, suggest Elana Zeide of the University of Nebraska College of Law and Helen Nissenbaum of Cornell Tech. In an article published in Theory and Research in Education, Zeide and Nissenbaum point out that two types of platforms – Massive Open Online Courses (MOOCs) and Virtual Learning Environments (VLEs) – fall outside the scope of student privacy law because they collect personal data directly from learners without any school intermediation. They argue that MOOC and VLE operators should go beyond compliance with commercial data-use regulations and uphold privacy standards specific to the education sector.
  • In an article published in the Duke Law & Technology Review, Alexi Pfeffer-Gillett of the University of Maryland’s Carey School of Law argues that educational software companies are failing to live up to their student privacy commitments. After analyzing the privacy policies of eight companies that signed the pledge, Pfeffer-Gillett finds that seven violate at least one of the pledge’s core promises. Apple, for example, collects personal information and engages in behavioral targeting of advertisements. Pfeffer-Gillett also observes that companies that have not signed the pledge are not necessarily less compliant with its standards. Instead, he suggests that “the pledge could be more valuable as a public relations tool than a means to actually … make industry improvements.”
  • Anonymizing student data is not, by itself, enough to protect student privacy, argue Elad Yacobson of the Weizmann Institute of Science and several co-authors in an article in the Journal of Learning Analytics. Using machine learning algorithms to analyze and cluster unlabeled records, Yacobson’s team was able to re-identify individuals from anonymized student interaction data, even determining when a select group of gifted children went on a school trip (a minimal sketch of this kind of clustering attack appears after this list). Yacobson and his co-authors conclude that there is no “silver bullet” for privacy in education, arguing that privacy-enhancing technology must be accompanied by clear regulations and greater awareness among educators.
  • In an article in Research and Practice in Technology Enhanced Learning, Tore Hoel of Oslo Metropolitan University and Weiqin Chen of Augusta University examine data sharing through an educational lens. Hoel and Chen propose three principles that educational privacy policies should reflect. First, privacy and data protection should be achieved by negotiating data sharing with individual students. Second, educational institutions should be transparent about their decisions on data access, and that access should meet a standard of necessity. Finally, schools and universities should treat data-sharing negotiations as an opportunity to build data literacy.
  • In a paper in the Virginia Journal of Law and Technology, N. Cameron Russell of the Fordham Center on Law and Information Policy and several co-authors identify a legal and regulatory gap in the sale of student information: existing data protection laws do not cover sales of student information by data brokers. Russell’s team advocates greater transparency in the commercial market for student data. They argue that brokers should be required to follow procedures that promote data accuracy, such as an obligation to notify downstream data users of inaccuracies. They also call for opt-out rights for parents and emancipated students, and they encourage schools to inform students and families about how survey results may be used commercially before administering surveys.
  • In a chapter of the Cambridge Handbook of Consumer Privacy, Elana Zeide of the University of Nebraska College of Law argues that traditional data protection rules are insufficient for students in “an era of big data.” Zeide recommends best practices for educational technology companies to build trust among stakeholders, such as adequate transparency and accountability. She also suggests preserving the traditional expectation that students’ personal information stays within schools and is not sold to for-profit companies.
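
The re-identification risk that Yacobson and his co-authors describe can be illustrated with a small amount of code. The sketch below is not their method; it is a hypothetical example using synthetic data and scikit-learn’s KMeans, assuming that a small group of students shares a distinctive activity pattern (such as a common absence during a school trip) that clustering can isolate from otherwise anonymized logs.

```python
# A minimal, hypothetical sketch of a clustering-based re-identification
# risk, using synthetic data. This is NOT the method from the Yacobson
# et al. study; it only illustrates why anonymization alone can fail.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic "anonymized" logs: events per student per day over 30 days.
# 95 typical students are active throughout; 5 students share an
# identical five-day gap (e.g., a school trip), a correlated signal.
typical = rng.poisson(lam=8, size=(95, 30))
trip_group = rng.poisson(lam=8, size=(5, 30))
trip_group[:, 10:15] = 0  # the shared absence

counts = np.vstack([typical, trip_group])
# Reduce each day to a simple active/inactive flag to suppress noise.
activity = (counts > 0).astype(float)

# Cluster the unlabeled records; no names or IDs are involved.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(activity)

sizes = np.bincount(labels)
small = int(np.argmin(sizes))
print("cluster sizes:", sizes)
print("records in the small cluster:", np.where(labels == small)[0])
```

In this toy setting, the smaller cluster isolates exactly the five students who share the gap. Linking that shared absence to a publicly known event, such as an announced field trip, is what turns an anonymized record into an identified one.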

The Saturday Seminar is a weekly feature that aims to put into written form the kind of content that would be conveyed in a live seminar involving regulatory experts. Each week, The Regulatory Review publishes a brief overview of a selected regulatory topic and then distills recent research and scholarly writing on that topic.

