In the Era of Remote Learning, It's Time for Colleges to Update Privacy Practices

As the pandemic continues, what's happening to all the data collected by the programs colleges and universities are using to deliver remote learning? That's the question explored in a new report published by the think tank New America.

As "Privacy Considerations in Higher Education Online Learning" pointed out, all of the various programs in use right now generate repositories of student data. While institutions may use the data for analytical purposes, its existence also offers the potential for misuse. For example, what happens to the data accumulated by learning management systems when an LMS company changes hands? Is it beholden to the contractual obligations of the original company?

Or what about the risks created when students "invite" others into their homes during virtual video classes? As author Chris Sadler pointed out, "A great number of personal details can often be gleaned through the presence of other people, personal objects, photographs and calendars. Without proper security and privacy controls, other parties could potentially access these recordings." International students from countries with censorship laws could "face prosecution for comments" they make in class, and undocumented immigrants could "face immediate threats" as a result of information collected during online sessions.

Remote proctoring brings its own set of privacy risks. A proctoring service may capture multiple aspects of a student's test-taking session, including every keystroke, mouse movement and even eye movement. Students may be put through facial recognition and ID checks, and the proctoring company can access and control the computer the student is using. While some proctoring services run without human intervention, others rely on proctors who are themselves working remotely. In either case, the data collected is open to misuse.

Online program managers (OPMs), companies that provide distance learning programs for schools and often handle recruiting and marketing as well, collect personal information from student applications that may be "repurposed for marketing." A student's data may even be shifted from one university to another, depending on the type of contract the OPM has with each.

Regulations such as the Family Educational Rights and Privacy Act (FERPA) impose restrictions on how schools handle the student data in their possession, as well as on what education technology vendors can and can't do with school data. But the specifics are thin, according to Sadler. FERPA doesn't spell out policies for data retention or deletion, and it doesn't cover information that has been "de-identified," since that is no longer considered personally identifiable; schools may do with it what they like. The problem is that, in skilled hands, anonymized data can be "re-identified."
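To see why "de-identified" is a weaker protection than it sounds, consider the classic linkage attack: joining an anonymized dataset with a public one on shared quasi-identifiers such as ZIP code, birth date and gender. The sketch below uses entirely hypothetical data and column names to illustrate the general technique; it is not drawn from the report.

```python
import pandas as pd

# Hypothetical "de-identified" LMS export: names removed, but
# quasi-identifiers (ZIP code, birth date, gender) remain.
deidentified = pd.DataFrame({
    "zip": ["20001", "20001", "20009"],
    "birth_date": ["1999-04-12", "2000-11-03", "1998-07-21"],
    "gender": ["F", "M", "F"],
    "pages_viewed": [412, 87, 256],
})

# Hypothetical public record (e.g., a voter roll or campus
# directory) with the same quasi-identifiers plus names.
public_record = pd.DataFrame({
    "name": ["Alice Example", "Bob Example"],
    "zip": ["20001", "20009"],
    "birth_date": ["1999-04-12", "1998-07-21"],
    "gender": ["F", "F"],
})

# Joining on the quasi-identifiers re-attaches names to the
# "anonymized" activity data -- the records are re-identified.
reidentified = deidentified.merge(
    public_record, on=["zip", "birth_date", "gender"]
)
print(reidentified[["name", "pages_viewed"]])
```

With only three shared attributes, each merged row ties a named individual back to supposedly anonymous activity data, which is why researchers treat de-identification alone as insufficient.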

What's needed, Sadler suggested, is for institutions and the vendors they work with "to do more than comply with the bare minimum protections required by law." Among the questions they need to settle: how much data is collected in the first place (he suggests as little as possible), how it's used, and how long it's retained.

As the reliance on online learning continues, Sadler advised, it's time for schools to "update their privacy policies and vendor contracts" along with "their own privacy practices and principles."

The complete report is openly available on the New America website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
