In the Era of Remote Learning, It's Time for Colleges to Update Privacy Practices

As the pandemic continues, what's happening to all the data collected by the programs colleges and universities use to deliver remote learning? That's the question explored in a new report published by the think tank New America.

As "Privacy Considerations in Higher Education Online Learning" points out, all of these programs generate repositories of student data. While institutions may use the data for analytics, its existence also creates the potential for misuse. For example, what happens to the data accumulated by a learning management system when the LMS company changes hands? Is the new owner bound by the contractual obligations of the original company?

Or what about the risks created when students "invite" others into their homes during virtual video classes? As author Chris Sadler pointed out, "A great number of personal details can often be gleaned through the presence of other people, personal objects, photographs and calendars. Without proper security and privacy controls, other parties could potentially access these recordings." International students from countries with censorship laws could "face prosecution for comments" they make in their classes. Or undocumented immigrants could "face immediate threats" as a result of the information collected during their online classes.

Remote proctoring poses its own set of privacy risks. During testing, proctoring software may capture multiple aspects of a student's situation, including every keystroke and mouse movement and even eye movement. Students may go through facial recognition and ID checks. The proctoring company can also access and control the computer the student is using. While some proctoring services operate without human intervention, others rely on proctors who are themselves working remotely. In either case, that data could be misused.

In the case of online program managers (OPMs), companies that provide distance learning programs for schools and often handle recruiting and marketing for institutions as well, the personal information collected from student applications may be "repurposed for marketing." A student's data may also be shifted from one university to another, depending on the type of contract the OPM has with each.

Regulations such as the Family Educational Rights and Privacy Act (FERPA) impose restrictions on how schools handle the student data in their possession, as well as on what education technology vendors can and can't do with school data. But the specifics are thin, according to Sadler. FERPA doesn't spell out policies for data retention or deletion, nor does it cover information that has been "de-identified," since such data is no longer considered personally identifiable; schools are allowed to do with it what they like. The problem is that, in skilled hands, anonymized data can be "re-identified."
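The re-identification risk the report warns about can be illustrated with a toy "linkage attack": joining a de-identified dataset to a public one on quasi-identifiers such as ZIP code, birth year, and gender. The records, names, and field choices below are entirely hypothetical; this is a minimal sketch of the general technique, not anything drawn from the report.

```python
# Toy linkage attack: "de-identified" grade records are matched to a
# public directory via quasi-identifiers. All data here is hypothetical.

deidentified_grades = [
    {"zip": "10001", "birth_year": 1999, "gender": "F", "gpa": 3.9},
    {"zip": "10001", "birth_year": 2001, "gender": "M", "gpa": 2.4},
]

public_directory = [
    {"name": "Alice A.", "zip": "10001", "birth_year": 1999, "gender": "F"},
    {"name": "Bob B.",   "zip": "10001", "birth_year": 2001, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def reidentify(grades, directory):
    """Attach names to 'anonymous' records whose combination of
    quasi-identifiers matches exactly one public directory entry."""
    matches = []
    for record in grades:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in directory
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # unique match => re-identified
            matches.append((candidates[0]["name"], record["gpa"]))
    return matches

print(reidentify(deidentified_grades, public_directory))
```

In this sketch every "anonymous" record maps to exactly one directory entry, so each student's GPA is recovered by name, which is why simply stripping names from a dataset does not, by itself, make it safe to share.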

What's needed, Sadler suggested, is for institutions and the vendors they work with "to do more than comply with the bare minimum protections required by law." Among the aspects they need to govern: how much data is collected in the first place (he suggests as little as possible), how it's used, and how long it's retained.

As the reliance on online learning continues, Sadler advised, it's time for schools to "update their privacy policies and vendor contracts" along with "their own privacy practices and principles."

The complete report is openly available on the New America website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
