New Report Examines Use of Big Data in Ed

Even as "big data" is helping researchers better understand why some students seem to thrive while others don't, a growing backlash by parents and policymakers could hamper research efforts. On one side are concerns about "privacy breaches, hacking, the use of data by commercial software developers for marketing purposes and the possibility that sensitive information ... might limit future opportunities for students," as a new report from the National Academy of Education explained. On the other side is the potential for improving educational outcomes through the linkage of data generated by interactive education programs and administrative data.

The report, "Big Data in Education," summarizes the findings of a recent workshop held by the academy, in which the focus was to review the benefits of educational research using modern data systems, the risks to the privacy of families and students, and technical and political solutions for "maximizing benefits and minimizing risks." The authors also ponder how to balance the benefits of educational research and student privacy: Most of the burden appears to be on researchers themselves to better educate members of the public and others about what they're doing and why.

According to the report, laws protecting student privacy are on the rise. First, there are the three federal laws: the Family Educational Rights and Privacy Act (FERPA), the Children's Online Privacy Protection Act (COPPA) and the Protection of Pupil Rights Amendment (PPRA). Beyond those, over the last four years, 49 states and the District of Columbia have introduced 410 bills related to student data privacy, and 36 states have passed 85 new education data privacy laws. Also, since 2014, 19 states have passed laws that in some way address the work done by researchers. "While some of these recent laws have fairly reasonable requirements concerning the governance structures for the collection and storage of data, as well as transparency provisions aimed at ensuring better communications with parents," the report asserted, "others set restrictions on the format of data that can be made available to researchers, insist on parental permission for any study using data or ban the collection of certain types of data or certain data uses." For example, some of the new regulations require parental permission to use data already collected or prohibit the use of data for predictive analytics. Some of the laws impose penalties "to enhance the accountability of data users, and in some cases researchers specifically."

What's a researcher to do? The report offers several recommendations for data users to keep in mind.

First, researchers need to get better at communicating about their projects, especially with non-researchers. People "are more likely to share their personal information when they see prior positive results and are told of the current uses of their data," the report noted. One approach for gaining trust "from parents, advocates and teachers" uses the acronym CUPS:

  • Collection: What data is collected by whom and from whom;
  • Use: How the data will be used and what the purpose of the research is;
  • Protection: What forms of data security protection are in place and how access will be limited; and
  • Sharing: How and with whom the results of the data work will be shared.

Second, researchers must pin down how to share data without making it vulnerable to theft. The working group recommended a multi-tiered approach, in which data is assessed against a "data continuum." On one end is "deidentified or aggregated data that are publicly available"; on the other end is "individual identified data that are highly protected, such as through the use of data centers, memoranda of understandings between data collectors/repositories and institutions, and [virtual private network] access."

Third, researchers should build partnerships of trust and "mutual interest" around their work with data. Those alliances may involve education technology developers, local and state education agencies, and data privacy stakeholders.

Along with the summary report, the results of the workshop, including videos from the sessions and resource lists, are maintained on a page on the Academy's website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
