New Report Examines Use of Big Data in Ed

Even as "big data" is helping researchers better understand why some students seem to thrive while others don't, a growing backlash by parents and policymakers could hamper research efforts. On one side are concerns about "privacy breaches, hacking, the use of data by commercial software developers for marketing purposes and the possibility that sensitive information ... might limit future opportunities for students," as a new report from the National Academy of Education explained. On the other side is the potential for improving educational outcomes through the linkage of data generated by interactive education programs and administrative data.

The report, "Big Data in Education," summarizes the findings of a recent workshop held by the academy, which focused on reviewing the benefits of educational research using modern data systems, the risks to the privacy of families and students, and technical and political solutions for "maximizing benefits and minimizing risks." The authors also ponder how to balance the benefits of educational research against student privacy: Most of the burden appears to fall on researchers themselves to better educate members of the public and others about what they're doing and why.

According to the report, laws protecting student privacy are on the rise. First, there are the three federal laws: the Family Educational Rights and Privacy Act (FERPA), the Children's Online Privacy Protection Act (COPPA) and the Protection of Pupil Rights Amendment (PPRA). Beyond those, over the last four years, 49 states and the District of Columbia have introduced 410 bills related to student data privacy, and 36 states have passed 85 new education data privacy laws. Also, since 2014, 19 states have passed laws that in some way address the work done by researchers. "While some of these recent laws have fairly reasonable requirements concerning the governance structures for the collection and storage of data, as well as transparency provisions aimed at ensuring better communications with parents," the report asserted, "others set restrictions on the format of data that can be made available to researchers, insist on parental permission for any study using data or ban the collection of certain types of data or certain data uses." For example, some of the new regulations require parental permission to use data already collected or prohibit the use of data for predictive analytics. Some of the laws impose penalties "to enhance the accountability of data users, and in some cases researchers specifically."

What's a researcher to do? The report offers several recommendations for data users to keep in mind.

First, researchers need to get better at communicating about their projects, especially with non-researchers. People "are more likely to share their personal information when they see prior positive results and are told of the current uses of their data," the report noted. One framework for gaining trust "from parents, advocates and teachers" uses the acronym CUPS:

  • Collection: What data is collected by whom and from whom;
  • Use: How the data will be used and what the purpose of the research is;
  • Protection: What forms of data security protection are in place and how access will be limited; and
  • Sharing: How and with whom the results of the data work will be shared.

Second, researchers must pin down how to share data without making it vulnerable to theft. The working group recommended a multi-tiered approach, in which data is assessed against a "data continuum." On one end is "deidentified or aggregated data that are publicly available"; on the other end is "individual identified data that are highly protected, such as through the use of data centers, memoranda of understandings between data collectors/repositories and institutions, and [virtual private network] access."

Third, researchers should build partnerships of trust and "mutual interest" pertaining to their work with data. Those alliances may involve education technology developers, local and state education agencies, and data privacy stakeholders.

Along with the summary report, the results of the workshop are being maintained on a page within the Academy's website, including videos from workshop sessions and resource lists.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
