Designing Higher Ed Data Visualizations: Aim Deeper to Gain Knowledge and Understanding

To gain more insight from analytics, consider whether your dashboards offer basic facts or dive into true understanding of patterns within your data.


There are occasions as higher education professionals when serendipity smiles on us: when the right concept can be explored with students at a maximally impactful moment. I lived this a few years ago. Nearly a decade into my teaching career and wrapping my doctoral coursework, I was introduced to Grant Wiggins and Jay McTighe's foundational text on backwards design, Understanding by Design. The concept that struck me as novel was the distinction between "knowledge" and "understanding."

At the risk of over-simplifying, "knowledge" is the domain of verifiable, coherent facts. "Understanding" is more inferential; it is the pattern or theory that provides the coherence to the facts. One of my students described the distinction aptly: He knew how to throw a football well but had no understanding of the physiology or physics involved. When it comes to higher education insights, going beyond knowledge to embrace an understanding of data is required for driving meaningful outcomes.

Rote Learning Versus Transferability

If "doing something correctly" is taken off the table as a primary form of evidence, then how can one demonstrate understanding? The answer is transferability into new and sometimes complex settings. Again, my students provided dozens of examples to illustrate this concept. They pointed to high marks they had earned in their high school careers for demonstrating knowledge in math, science, writing and foreign languages — marks they believed had been achievable through rote memorization, task execution and rule compliance. Drawing a distinction between knowledge and understanding gave these students a framework to describe the challenges they were now facing in their university courses. Their struggles in college weren't due to a character flaw or a lack of effort on their part; they simply had not previously been tasked with transferring knowledge across learning domains or into new contexts.

Aiming for Knowledge and Understanding in Data Visualizations

Allow me to be absolutely clear that knowledge has value; the distinction is that we must be careful to avoid the mistaken presumption that possessing facts is equivalent to possessing understanding. Perhaps nowhere is this more critical, or more tempting, than in the space of data visualization.

Let's walk through a basic example: Imagine that before the fall term started, Hypothetical University (HU) offered incoming students a welcome week. There were 200 events offered, each with optional attendance. The goal was to help students successfully transition into university life. At the end of welcome week, HU stakeholders developed a dashboard that visualized event attendance. The temptation to use this dashboard as evidence of their efficacy in providing students with an understanding of university life may be tremendous, but this would be inaccurate. Furthermore, HU's dashboard did not display evidence of knowledge acquisition. Having attended an event is in no way a demonstration of possessing knowledge or understanding.

What HU had done successfully was create a pragmatic dashboard for the operational and logistical planning of future welcome weeks. Staff members could easily view which days and times had well-attended events, what types of events drew an audience, which demographic groups favored which types of content, and so on. These views are all tremendously valuable for coordinating and budgeting, but not for showcasing what was learned — the understanding.
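The operational view described above can be sketched in a few lines of pandas. Everything here is illustrative: the table, its column names (`event`, `day`, `category`, `attendees`) and the headcounts are hypothetical stand-ins for HU's attendance records, not a real dataset.

```python
import pandas as pd

# Hypothetical welcome-week attendance records ("data of convenience"):
# one row per event, with a headcount but no measure of learning.
attendance = pd.DataFrame({
    "event": ["Library Tour", "Advising 101", "Club Fair", "Budgeting Basics"],
    "day": ["Mon", "Mon", "Tue", "Wed"],
    "category": ["campus", "academic", "social", "life-skills"],
    "attendees": [42, 85, 310, 23],
})

# Operational views: which days and which event categories drew an audience.
by_day = attendance.groupby("day")["attendees"].sum()
by_category = (
    attendance.groupby("category")["attendees"].sum().sort_values(ascending=False)
)

print(by_category)
```

Useful as this is for scheduling and budgeting, note that nothing in these tables says anything about what attendees learned.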

Design in the Present; Analytic Value in the Future

If we want both knowledge and understanding of the efficacy of our work in higher ed, then we must intentionally design not only our data visualizations but the entirety of our efforts with those ends in mind. In our earlier example, HU's misstep came from not considering the basic stages of backwards design:

Stage 1: Identify the desired results.

At the completion of Hypothetical U's welcome week:

  • Knowledge: What information should these students possess that they hadn't previously?
  • Understanding: What can these students now explain, interpret or apply? How have their perspectives, empathy or self-knowledge changed?

Stage 2: Determine what constitutes evidence of the desired result.

  • Attendance data alone is not sufficient; to claim a data visualization displays evidence of knowledge or understanding requires that this data be generated by some form(s) of assessment. HU was using data of convenience, rather than data of intention.

Stage 3: Plan learning experiences that will generate the knowledge and skills necessary to achieve the result.

  • Without forethought here, we run the risk of providing students with an abundance of information without concrete purpose and at the cost of transferability to other contexts. We are informative, rather than educative.
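The shift from data of convenience to data of intention can be sketched concretely. Assuming a hypothetical pre/post assessment administered around welcome week, joining those scores to attendance lets a dashboard display learning gain alongside participation rather than headcounts alone. All column names and scores below are invented for illustration.

```python
import pandas as pd

# Hypothetical assessment results ("data of intention"): a short
# pre/post instrument measuring what students know about university life.
assessments = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "pre_score": [40, 55, 35, 60],
    "post_score": [70, 60, 38, 85],
})

# Attendance records for the same (hypothetical) students.
attendance = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "events_attended": [12, 5, 1, 15],
})

# Join the two sources and compute each student's learning gain.
merged = assessments.merge(attendance, on="student_id")
merged["gain"] = merged["post_score"] - merged["pre_score"]

# Evidence-oriented view: gain next to participation, not attendance alone.
print(merged[["student_id", "events_attended", "gain"]])
```

A visualization built on this joined table can at least claim to show evidence of knowledge acquisition; whether it shows understanding still depends on what the assessment itself asks students to transfer.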

Identifying and Expanding Upon Patterns

Wiggins and McTighe ask their readers to imagine a black-and-white tiled floor. Metaphorically, each tile represents a discrete fact, a piece of knowledge; understanding is a pattern that can be seen across multiple tiles. Patterns can build upon and spill into one another, and individual viewers may see new patterns where others previously couldn't. Understanding is the ever-evolving interplay across bodies of knowledge.

Data visualization and analytic tools can extend this metaphor into reality: These technologies allow higher ed users to both literally and figuratively see patterns within information. By employing the principles of backwards design, we enhance the credibility of our efforts, we act as effective stewards of our students' time and resources, and most importantly, we provide ourselves a firm data foundation upon which to understand the efficacy of our own actions. It is from this space that insightful analytics can be born, and our efforts as faculty and staff become less reliant on serendipity, and more on intentionality.
