Designing Higher Ed Data Visualizations: Aim Deeper to Gain Knowledge and Understanding

To gain more insight from analytics, consider whether your dashboards offer basic facts or dive into true understanding of patterns within your data.


There are occasions as higher education professionals when serendipity smiles on us: when the right concept can be explored with students at a maximally impactful moment. I lived this a few years ago. Nearly a decade into my teaching career and wrapping my doctoral coursework, I was introduced to Grant Wiggins and Jay McTighe's foundational text on backwards design, Understanding by Design. The concept that struck me as novel was the distinction between "knowledge" and "understanding."

At the risk of over-simplifying, "knowledge" is the domain of verifiable, coherent facts. "Understanding" is more inferential; it is the pattern or theory that provides the coherence to the facts. One of my students described the distinction aptly: He knew how to throw a football well but had no understanding of the physiology or physics involved. When it comes to higher education insights, going beyond knowledge to embrace an understanding of data is required for driving meaningful outcomes.

Rote Learning Versus Transferability

If "doing something correctly" is taken off the table as a primary form of evidence, then how can one demonstrate understanding? The answer here is transferability into new and sometimes complex settings. Again, my students provided dozens of examples to illustrate this concept. They pointed to high marks they had earned in their high school careers for demonstrating knowledge in math, science, writing and foreign languages — marks which they believed had been achievable through rote memorization, task execution and rule compliance. Drawing a distinction between knowledge and understanding gave these students a framework to describe the challenges they were now facing in their current university courses. Their struggles in college weren't due to a character flaw or a lack of effort on their part; they simply had not previously been tasked with transferring knowledge across learning domains or into new contexts.

Aiming for Knowledge and Understanding in Data Visualizations

Allow me to be absolutely clear that knowledge has value; the distinction is that we must be careful to avoid the mistaken presumption that possessing facts is equivalent to possessing understanding. Perhaps nowhere is this more critical, or more tempting, than in the space of data visualization.

Let's walk through a basic example: Imagine that before the fall term started, Hypothetical University (HU) offered incoming students a welcome week. There were 200 events offered, each with optional attendance. The goal was to help students successfully transition into university life. At the end of welcome week, HU stakeholders developed a dashboard that visualized event attendance. The temptation to use this dashboard as evidence of their efficacy in providing students with an understanding of university life may be tremendous, but this would be inaccurate. Furthermore, HU's dashboard did not display evidence of knowledge acquisition. Having attended an event is in no way a demonstration of possessing knowledge or understanding.

What HU had done successfully is create a pragmatic dashboard to use for operational and logistical planning of future welcome weeks. Staff members could easily view which days and times had events that were well attended, what types of events drew an audience, what demographic groups favored what types of content, and so on. These are all tremendously valuable for coordinating and budgeting, but not for showcasing what was learned — the understanding.
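The operational questions such a dashboard answers reduce to simple tallies over check-in records. A minimal sketch of that aggregation, using entirely invented event names, days, and categories (HU and its data are hypothetical):

```python
from collections import defaultdict

# Hypothetical welcome-week attendance records: one row per check-in.
# Event names, days, and categories are invented for illustration.
attendance = [
    {"event": "Campus Tour",    "day": "Mon", "category": "orientation"},
    {"event": "Campus Tour",    "day": "Mon", "category": "orientation"},
    {"event": "Library Skills", "day": "Tue", "category": "academic"},
    {"event": "Club Fair",      "day": "Wed", "category": "social"},
    {"event": "Club Fair",      "day": "Wed", "category": "social"},
    {"event": "Club Fair",      "day": "Wed", "category": "social"},
]

def headcount_by(records, key):
    """Tally check-ins by any attribute (day, category, event)."""
    counts = defaultdict(int)
    for row in records:
        counts[row[key]] += 1
    return dict(counts)

by_day = headcount_by(attendance, "day")            # which days drew crowds
by_category = headcount_by(attendance, "category")  # which content types drew crowds
```

Note what this sketch can and cannot do: it answers the logistical questions (when, what type, how many) but contains nothing about what any attendee learned.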

Design in the Present; Analytic Value in the Future

If we want both knowledge and understanding of the efficacy of our work in higher ed, then we must intentionally design not only our data visualizations but the entirety of our efforts. In our earlier example, HU's misstep came from not considering the basic stages of backwards design:

Stage 1: Identify the desired results.

At the completion of Hypothetical U's welcome week:

  • Knowledge: What information should these students possess that they hadn't previously?
  • Understanding: What can these students now explain, interpret or apply? How have their perspectives, empathy or self-knowledge changed?

Stage 2: Determine what constitutes evidence of the desired results.

  • Attendance data alone is not sufficient; to claim a data visualization displays evidence of knowledge or understanding requires that this data be generated by some form(s) of assessment. HU was using data of convenience, rather than data of intention.

Stage 3: Plan learning experiences that will generate the knowledge and skills necessary to achieve the result.

  • Without forethought here, we run the risk of providing students with an abundance of information without concrete purpose and at the cost of transferability to other contexts. We are informative, rather than educative.
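To make Stage 2 concrete: "data of intention" would mean pairing attendance with some assessment of the Stage 1 goals, such as pre- and post-week scores. A hedged sketch of that join, with invented student IDs, scores, and attendance counts (none of this is from HU, which is itself hypothetical):

```python
# Hypothetical "data of intention": pre/post assessment scores on the
# knowledge goals identified in Stage 1, joined to attendance counts.
# Student IDs, scores, and attendance figures are invented for illustration.
pre_scores = {"s01": 40, "s02": 55, "s03": 60}
post_scores = {"s01": 70, "s02": 58, "s03": 85}
events_attended = {"s01": 12, "s02": 2, "s03": 9}

def learning_gains(pre, post):
    """Per-student score change: the evidence layer a dashboard would
    need before claiming anything about knowledge or understanding."""
    return {sid: post[sid] - pre[sid] for sid in pre if sid in post}

gains = learning_gains(pre_scores, post_scores)

# Pair each student's gain with attendance, so a visualization can
# explore the relationship rather than display attendance alone.
evidence = {sid: (events_attended.get(sid, 0), gain)
            for sid, gain in gains.items()}
```

The design choice here is the join itself: attendance becomes one axis of an evidence question ("did attendees learn more?") instead of the entire display.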

Identifying and Expanding Upon Patterns

Wiggins and McTighe ask that their readers imagine a black-and-white tiled floor. Metaphorically, each tile represents a discrete fact, a piece of knowledge; understanding is a pattern that can be seen across multiple tiles. Patterns can build upon and spill into one another, and individual viewers may be able to see new patterns where others previously couldn't. Understanding is the ever-evolving interplay across bodies of knowledge.

Data visualization and analytic tools can extend this metaphor into reality: These technologies allow higher ed users to both literally and figuratively see patterns within information. By employing the principles of backwards design, we enhance the credibility of our efforts, we act as effective stewards of our students' time and resources, and most importantly, we provide ourselves a firm data foundation upon which to understand the efficacy of our own actions. It is from this space that insightful analytics can be born, and our efforts as faculty and staff become less reliant on serendipity, and more on intentionality.
