Designing Higher Ed Data Visualizations: Aim Deeper to Gain Knowledge and Understanding

To gain more insight from analytics, consider whether your dashboards merely present basic facts or foster a true understanding of the patterns within your data.


There are occasions as higher education professionals when serendipity smiles on us: when the right concept can be explored with students at a maximally impactful moment. I lived this a few years ago. Nearly a decade into my teaching career and wrapping up my doctoral coursework, I was introduced to Grant Wiggins and Jay McTighe's foundational text on backwards design, Understanding by Design. The concept that struck me as novel was the distinction between "knowledge" and "understanding."

At the risk of oversimplifying, "knowledge" is the domain of verifiable, coherent facts. "Understanding" is more inferential; it is the pattern or theory that gives the facts their coherence. One of my students described the distinction aptly: He knew how to throw a football well but had no understanding of the physiology or physics involved. When it comes to higher education insights, going beyond knowledge to an understanding of the data is what drives meaningful outcomes.

Rote Learning Versus Transferability

If "doing something correctly" is taken off the table as a primary form of evidence, then how can one demonstrate understanding? The answer here is transferability into new and sometimes complex settings. Again, my students provided dozens of examples to illustrate this concept. They pointed to high marks they had earned in their high school careers for demonstrating knowledge in math, science, writing and foreign languages — marks which they believed had been achievable through rote memorization, task execution and rule compliance. Drawing a distinction between knowledge and understanding gave these students a framework to describe the challenges they were now facing in their current university courses. Their struggles in college weren't due to a character flaw or a lack of effort on their ends; they simply had not previously been tasked with transferring knowledge across learning domains or into new contexts.

Aiming for Knowledge and Understanding in Data Visualizations

Allow me to be absolutely clear: knowledge has value. The danger lies in the mistaken presumption that possessing facts is equivalent to possessing understanding. Perhaps nowhere is this presumption more critical, or more tempting, than in the space of data visualization.

Let's walk through a basic example: Imagine that before the fall term started, Hypothetical University (HU) offered incoming students a welcome week. There were 200 events, each with optional attendance, and the goal was to help students successfully transition into university life. At the end of welcome week, HU stakeholders developed a dashboard that visualized event attendance. The temptation to use this dashboard as evidence of their efficacy in giving students an understanding of university life may have been tremendous, but doing so would have been inaccurate. Furthermore, HU's dashboard did not display evidence of knowledge acquisition: Having attended an event is in no way a demonstration of possessing knowledge or understanding.

What HU had done successfully was create a pragmatic dashboard for the operational and logistical planning of future welcome weeks. Staff members could easily view which days and times had well-attended events, what types of events drew an audience, which demographic groups favored what types of content, and so on. These views are all tremendously valuable for coordinating and budgeting, but not for showcasing what was learned: the understanding.
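To make the distinction concrete, here is a minimal sketch of the kind of aggregation such an operational dashboard rests on, written in Python with pandas and using entirely hypothetical column names and records (HU's actual schema is unknown). It answers logistical questions about who showed up and when; nothing in it measures what attendees learned.

```python
import pandas as pd

# Hypothetical attendance records: one row per student per event attended.
# All column names and values are illustrative, not HU's actual schema.
attendance = pd.DataFrame({
    "student_id": [101, 101, 102, 103, 103, 104],
    "event_type": ["academic", "social", "social", "academic", "wellness", "social"],
    "day":        ["Mon", "Tue", "Tue", "Mon", "Wed", "Tue"],
})

# Operational views that support coordinating and budgeting future weeks:
attendees_by_day = attendance.groupby("day").size().rename("attendees")
attendees_by_type = attendance.groupby("event_type").size().rename("attendees")

print(attendees_by_day)
print(attendees_by_type)

# Note what is absent: no column measures what any attendee learned.
# These counts describe logistics, not knowledge or understanding.
```

The table is not wrong; it is simply evidence of attendance, not of knowledge or understanding.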

Design in the Present; Analytic Value in the Future

If we want both knowledge and understanding of the efficacy of our work in higher ed, then we must intentionally design not only our data visualizations, but the entirety of our efforts, around the results we hope to see. In our earlier example, HU's misstep came from not considering the basic stages of backwards design:

Stage 1: Identify the desired results.

At the completion of HU's welcome week:

  • Knowledge: What information should these students now possess that they didn't previously?
  • Understanding: What can these students now explain, interpret or apply? How have their perspectives, empathy or self-knowledge changed?

Stage 2: Determine what constitutes evidence of the desired results.

  • Attendance data alone is not sufficient; to claim that a data visualization displays evidence of knowledge or understanding requires that the data be generated by some form(s) of assessment. HU was using data of convenience, rather than data of intention. (A sketch of what such intentional data might look like follows these stages.)

Stage 3: Plan learning experiences that will generate the knowledge and skills necessary to achieve the result.

  • Without forethought here, we run the risk of providing students with an abundance of information without concrete purpose and at the cost of transferability to other contexts. We are informative, rather than educative.
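For Stage 2, here is the sketch promised above: one way, under the same hypothetical schema, that attendance records could become data of intention. The pre/post assessment instrument and all scores below are assumptions for illustration, not a claim about any real measure.

```python
import pandas as pd

# Hypothetical pre/post assessment of transition readiness (0-100 scale).
# The instrument, tables and scores are assumptions for illustration only.
assessments = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "pre_score":  [55, 60, 48, 70],
    "post_score": [72, 61, 65, 74],
})

# Hypothetical count of welcome week events each student attended.
attendance_counts = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "events_attended": [9, 2, 7, 1],
})

# Joining the two sources lets a visualization relate participation to
# measured change, rather than to attendance alone.
evidence = assessments.merge(attendance_counts, on="student_id")
evidence["gain"] = evidence["post_score"] - evidence["pre_score"]

# A dashboard built on this table can display evidence of knowledge
# acquisition (gain) alongside logistics (events_attended).
print(evidence[["student_id", "events_attended", "gain"]])
```

The design choice that matters is the join: attendance never appears without an accompanying measure of change, which is the kind of evidence Stage 2 demands.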

Identifying and Expanding Upon Patterns

Wiggins and McTighe ask their readers to imagine a black-and-white tiled floor. Metaphorically, each tile represents a discrete fact, a piece of knowledge; understanding is a pattern that can be seen across multiple tiles. Patterns can build upon and spill into one another, and individual viewers may be able to see new patterns where others previously couldn't. Understanding is the ever-evolving interplay across bodies of knowledge.

Data visualization and analytic tools can extend this metaphor into reality: These technologies allow higher ed users to both literally and figuratively see patterns within information. By employing the principles of backwards design, we enhance the credibility of our efforts, we act as effective stewards of our students' time and resources, and, most importantly, we give ourselves a firm data foundation upon which to understand the efficacy of our own actions. It is from this space that insightful analytics can be born, and our efforts as faculty and staff become less reliant on serendipity and more on intentionality.
