
Imagining the Multisensory Classroom

David J. Staley
The Ohio State University

In 1981, my senior year of high school, I took a course called “Computer Math,” which used one of the recently developed desktop computers, a Tandy TRS-80. This advanced class was intended for a few of us math and science geeks. No one in the class imagined that computers more powerful than our Tandy would become so important to the general curriculum that even kindergarteners would have access to them and be required to master them. This has been a recurring theme over the past few decades: technologies that were once on the periphery have moved to the core of the educational experience. As these technologies become more important to our ideas about education, they similarly transform the shape and contours of our classrooms and learning spaces.

Like the computer a generation ago, technologies that are currently on the periphery of education – such as 3-D printers, scent domes, voice in/voice out receivers, and haptic interfaces – are poised to move to the center of our educational experiences, and in the process will transform our educational practices, from grades P-20. This essay urges readers to begin now to imagine how these new technologies will reshape our learning spaces, so that they can prepare their long-term strategies for information technology and physical infrastructure.

Stephen Acker and Michael Miller observe that in order “to justify the substantial investment in bricks-and-mortar construction, colleges and universities must design and continually renew the physical spaces in which students learn.” While the buildings that house these learning spaces are designed to last decades, “campus planners are challenged to build flexible interior spaces so that buildings with 100-year life spans can continually adapt to new generations of learners and our new discoveries of how people learn.” When planning for the future, therefore, we must look not only to the learning spaces we must build today, but to the anticipated shape of those learning spaces in 20 years, in 40 years, and further down the line.

Here I want to imagine the “multisensory learning spaces” of the next 15-20 years. By “multisensory,” I mean two things: a significant change in the types of digital content that will be available, and a change in the ways in which teachers and learners will interact with that content. In a multisensory learning environment, images, sound, touch and movement, and even smell will be as important as written or spoken language as ways to represent information and knowledge. The technologies that can deliver such multisensory content are currently on the periphery of general education, but will soon move to the center.

We already can see evidence of this shift toward “sensory awareness” in the culture at large. The marketing consultant Martin Lindstrom observes that branding and advertising are becoming more “sensory oriented.” “Sensory branding” involves not just appeals to the visual “look” of a product, but the way it feels, smells, and sounds; each is a distinctive part of the brand. (Think of the scent worn by the flight attendants on Singapore Airlines; passengers associate that scent with pleasant memories of the flight. This scent, it should be noted, has in fact been trademarked.) Indeed Muzak – the company that brought us ambient musical sounds – now offers ambient scents. The company president refers to these as “aroma marketing” and “sensory branding.”

I would argue here that the senses reach not only our feelings and emotions and our aesthetic sense, but our intellect as well. “All knowledge,” wrote Denis Diderot, “derives ultimately from the senses.” Yet education has since the Enlightenment involved narrowing the sensory channels through which we acquire knowledge and information. Before the computer revolution in higher education, the acquisition of knowledge was largely left to our aural sense (through oral lectures) or through sight (especially through the sight-poor channel of printed text).

The “multimedia” revolution in education has expanded the range of sights and sounds we use in the classroom to include information visualizations and rich image and sound files. But medical neuroscientist Dave Warner believes that traditional forms of information representation have been “perceptually deficient,” meaning that even multimedia digital content neglects “the extraordinary capacity of our brain to capture and process information from [all of] our senses.” As the designer John Thackara has observed, “In reaction to the limited bandwidth of technology-enhanced vision, [today’s] ecological thinkers emphasize that our senses – taste, smell, sight, hearing, touch – are the fundamental avenues of connection between the self and the world.” All the senses are potential information receivers and knowledge receptors.

This emerging awareness of the cognitive importance of all of our senses is finding expression in a number of technologies. New kinesthetic interfaces with computers – as popularized today by the game Dance Dance Revolution – will alter what we mean by “physical education.” No longer confined to sports and games or even to dance (that is, aesthetic movement), the educated and practiced use of movement itself will emerge as a way to access and communicate information and knowledge. (Something like the way Tom Cruise’s character in Minority Report summons information by moving his arms and his body. That is, not just the act of using his hands, but the formalized manner in which he uses them.)

Many immersive virtual reality environments employ haptic interfaces, which allow users to touch and feel information (consider a scenario in which a student uses virtual haptic reality to climb inside the structure of a molecule, and attempts to physically break apart the attraction between molecules as a way to physically experience van der Waals forces). 3-D printing and rapid prototyping tools, currently used in engineering, can also be used to make tactile representations of mathematical objects or geographic maps or sculpture (imagine having students feel a non-Euclidean space or feel the Alps, or feel the tactile properties of a model of Michelangelo’s David, an object typically held behind red velvet ropes).
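Turning a mathematical object into something students can hold is already within reach of a short script. The sketch below – a minimal, illustrative example, not any particular vendor’s toolchain – samples a saddle surface (a shape with the negative curvature of a non-Euclidean space) on a grid and writes it out as an ASCII STL file, the format most 3-D printers accept; the function name `surface_to_stl` is hypothetical.

```python
def surface_to_stl(f, xs, ys, path):
    """Tessellate the surface z = f(x, y) over a grid and write an ASCII STL file.

    Each grid cell is split into two triangles. Facet normals are written
    as (0, 0, 0); most slicing programs recompute them on import.
    """
    def facet(p1, p2, p3):
        lines = ["  facet normal 0 0 0", "    outer loop"]
        for p in (p1, p2, p3):
            lines.append("      vertex %f %f %f" % p)
        lines += ["    endloop", "  endfacet"]
        return "\n".join(lines)

    facets = []
    for i in range(len(xs) - 1):
        for j in range(len(ys) - 1):
            # The four corners of one grid cell, with their surface heights.
            p = {(a, b): (xs[i + a], ys[j + b], f(xs[i + a], ys[j + b]))
                 for a in (0, 1) for b in (0, 1)}
            facets.append(facet(p[0, 0], p[1, 0], p[1, 1]))
            facets.append(facet(p[0, 0], p[1, 1], p[0, 1]))

    with open(path, "w") as fh:
        fh.write("solid surface\n" + "\n".join(facets) + "\nendsolid surface\n")

# Example: a saddle surface, z = x^2 - y^2, over [-1, 1] x [-1, 1].
grid = [i / 10.0 for i in range(-10, 11)]
surface_to_stl(lambda x, y: x * x - y * y, grid, grid, "saddle.stl")
```

The resulting `saddle.stl` can be sent to a rapid prototyping machine as-is; swapping in a different `f` yields a different tactile object.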

With data sonification technologies, tables of numbers can be represented as sound, revealing patterns in those data through changes in pitch and volume (imagine having students hear statistical data; the “music” produced would be an abstract but meaningful symphony of sound). One company – TriSenx – is producing a “scent dome,” an interface that converts digital data into different smells. Such tools can be used to represent data, not just replicate natural smells: imagine an olfactory economics classroom that releases a rose scent when the stock market rises and a lemon scent when prices fall (or imagine having students smell census data). An important feature of these applications is that they represent information that we usually do not perceive as having sensory form: geography as touch, number as scent, chemistry as kinesthetic movement.
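The core of data sonification is simply a mapping from numbers to pitches. The sketch below – a hypothetical illustration, not TriSenx’s or any sonification product’s actual method – maps a data series linearly onto a frequency band, so that rising data is heard as rising pitch; the function name `sonify` and the default band of 220-880 Hz (two octaves above A3) are assumptions.

```python
def sonify(values, low_hz=220.0, high_hz=880.0):
    """Map a series of numbers onto pitches in a fixed frequency band.

    The minimum value maps to low_hz, the maximum to high_hz, and
    everything in between is interpolated linearly, one tone per datum.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # flat data: avoid division by zero
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]

# Example: monthly stock prices; the upward trend becomes a rising melody.
prices = [100, 104, 98, 110, 125, 121, 140]
frequencies = sonify(prices)
```

Feeding the resulting frequencies to any tone generator (a synthesizer, or a programmatically written audio file) lets students hear the shape of the data directly.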

At one stage, the PC was an avant-garde technology on the fringe of educational practice. The technologies described above are similarly avant-garde today, but within the next 20 years they will become central features of our classroom environments. IT in the near future will involve technologies that present knowledge and information through an expanded range of our senses. Institutions that understand the pedagogical effectiveness of these new tools will be positioned ahead of this curve, and will be able to refashion the interior spaces of their buildings as “multisensory” learning spaces.

I would imagine that such a multisensory learning space might look like this: in a 30 x 30 room, the keyboard-and-mouse workstations or tablet PCs of today’s learning spaces have been replaced with large, wall-spanning screens that create an immersive visual environment. A scent dome hovers over the class in the middle of the ceiling. “Voice in” devices are placed around the room to receive spoken commands; these alternate with motion sensors that capture the gestures of students manipulating information. In the corner where an ink-jet printer once stood is a 3-D printer or rapid prototyping device.

Within this technologically mediated space, students are scattered around the room in movable desks or around collaborative design tables. Quite apart from the new tools that line the walls and surfaces, new pedagogical and classroom management techniques also will be present. Speech, movement, sound, touch, and sight will all be more formal than in today’s classrooms, each tailored toward receiving and transmitting knowledge and information. Does this sound too expensive, too futuristic? Perhaps, but compare today’s computer-supported classroom to the traditional classroom of 1986. I believe prices for these new technologies will continue to fall, with more information available for manipulation in multiple modes. Our greater challenge will be developing new pedagogies for using this sensorium to promote learning.

David J. Staley, Ph.D. is Principal of The DStaley Group, a futuring and educational technology consulting firm. He is the Director of the Goldberg Program for Excellence in Teaching at The Ohio State University, Executive Director of the American Association for History and Computing and the President of the Columbus chapter of the World Future Society. He can be reached at [email protected].
