Using Kinect Sensors and Facial Recognition in the Classroom

A Carnegie Mellon project is experimenting with inexpensive sensors and facial recognition technology to help improve instruction.

Large lecture classes may go through the content too quickly for the typical student to understand. That's why so many schools follow the practice of breaking the class cohort into smaller sections led by teaching assistants. This more personalized attention gives students a chance to ask questions, get in-depth explanations and practice what they're learning.

While TAs are intended to help students understand the material, their teaching skills vary and they come to the job with widely different backgrounds. So what students get out of their particular sections may not be as useful as it could be.

A project at Carnegie Mellon University promises to change that with the use of sensors in the classroom hooked to software to help TAs — especially those from other countries — refine their teaching skills in STEM courses. Computer-Aided Noticing and Reflection (CANAR) is the project of Amy Ogan, an assistant professor in the Human Computer Interaction Institute, and Ph.D. student David Gerritsen. The work is being supported by a $174,000 National Science Foundation grant.

Amy Ogan (photo courtesy of David Gerritsen)

As Ogan explained the problem, "Many of our teaching assistants in universities today come from a wide variety of cultures. They haven't all grown up in the U.S. educational system. So we're looking at exploring how to support them, not just as novice teachers in the classroom, but also in new approaches or approaches they may not have taken previously in their teaching experiences."

The ultimate goal of the project is to support improved teaching and learning in university classrooms by bridging cultural divides between students and their teachers.

Gathering Audio Data

The setup is a fairly simple one: The two researchers place a couple of $100 Kinect sensors in the classroom, one on the left side and the other on the right, between the teacher and the students. Each sensor includes a 1080p high-definition camera, a microphone array for capturing sound, and motion-sensing infrared technology, and is connected to a laptop computer. Throughout the class session, that computer records the activity fed by the sensors.
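
For readers curious about the mechanics, the capture side can be approximated with off-the-shelf tools. The sketch below is not the CANAR code itself; it simply records a class session from two microphones into separate audio files, and the device indices, sample rate and file names are assumptions.

```python
import sounddevice as sd
import soundfile as sf

# Illustrative sketch only: log audio from two classroom sensors to disk for
# the length of a class session. Device IDs, sample rate and paths are assumed.
SAMPLE_RATE = 16_000

def open_recorder(device_index: int, out_path: str):
    """Open an input stream that appends incoming audio blocks to a WAV file."""
    wav = sf.SoundFile(out_path, mode="w", samplerate=SAMPLE_RATE, channels=1)

    def callback(indata, frames, time_info, status):
        wav.write(indata)  # append each captured block to the file

    stream = sd.InputStream(samplerate=SAMPLE_RATE, channels=1,
                            device=device_index, callback=callback)
    return stream, wav

if __name__ == "__main__":
    left, left_wav = open_recorder(1, "left_sensor.wav")    # assumed device IDs
    right, right_wav = open_recorder(2, "right_sensor.wav")
    with left, right:  # streams record in the background until Enter is pressed
        input("Recording both sensors. Press Enter when class ends...")
    left_wav.close()
    right_wav.close()
```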

Right now the focus is on audio data, which is being used in two ways. First, the system signals the TA when he or she is talking too much: A big red screen flashes at the instructor as a warning, or a green light signals that all is well.
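
To give a rough sense of how such a signal could work, the sketch below tracks continuous speech with a simple loudness threshold and flips from green to red after an assumed limit. It is an illustration only: it cannot tell the teacher's voice from a student's, and the thresholds are made up.

```python
import numpy as np
import sounddevice as sd

# Hypothetical real-time "talking too much" monitor; all thresholds are assumed.
SAMPLE_RATE = 16_000
BLOCK_SECONDS = 0.5
ENERGY_THRESHOLD = 0.01      # assumed RMS level that counts as someone speaking
TALK_LIMIT_SECONDS = 5 * 60  # assumed limit before the red warning fires

continuous_talk = 0.0        # seconds of uninterrupted speech so far

def callback(indata, frames, time_info, status):
    global continuous_talk
    rms = float(np.sqrt(np.mean(indata ** 2)))
    if rms > ENERGY_THRESHOLD:
        continuous_talk += BLOCK_SECONDS
    else:
        continuous_talk = 0.0  # any quiet block resets the counter
    state = "RED" if continuous_talk > TALK_LIMIT_SECONDS else "GREEN"
    print(f"\r{state}: {continuous_talk:5.1f}s of continuous talk", end="")

with sd.InputStream(samplerate=SAMPLE_RATE, channels=1,
                    blocksize=int(SAMPLE_RATE * BLOCK_SECONDS),
                    callback=callback):
    input("\nMonitoring... press Enter to stop.\n")
```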

Then after class, an analysis is done to examine aspects such as "how often the teacher is talking compared to the students or how long the teacher pauses after they ask a question in order to enable student participation," said Ogan. Getting the right mix of student-teacher interaction is a simple thing, she noted, but can be difficult for novice teachers to do. "To ask a question and feel comfortable standing at the front of the classroom waiting for students to answer is really hard."
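
Measures like these are straightforward to compute once the recordings have been labeled by speaker. The following sketch, using an assumed segment format rather than the project's actual data model, derives the teacher/student talk ratio and the wait time after each teacher question.

```python
from dataclasses import dataclass

# Hypothetical after-class analysis over speech segments already labeled by
# speaker (e.g., by diarizing the recordings). The Segment format is assumed.
@dataclass
class Segment:
    speaker: str       # "teacher" or "student"
    start: float       # seconds from the start of class
    end: float
    is_question: bool = False

def talk_ratio(segments):
    """Fraction of all speech time that belongs to the teacher."""
    teacher = sum(s.end - s.start for s in segments if s.speaker == "teacher")
    student = sum(s.end - s.start for s in segments if s.speaker == "student")
    return teacher / (teacher + student) if (teacher + student) else 0.0

def wait_times_after_questions(segments):
    """Silence between the end of each teacher question and the next speech."""
    ordered = sorted(segments, key=lambda s: s.start)
    return [ordered[i + 1].start - seg.end
            for i, seg in enumerate(ordered[:-1])
            if seg.speaker == "teacher" and seg.is_question]

if __name__ == "__main__":
    demo = [Segment("teacher", 0, 290, is_question=True),
            Segment("student", 296, 320),
            Segment("teacher", 320, 600)]
    print(f"Teacher talk ratio: {talk_ratio(demo):.0%}")
    print(f"Wait times after questions (s): {wait_times_after_questions(demo)}")
```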

Kinect sensors monitor the audio data for a class and signal the instructor when he or she is talking too much. (photo courtesy of David Gerritsen)

The research team is trying to figure out how to deliver real-time feedback in a form that isn't a distraction to the TA and that can be used in the class while the teaching is going on. Because she's a teacher herself, Ogan brings that perspective to the initiative. As she explained, "I can try these things out or think about how they play out in my own class."

That's where the red screen idea came from: It turned out to be one of the simplest solutions that's also the least disruptive. As Ogan noted, "That's a really good reminder to me: Oh, man, I've just been going on and on for the last 15 minutes. I gotta stop and let my students engage and participate."

Part of the experiment is also trying to figure out the optimal amount of time — five minutes? 15 minutes? — for a TA to talk before students' attention begins to wander.

Next Up: Facial Recognition

Beginning this summer, the program will experiment with analyzing video data as well. The researchers will take advantage of facial recognition technology created by others in the university's CyLab Biometrics Center, which is currently being tested by social networking sites to help people tag photos. "It's the same sort of principle that applies in the classroom," said Ogan. "It's just a little more difficult because now there are 20 or 30 faces that it's looking for as opposed to three."

The software will detect facial expressions, posture and other features from the students in the classroom: "Are their faces and their eyes pointed forward so they're looking at the teaching assistant, as opposed to looking down at their desk or around at other students?" Having that information, Ogan said, "tells us something about [whether they're] engaged in the classroom and listening to what's happening."
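
One crude way to approximate that signal, for illustration only: a frontal-face detector tends to fire only when a face is turned toward the camera, so the share of expected students it finds can stand in for "eyes forward." The sketch below uses OpenCV's stock detector rather than the CyLab technology the project will draw on, and the section size and camera index are assumptions.

```python
import cv2

# Rough engagement proxy: count frontal faces (faces turned toward the camera)
# relative to an assumed class size. Not the CyLab Biometrics Center system.
EXPECTED_STUDENTS = 25   # assumed section size

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def attention_fraction(frame) -> float:
    """Fraction of expected students whose faces appear oriented toward the front."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return min(len(faces), EXPECTED_STUDENTS) / EXPECTED_STUDENTS

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # assumed camera index
    ok, frame = cap.read()
    if ok:
        print(f"Estimated attention: {attention_fraction(frame):.0%}")
    cap.release()
```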

The researchers are considering adding a third sensor in order to gain a "face-on view." But the goal will remain, Ogan emphasized, to "stay small and manageable and inexpensive, as opposed to having to completely redesign all of your classrooms to put smart technologies everywhere."

After the class is over, she added, the TA will be able to access the data to see how well he or she performed. "They can see only 10 percent of the class was paying attention yesterday or they only got participation five minutes out of the hour-long class," explained Ogan.
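
Such a summary might look something like the sketch below, which turns assumed per-minute attention estimates and student-speech durations, like those produced by the earlier sketches, into the kind of report Ogan describes. The inputs and field names are illustrative.

```python
# Hypothetical end-of-class summary; inputs are assumed per-minute attention
# fractions and the lengths (in seconds) of student speech turns.
def session_report(attention_by_minute, student_speech_seconds, class_minutes=60):
    avg_attention = sum(attention_by_minute) / len(attention_by_minute)
    participation_minutes = sum(student_speech_seconds) / 60
    return (f"{avg_attention:.0%} of the class appeared attentive; "
            f"students spoke for {participation_minutes:.1f} of "
            f"{class_minutes} minutes.")

# e.g., prints "10% of the class appeared attentive; students spoke for 5.0 of 60 minutes."
print(session_report([0.1] * 60, [300.0]))
```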

Improving Teaching

Ogan and Gerritsen are writing up a paper on their work that they hope to submit for peer review in the fall. Gerritsen is also developing an app that may be released as open source when it's ready. What the researchers haven't determined yet is the best form factor for the technology: a program that runs on the classroom laptop when the TA begins showing slides; a smartphone app that flashes the red screen; or one that simply vibrates the phone in the TA's pocket.

By the fall, Ogan hopes to increase the number of TAs testing CANAR from 10 to 40. So far, those who have tried it love the technology, she said. They're getting the "real-time red screen in the classroom" and receiving teaching support from Ogan's team in between their classes.

As she explained, "We're getting them to reflect on how their previous class went and we're helping them prep for their next class, but in ways that only take five or 10 minutes to do. Our TAs are telling us that they're doing it on their mobile phone, on the bus, on the way to school. So it's really fitting into their lifestyle in a way that trying to take a five-hour-long class on teaching practices doesn't at the moment."
