MIT Teaches Wireless Routers to Know How You're Feeling

MIT CSAIL researchers demonstrate a scenario in which the subject's face is neutral, but the EQ-Radio analysis of his heartbeat and breathing shows that he is sad. (photo by Jason Dorfman, MIT CSAIL)

A team of researchers at MIT believes it has created a way to recognize basic emotions using signals transmitted from the closest wireless router. The technique relies on measuring small changes in a person's breathing patterns and heartbeat, and it doesn't require body sensors or facial recognition. The potential applications are far-flung: The technology could be used for medical diagnosis, marketing and entertainment, or even controlling the home or office environment.

As Computer Science and Artificial Intelligence Laboratory (CSAIL) researchers Mingmin Zhao, Fadel Adib and Dina Katabi describe in their paper, "Emotion Recognition Using Wireless Signals," their technique, dubbed "EQ-Radio," transmits a radio frequency (RF) signal and then analyzes its reflections off a person's body to recognize whether he or she is happy, sad, angry or content. Algorithms slice the reflections into individual heartbeats and then analyze small variations in beat length. At the same time, the technology mitigates the impact of breathing in order to emphasize the signals from the heartbeats.
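To make that pipeline concrete, here is a minimal sketch in Python of how one might separate the cardiac component of a sampled reflection trace from the slower breathing motion and then measure beat-to-beat variability. The filter bands, peak-picking approach and function names are illustrative assumptions; this is not the team's actual extraction algorithm.

```python
# A minimal sketch of the beat-segmentation idea, assuming a sampled
# 1-D reflection trace: suppress the slow breathing motion, pick beat
# boundaries, then measure beat-to-beat variability. All parameters
# here are illustrative, not EQ-Radio's.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def inter_beat_intervals(reflection: np.ndarray, fs: float) -> np.ndarray:
    """Return inter-beat intervals in seconds from an RF reflection trace."""
    # Band-pass around typical resting heart rates (0.8-2.0 Hz) to
    # attenuate the much slower breathing motion (~0.1-0.5 Hz).
    b, a = butter(2, [0.8, 2.0], btype="band", fs=fs)
    cardiac = filtfilt(b, a, reflection)

    # Treat prominent peaks as beat boundaries; the refractory distance
    # keeps two detections from landing inside one plausible beat.
    peaks, _ = find_peaks(cardiac, distance=int(0.4 * fs))
    return np.diff(peaks) / fs

def rmssd(ibis: np.ndarray) -> float:
    """Root mean square of successive differences, a standard
    beat-to-beat variability statistic."""
    return float(np.sqrt(np.mean(np.diff(ibis) ** 2)))
```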

According to the researchers, their approach to recognizing emotion is more accurate than facial recognition because the latter "tends to miss subtle emotions and can be easily controlled or suppressed." (Think of a poker face.) Physiological measurements such as ECG and EEG signals, meanwhile, may be more reliable because they're "controlled by involuntary activations of the autonomic nervous system"; however, "the signals require physical contact with a person's body," and therefore "interfere" with the user's experience, which can itself affect the emotional state.

In contrast, the researchers noted, "EQ-Radio can capture physiological signals without requiring the user to wear any sensors, by relying purely on wireless signals reflected off her/his body."

The technique has three components: a radio for capturing the RF reflections; a heartbeat "extraction" algorithm; and a classification system that maps the measured physiological signals to emotional states. The latter can be thought of as an X-Y diagram, with "valence" on one axis to denote positive or negative feelings and "arousal" on the other to denote calm vs. "charged up." For example, anger and sadness are both negative feelings, but anger involves more arousal. In the same way, joy and pleasure are both positive feelings, but joy is normally tied to excitement while pleasure is typically tied to a state of contentment.
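As a toy illustration of that grid, the snippet below places the four emotions in their valence-arousal quadrants. The zero thresholds are invented for illustration; in practice the classifier learns the boundaries from physiological features.

```python
# Toy mapping from the valence-arousal plane to the four emotion labels
# discussed above. The zero thresholds are assumptions, not EQ-Radio's.
def classify_quadrant(valence: float, arousal: float) -> str:
    if valence >= 0:
        # Positive feelings: joy is excited, pleasure is content.
        return "joy" if arousal >= 0 else "pleasure"
    # Negative feelings: anger involves more arousal than sadness.
    return "anger" if arousal >= 0 else "sadness"
```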

While the exact correlations may vary from person to person, the researchers explained, they're still consistent enough that EQ-Radio can detect the appropriate emotion with 70 percent accuracy, even without a baseline in place.

"Just by generally knowing what human heartbeats look like in different emotional states, we can look at a random person's heartbeat and reliably detect their emotions," said Zhao, a PhD student, in an article about the project.

To test the technology, the team recruited 12 subjects, including actors. Each subject selected videos or music intended to evoke each of the four emotions, along with a fifth clip that let the researchers collect a "no-emotion" baseline. Trained on just those five sets of two-minute recordings, EQ-Radio could then classify the person's emotional state among the four emotions with 87 percent accuracy.
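The article doesn't say which classifier the team used, so purely as a sketch of the train-then-predict workflow, here's how those labeled recordings might feed a stock model. scikit-learn's SVC and the feature layout are stand-in assumptions.

```python
# Hypothetical stand-in for the classification stage. Each row of
# `features` summarizes one two-minute recording (for example, the IBI
# statistics sketched earlier); SVC is an assumed model, not the paper's.
import numpy as np
from sklearn.svm import SVC

def train_emotion_classifier(features: np.ndarray, labels: list) -> SVC:
    """Fit on an (n_recordings, n_features) array of heartbeat features."""
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(features, labels)
    return clf

# Usage: model = train_emotion_classifier(X_train, y_train)
#        predicted_emotion = model.predict(X_new)
```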

The technique used to capture the heartbeat's "entire waveform" has applications beyond simple emotion detection, the researchers insisted. In the future, they said, it could be used in non-invasive health monitoring and diagnostic settings.

"By recovering measurements of the heart valves actually opening and closing at a millisecond time-scale, this system can literally detect if someone's heart skips a beat," explained PhD student Adib. "This opens up the possibility of learning more about conditions like arrhythmia, and potentially exploring other medical applications that we haven't even thought of yet."

Katabi, project lead and a professor of electrical engineering and computer science at CSAIL, suggested that the system could be used by movie studios and ad agencies that want to check viewer reactions in real time, or by smart homes that use information about the occupant's mood to adjust the climate controls or open windows for fresh air. The output could serve other purposes as well, she added. "We believe that our results could pave the way for future technologies that could help monitor and diagnose conditions like depression and anxiety."

In October, the team will present its findings at the Association for Computing Machinery's International Conference on Mobile Computing and Networking (MobiCom). A video about the project is available on YouTube.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
