Research: People Ignore Security Warnings through Habit

Don't be so sure that you pay sufficient attention to the messages your computer delivers warning you about unsafe surfing. An experiment at Brigham Young University in Provo found that users "routinely ignore security warnings." One reason is that we tend to become "habituated" to certain common on-screen messages and overlook them at our peril.

Researchers Bonnie Anderson, Brock Kirwan and Anthony Vance conducted the project to explore how people deal with online security risks. While users declared that they "care" about keeping their computers secure, their behavior suggested otherwise.

For the study, the faculty members first queried students about how they felt about online security. Next, in a "seemingly unrelated task," the students were asked to use their own computers to visit a site and categorize pictures as either animations or photographs, ostensibly to confirm the accuracy of a computer algorithm developed to do the same task.

As the participants worked through the image pages, warnings would randomly appear on the screen informing them of malware issues with the site they were asked to access. If they ignored the message a certain number of times, they were "hacked."

"A lot of them freaked out — you could hear them audibly make noises from our observation rooms," said Vance, an assistant professor of information systems. "Several rushed in to say something bad had happened."

The hack, showing a screen-sized warning message from an "Algerian hacker," wasn't real. Nothing bad actually happened to the computers.

But the researchers took it a step further. Kirwan, an assistant professor in the department of psychology, set up EEG machines to measure brain responses to risk. The team is also using functional magnetic resonance imaging (fMRI) to measure neural activity in the brain's visual processing centers and track how it changes with repeated exposure to warnings. The idea is to explore how "repetition suppression" occurs in the brain in order to develop security warnings that people will actually heed.

"A lot of people don't realize that they are the weakest link in their computer security," said Kirwan. "The operating systems we use have a lot of built-in security and the way for a hacker to get control of your computer is to get you to do something."

The project showed that brain data is a better predictor of security behavior than people's self-reported responses. "With neuroscience, we're trying to understand this weakest link and understand how we can fortify it," noted Vance.

Anderson, an associate professor of information systems, and her colleagues have received a $294,000 grant from the National Science Foundation to continue the research. The findings from the latest experiment were recently published in the Journal of the Association for Information Systems. A draft version of the paper is available online.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.