Want To Draw Women into CS and Engineering? Broaden the Stereotypes

The stereotype of the typical computer scientist or engineer as someone who is white or Asian, socially inept, obsessed with technology and almost always male is keeping girls out of those fields, according to a new study from the University of Washington. In a paper published in Frontiers in Psychology, the authors suggest that broadening the stereotype could go a long way toward drawing more women into CS and engineering occupations.

To test their stereotype hypothesis, researchers Sapna Cheryan, Andrew Meltzoff and Allison Master performed an experiment. They brought young women into a room, where each was paired with an actor posing as a computer science major. Six actors were used, three male and three female. Half of the actors dressed to fit the stereotypical image of a computer scientist, sporting glasses and a t-shirt proclaiming, "I code therefore I am," and told test subjects that they enjoyed solitary hobbies such as playing video games. Half of the young women met with these actors. The other half met with actors who didn't fit the stereotype; they dressed in solid-colored t-shirts and cited hobbies such as hanging out with friends.

After the interaction was complete, the researchers reported, participants were asked about their interest in their partner's major and then asked the same questions again two weeks later.

The women who interacted with a stereotypical student "were significantly less interested in majoring in computer science than those who interacted with the non-stereotypical student, and this effect was equally strong regardless of whether the actor was male or female," the report stated. The negative impact of the stereotype also persisted for at least two weeks after the interaction.

As the report noted, "The computer science major's gender mattered less in influencing women's interest in computer science than the extent to which he or she fit current computer science stereotypes."

"People use these images to decide where they fit, where they're going to be successful and what's appropriate for them to pursue," said lead author Sapna Cheryan, an associate professor of psychology. "The first image that comes to mind for many students is the guy who is obsessed with science fiction and technology and not interested in people. Students often think you have to fit that image to be successful at computer science."

Environment also plays a role, the researchers suggested, even when the space is online. In a 2011 study in which Cheryan participated, undergraduates virtually entered two introductory computer science classrooms in Second Life. One contained "stereotypical objects" while the other contained non-stereotypical objects. Women chose to take the class in the stereotypically decorated classroom 18 percent of the time; men chose it 60 percent of the time. That project also found that women expected to perform worse than men in the class with the stereotypical objects. In the non-stereotypical space, women's expectations rose such that women and men expected to do equally well.

The researchers argued in their paper that stereotypes "act as educational gatekeepers, constraining who enters these fields."

However, the stereotypes aren't all bad, they emphasized. A sizable number of students are drawn to the fields because of the stereotypes, including between 20 and 25 percent of women.

They also offered an antidote: broaden the stereotype so that it is more diverse. "Rather than attempting to overhaul current stereotypes, which may deter some men and women, a more effective strategy may be to diversify the image of these fields so that students interested in these fields do not think that they must fit a specific mold to be a successful computer scientist or engineer," the paper stated.

Some schools, they noted, have done just that. For example, the computer science departments at Carnegie Mellon and Harvey Mudd have both increased the proportion of women who are majoring in computer science from about 10 percent to about 40 percent over five years. That's been accomplished, the authors noted, in several ways: through "structural changes" such as modifying recruiting processes; by using diverse role models to expose students to a wide range of applications of computer science; and by modifying an introductory CS course to help students see that it's not a field populated only by "geeky know-it-alls."

"These examples show that efforts to reduce gender disparities in computer science and engineering benefit from actively working to change the culture of these fields, so that they are seen as places where all students are valued and have the potential to be successful," the researchers said.

"It's about making students aware of the diversity in computer science and engineering so that women feel that there is a place for them in the field," added Cheryan. "This research shows that broadening the image of these fields is not only possible, it can be done with some simple changes."

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
