Will the Real Digital Native Please Stand Up?

A decade after the term 'digital native' was first popularized, educators examine what it means today--if anything.



A decade has passed since author, game designer, and educational thought leader Marc Prensky heralded the arrival of a new generation of students whose immersion in information technology distinguished them in fundamental ways from previous generations. Because they had spent their entire lives "surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age," Prensky wrote in a two-part article published in 2001, these "digital natives" were "no longer the people our educational system was designed to teach."

Prensky's widely circulated article served as a clarion call to educators, from K-12 to higher ed, to recognize these emerging differences and adapt to them. Many educators remember that, when "digital native" first entered the lexicon, the term provoked a heated conversation about how they should be addressing the academic needs of students raised in an age of ubiquitous, highly accessible, and swiftly evolving IT--not to mention how they might improve their own use of that technology.

So, 10 years later, has Prensky's concept of a new generation of tech-tuned students held up to the test of time?

It's Not About Age
"I remember being really struck by that article," recalls Helen Chen, a research scientist at Stanford University's (CA) Center for Innovations in Learning. "Since then, I've certainly cited Prensky in some of my work. But the question of whether the technology is actually creating students who are truly different learners has yet to be answered. The problem with Prensky's assumption is that it's based on age--on the idea that, because you were born in a certain era, you must be a certain way. Of course, we know that's not true. Even among your 20-somethings, there are students who are very tech-savvy and those who are not. The term is a generalization, and the reality is much more nuanced."

Another flaw in the eyes of some educators is the fact that Prensky labels everyone else as "digital immigrants," unable to achieve true fluency in the new tech world. "If you limit the conversation to [Prensky's definition of] digital natives, you have limited yourself to barely a quarter of the head count in American higher education," says Kenneth "Casey" Green, founding director of The Campus Computing Project. "'Digital natives' are the most visible group as we think about full-time undergraduate college students, but they're roughly 25 to 30 percent of the population. The other parts of the population--adults, full-time workers, people with families--come to campus with, in some ways, even higher expectations and better technology skills. The reality is, just because you can Google, game, text, and tweet doesn't mean you necessarily have the technology skills for your portfolio."

Eszter Hargittai, associate professor of communication studies and faculty associate of the Institute for Policy Research at Northwestern University (IL), echoes Green's comments--and says she has the research to prove her point.

"I don't think the term was ever valid, or even very useful," she says. "It assumes that older people are worse than younger people when it comes to technology. And it seems to assume that all young people are homogenous when it comes to technology use. Neither of those things is correct."

For about a decade, Hargittai has been studying the social and policy implications of digital media, with a particular focus on how differences in people's web skills influence what they do online. She started her research at about the same time "digital native" began generating its initial buzz.

"My work has shown over the years that there are, in fact, significant differences among people of the same age when it comes to the skill with which they use digital media--and that age is not necessarily a determinant of skill," she says, "'Digital native' assumes a fluency that 'digital immigrants' are lacking. It suggests this grand generational divide, but we don't have enough empirical evidence to support that notion. It's just not that simple."

For her 2007 study, Digital Na(t)ives? Variation in Internet Skills and Uses Among Members of the "Net Generation," Hargittai considered the differences in web-use skills among a universally wired group of more than a thousand college freshmen. She surveyed them on the frequency and diversity of their web use, and indexed for skill on 27 variables. She was able to control for two key variables: age and education. Her conclusion: "While popular rhetoric would have us believe that young users are generally savvy with digital media, data presented in this article clearly show that considerable variation exists even among fully wired college students when it comes to understanding various aspects of internet use."

In other words, Prensky's digital natives are not equally "native." Hargittai's research found differences in information and communication technology skill levels along socioeconomic lines, among races, and even among genders. Higher levels of parental education, being male, and being white or Asian-American were all factors associated with higher levels of skill in using the web. Students of lower socioeconomic status, women, students of Hispanic origin, and African-Americans exhibited lower levels of web know-how than others.

But, contrary to fears of a digital divide that were first voiced during the mid-1990s, the difference in skill levels does not appear to be related to access to technology. "I don't use 'digital divide,' because it's so black-and-white," says Hargittai. "Plus, the term tends to refer mainly to the hardware, and nowadays most kids do have access to that technology."

Nevertheless, Hargittai's study concluded that socioeconomic status is one of the most important predictors of how effectively people incorporate the web into their everyday lives. Those from more privileged backgrounds simply use it in more informed ways for a larger number of activities.

Computer vs. Digital Literacy
Seen through the prism of higher education in 2011, Prensky's definition of tech fluency for those considered digital natives was perhaps too limited. "It's not just about how long people have been using the technology, or how much time they spend with it," says Hargittai. "It's much more about how they're using it and whether they're learning to use it critically and carefully."

Looked at another way, spending 12 hours playing Angry Birds and watching YouTube videos of surfing wipeouts is not the same as evaluating the reliability of source material on sites about the Vietnam War.

Indeed, it might be useful to think of Prensky's definition of digital nativity as the modern tech equivalent of knowing how to type. In the same way that being able to type doesn't make students better writers, understanding how to use tech devices doesn't necessarily make students digitally literate or skilled.

It's a distinction that's certainly not lost on Susan Metros, associate vice provost and associate CIO for technology-enhanced learning at the University of Southern California. Metros teaches courses in digital and multimedia literacy, and has served as principal visual designer on award-winning multimedia projects. She fears that when a term like "digital native" becomes institutionalized, colleges come to expect a uniform level of tech savviness among their incoming freshmen, and may be unprepared for those less "native" students who lack the skills and experience needed to succeed.

"We see that assumption at the university all the time," she notes. "Students are digital natives, right? So they know all about the technology. But the truth is they just know that top layer of the technology. They're digital dependent and digital stimulated. They know how to text messages and upload a video to YouTube, but in general they don't possess the deeper critical thinking skills they need to be truly digitally literate."

To illustrate her point, Metros cites the evolving nature of visual literacy. "Being visually literate used to mean that you could look at a picture and decode it," she explains. "If you looked at a painting, you could understand what you were seeing, maybe in context if you've had some art history. If you looked at a sign, you could read it. But the definition of visual literacy has changed because of technology. Now you have to be able to make images--charts, graphs, presentations. Even more important, you have to be able to understand the ethical implications of images you see and post online. That's not something many students get to on their own, no matter how comfortable and capable they are with the technology. It takes teaching."

At Home on the Web
So was Prensky completely off base in positing a new generation of students whose brains are actually wired to learn in different ways from previous generations? Certainly, there is no clinical evidence to back up any claims about physical changes in the brains of today's traditional-age students. But educators are providing anecdotal evidence of a shift in how students approach learning and education in general.

Metros, for one, acknowledges that access to the web has allowed today's students to evolve a uniquely contextual learning style. "They don't rely on the textbook the way I did when I was a student," she says. "They go to a website and look something up, then they link that instantly to something else, and before they know it they have this broad contextual understanding of the topic. It's actually a very big difference, when you think about it. Ten years ago, students were sort of trapped in the textbook."

And, unlike even recent generations of internet-connected students, adds Metros, the current crop isn't satisfied with merely consuming web content; they expect to produce some of it themselves. Becoming "prosumers" (a blend of "producer" and "consumer") isn't exclusive to digital natives, but it's a very "native" concept.

Ten years after Prensky defined his digital natives essentially as savvy users of a list of hardware devices "and all the other toys and tools of the digital age," it is the kind of web fluency Metros cites that truly defines what it means to be a digital native today, says Ali Jafari, director of research and advanced applications in the Office of Integrated Technologies at Indiana University-Purdue University Indianapolis.

"It is how they perceive [the web] that makes them different in my opinion," he explains. "Many older people use the web, of course, but for digital natives the web is an integral part of their lives. They go there first, instinctively. And yes, some are better at it than others. I definitely agree that there is a continuum of capabilities among the digital natives. But if we are talking about what makes them different from previous generations, I believe it is this connection to the web."

Julie Evans, chief executive officer of Project Tomorrow, agrees. "They have a very clear sense of the value of the internet and having access to information," she says. "Access is their lifeblood."

Project Tomorrow is a nonprofit focused on improving science, math, and technology in K-12 schools. The group's Speak Up National Research Project polls K-12 students, teachers, parents, and administrators on their thoughts about and use of technology in learning environments. The fact that survey respondents are self-selected may skew its results toward the more advanced end of the web-use spectrum, but the pool of data is nevertheless impressive--more than 2.2 million online surveys have been submitted by participants since 2003.

Project Tomorrow has eschewed "digital native," says Evans, in favor of the term "free agent learners," which the group believes more accurately describes a portion of a generation of internet-connected students who aren't tethered to traditional educational institutions. She says this group of learners is more globally aware, thanks to the internet, and more adept at collaborative uses of the web. She also says that she hears "pretty regularly" from English and social studies teachers that these students are better writers than those from previous generations.

"This generation definitely has a thematic approach to learning," she says, "which is not about, 'I'm a vessel--go ahead and fill me up.' It's about, 'I'm the master of my own educational destiny. Give me lots of input and I'll find what I think is most important.' Most of the [K-12] schools I talk to still believe that they are the custodians of knowledge. But for these kids, increasingly, [schools] are just one more source of input."

Even as educators refine and redefine what it means to be a digital native, others point out that Prensky's 2001 articles nevertheless served as a catalyst in focusing attention on the educational impact of the new digital age. While Prensky's original definition might not survive close scrutiny a decade later--too generationally focused and without enough attention on how students use their devices--he was definitely on to something. "[Prensky's definition] is a shortcut, so some nuance gets lost," says Steve Hargadon, founder of Classroom 2.0, a popular educator-focused social network. "But it was really helpful initially, because it brought up the topic when nobody was talking about it." 

Is the Real Digital Native Still at Recess?

Some educators believe that Prensky's 2001 identification of an entire generation that's technologically adept and wired to learn differently was not altogether wrong--but certainly came too soon. Julie Evans, chief executive officer of Project Tomorrow, makes the case that, even though we've been talking about digital natives for 10 years, the first wave of true digital natives--kids who have been connected to the internet in school since kindergarten--is just now in middle school. In contrast, even though most of today's college students view the web as an extension of themselves--and expect to use their devices everywhere--they did not use the web as an educational platform until comparatively late in their academic careers.

"Most of the students in college right now probably weren't connected to the internet at school until almost the seventh grade," she says. "And among these students, some had internet access at home, and some didn't. There's going to be a difference between a student who didn't take his first test online until the seventh grade and a student who started taking online tests in the first grade. There's just a range of experience out there that makes it less useful to lump all these students together in one category."

Evans cautions that this broad spectrum of web experience makes it difficult to use the current campus population as the basis for long-term planning. "If you're in higher education and you're developing a strategic plan or making investment decisions based on conversations you're having with the students currently in your classrooms--or even high school students--you're talking to the wrong audience," she warns. "You really need to be talking to third-graders. The high school kid applying to your school today is just not as 'native' as the kids further down the pike."
