Risk and Responsibility

Cognitive development and the implications for higher ed

Last week, a reader objected to the part of my column where I noted a connection, in my mind, between some thinking and research I had been doing about the risk-taking behaviors of young people and the fact that we tend to fight our wars with the lives of those same young people. I hope he was reassured that I had meant no offense once I shared with him that, between 1967 and 1971, I spent more than four years in the military, including three Vietnam tours.

Youth in Asia
I didn't share with him then, though, what I will share with you now. Ever since the mid-'70s, I have described myself as "ignorant and naïve" when explaining how it came about that I ended up with a lot of medals, including the Vietnam Service Medal with three "campaign stars." I was lucky. When I did see combat, it was always in situations where I was relatively protected, surrounded by friendly troops, and I escaped unscathed.

I happen to have been raised in a small, economically depressed, Ohio River valley town. Although I had consumed vast amounts of the East Liverpool Carnegie Library, in those days of staticky television and news that really meant "local," I knew nothing about much of the outside world. Certainly I had no concept of what fighting in a war was like, of the United States' involvement in the murder of Ngo Dinh Nhu, or of the likelihood that the Gulf of Tonkin incident was "manufactured." I didn't have to be recruited: I didn't know what to do with my life, and the concept of "serving my country" while seeing the world was all that I needed.

Like I said, I used to say that I was "ignorant and naïve." I was. I also did not have a fully mature brain, although you would not have been able to persuade me of that, and the neuroscience of the time probably would have said that I did. It's only recently that brain imaging has made it clear that certain parts of the brain that allow mature, reflective thought do not complete their development until a person is in his or her mid-20s.

Cognitive development
One of the ways this late development seems to express itself is in higher crash rates for teen drivers. No doubt there is an evolutionary advantage for humans, in general, in the fact that our young people are a little more likely than mature adults to take certain kinds of risks. Young people express their inclination to take risks in many ways: sexual behavior, drugs, MySpace and Facebook, reckless driving, and so forth. Certainly, just the act of leaving home and attending a college or university is risk-taking of a sort, and something that is easier for a 20-year-old to do than for a 50-year-old. Unfortunately, the same is true of, for example, strapping on a bomb and expecting to end up in paradise after the blast.

The same studies show that youths in particular make relatively poor choices about their actions when they face a host of distractions:
Devoid of cues and the physiological hardware to initiate deliberate and thoughtful critical thinking, young people may act or react without reflection on the implication for self and others (Bergsma, 2000). As a result, adolescents typically underestimate the influence of digital technologies on their behavior or the potential for risk. Influences are especially powerful when youth cannot readily perceive potential threats nor access skills to create a barrier from harm (Berson & Berson, 2003). Nonetheless, modern technologies captivate youth because they draw on one of the most powerful genetic biases of the human brain—a preference for visually presented information.
This very concise article, "The Connection Between Brain Processing and Cyberawareness: A Developmental Reality" (PDF), goes on to state:
Modern technologies require young people to make sense of an overload of information. Despite the amplification of the quantity of data available, the nature of the sensory input restricts and often distorts the quality of visual and tactile cues, the primary modalities used by the brain to represent experience. While filled with superfluous data, virtual interactions provide limited access to the critical signals needed to differentiate safety from harm (Anderson, 2002). Moreover, without the biological structure necessary for deliberate and thoughtful action, young people may impulsively act and react in cyberspace without forethought to the influence of powerful sensations or be easily lured by the artificial distinctions between virtual encounters and real life activity (Berson & Berson, 2003).
Universities take responsibility
One of the higher education trends I have been following is the growing tendency of institutions to broaden their scope and take somewhat more responsibility for the behavior of their students. In this Web document, I have been collecting a variety of news stories about student risk-taking behavior and the reactions of higher education institutions. You may find some interesting and useful stories there.

Although I spent the years of my youth when I could have been a hippie serving instead in the military, I still retain enough of that age's rebelliousness toward authority to be a bit dismayed when colleges and universities reach beyond the campus and the classroom in attempts to control student behavior. However, the science does seem to support at least the need to be cognizant that a 20-year-old's brain has not yet fully matured, even though the rest of his or her body has. I expect that over time we will see some significant changes in the structure of higher education based on this new knowledge.
