Needed Now: A Re-Mediated Education Blueprint To Improve Retention and Employability
Higher education seems stuck in a tricky dilemma: how to move, en masse, to new learning models that seem, on the surface, to require a lower student-to-teacher ratio. But the dilemma is tricky only if the basic assumption about how students learn remains in the box of behaviorism. And it is tricky only if technology is seen as peripheral--handy but not transformative.
Disturbing indications are that too many graduates are not getting jobs, yet tuition continues to rise. The “product” of higher education as an economic sector, in other words, is not selling as well as it once did. This may be because the learning model prevalent in higher education is behaviorism, a theory of human learning far removed from current theories based on research in a range of fields (psychology, anthropology, social science, cognitive science, linguistics, and others) and strikingly different from the learning occurring informally outside the academy.
It is true that if the assumption persists that teachers are the primary source of knowledge and must be directly involved in all learning conversations as the primary contributor, then we have a problem of scale. Large lectures may benefit students who learn no matter what, although even at a place like MIT, indications in some courses strongly suggested otherwise: A whole new interactive, group-based learning lab, the TEAL Room (“Technology-Enabled Active Learning”), was created to replace a lecture hall, and attendance for a required introductory course shot up once it opened.
As the number of students exploded after World War II, and again during the baby boom, higher education was forced to increase class sizes, believing that if the teacher was not directly involved in knowledge construction, either learning would not occur at all or, if it did, it would be erroneous. The idea of the social construction of knowledge did not seem to affect the long “build-out” of higher education over the past half century.
Still, as the education sector grew, tuitions also grew, and the number of applicants grew as well. From most indications through the 1990s, higher education seemed to be a successful sector of our economy.
The critique of the basic model of higher education grew more pointed in the 1980s as alternative approaches, grounded in more current research into learning, were tried and found to be successful, albeit in an academic culture hostile to them. Faculty who opted for the new approaches might find their student evaluations dropping or their department chair asking why they were not “teaching.”
Only in the past few years has a crisis seemed to be brewing in higher education, starting roughly at the time of the Web 2.0 tipping point in 2004. The 2008 recession has deepened that sense of crisis as institutional budgets shrink and employers question whether graduates are ready for the changed economy and the changed nature of work today.
In the face of this crisis, true to human nature, the first response has been to do more vigorously what we have been doing all along: pushing “accountability” to make sure we are still doing the same thing but somehow more seriously, and tracking student progress toward learning outcomes as if a more aligned (albeit anachronistic) curriculum would make our graduates more employable.
But something new seems to be arising: On more and more campuses we hear academic leaders saying that maybe there are ways for students to learn other than listening to a teacher or reading a textbook. Campus leaders are beginning to see the value of active learning: Maybe if students originate the work and get more involved in activities related to a discipline-specific problem or project, and maybe if students are assessed and evaluated on real-life learning (authentic assessment), then higher education can improve learning in ways that are more effective and more aligned with work in our culture and economy now, without increasing the number of faculty members.
Still, if students are actively engaged in work but out of sight of the teacher, how do we credit that work, warranting that it resulted in learning consonant with disciplinary knowledge?
The obvious answer is by requiring students to maintain an electronic portfolio that contains evidence of their work away from the classroom.
Why, then, given this emergent awareness, haven’t electronic portfolios become mainstream? Some reasons:
1. Electronic portfolios help faculty members work in ways very different from “talk and test.” Campus systems must then accommodate these different ways.
2. This change--working in new ways with electronic portfolios--is comprehensive, mind-stretching, and based on faith that it will succeed.
3. Electronic portfolio technology is caught in a catch-22: Until it is widely used enough to produce the revenue to hire teams of programmers to bring it up to Web 2.0 standards, it will seem more difficult to use than the tools we are all used to outside of academia. But until it reaches Web 2.0 quality, it will not be widely used.
Higher education needs to work together for the benefit of all at a time that demands the most fundamental change the sector has ever undergone. Moving incrementally could work if we had the luxury of 50 years in which to change our basic learning and business model (the credit system and seat time). But we don’t.
About the Author
Trent Batson is the president and CEO of AAEEBL (http://www.aaeebl.org), serving on behalf of the global electronic portfolio community. He was a tenured English professor before moving into information technology administration in the mid-1980s. Batson has been among the leaders in the field of educational technology for 25 years, the last 10 of them focused on electronic portfolios. He has worked at seven universities and is now the full-time president and CEO of AAEEBL. Batson’s ePortfolio: http://trentbatsoneportfolio.wordpress.com/ E-mail: [email protected]