Beginning the Third Decade
When did you buy your first computer? Where did you buy it? How much did
you pay for it? When did you send your first e-mail? First visit the Web? Make
your first purchase from Amazon? First cite a URL in a bibliography or a course
syllabus?
These seemingly simple questions reveal a lot about our individual and institutional
odyssey in the world of information technology. The nascent microcomputers of
the late 1970s and early 1980s, including the first IBM PCs and Macintosh computers,
introduced what Steve Gilbert and I tagged in a 1985 Change magazine
article as “The New Computing in Higher Education.”
It has, indeed, been a journey. Technologies that did not exist or were simply
emerging in 1985—personal computers, notebook computers, cell phones,
PDAs, and the Web—today have moved from incidental to essential. These
technologies, and others now emerging (for example, wireless), have made the
transformation from costly conveniences to compelling, inexpensive, and ubiquitous
necessities.
We—and our students—want and expect more: more technology tools,
more digital content, more resources, more stuff!
That said, there is no question that our aspirations for information technology
continue to exceed our individual and institutional capacity to innovate with
and integrate technology into instruction and operations. The early adopters
among us seem to integrate, effortlessly, all the emerging tools and technologies.
In contrast, the rest of us are engaged in a continuing game of digital catch-up.
But even as our reach exceeds our grasp, what we can—indeed should—ask
is, “How far have we come over the past two decades?” And we should
also ask about the distance we have to go.
The easy metrics involve individuals and individual work. If you (like me!)
are “middle-aged” and “mid-career” (somewhere between
40 and 65), the digital shadows of IT are everywhere: e-mail, word processing,
PowerPoint presentations, course management systems, and instant messaging,
coupled with the emerging ubiquity of wireless technologies and video, all serve
as constant reminders of how much the work environment in academe (and elsewhere)
has changed over the past two decades.
The Web, in particular, has dramatically enhanced what was, until about 1995,
the largely unconnected desktop computer. The explosive growth of the Web has
provided fingertip (well, keyboard) access to an incredibly rich and constantly
growing array of resources that reside well beyond my office and time zone.
But for those of us in academe, there are also the instructional and operational
aspects of technology. Here the critical issues are far more difficult to measure:
instructional infrastructure and curricular deployment, as well as classroom
and organizational impacts and outcomes.
Up close and personal, I think about the experiences of my son and daughter,
one a college junior, the other heading off to college in fall 2004. They learned
about computers in elementary and middle school, and were sent to the Web for
information and resources for their class projects and term papers by the time
they hit their teens. Their teachers and professors have used PowerPoint presentations
in class and have included URLs in the syllabus. My children have textbooks
that include CDs. A course management system seems to be widely used at my son’s
college.
But are their classrooms—and is the classroom experience—so different
from what I experienced as an undergraduate three decades ago? To be sure, some
of the physical trappings are different: LCD projectors have replaced overhead
projectors, and many students take notes on computers or PDAs. But the “in-class,
on-task” activities seem remarkably similar to my own experiences as a
college student: lectures, group discussion, and student presentations.
So how, then, do we address the continuing questions about technology and instruction?
What are the appropriate metrics for tracking the instructional integration
of information technology? Number of URLs in the syllabus? Use of a course management
system? Online content and assessment? Number of PowerPoint presentations? Hits
on the course Web site and average session time?
Do efforts to quantify aspects of IT in instruction accurately reflect
the all-important qualitative dimensions and impact of IT in the curriculum?
These are really important questions. Alas, we in the campus community don’t
have very good answers. As I stated in last month’s column, too often
the best we can offer is evidence by epiphany.
So here’s my prediction: much as the past two decades have been marked
by academe’s great aspirations for the role of technology in instruction
and operations, this decade may be marked by efforts to make institutions accountable
for the continuing (and rising) investment in IT. Inquiring minds—board
members and public officials, parents, and even some faculty—will focus
on two questions: (1) Why don’t faculty do more with technology? and
(2) Why don’t colleges and universities make better use of information
technology in campus operations and services?
As we enter the third decade of the “computer revolution” in higher
education, these seem like fair, timely, and, yes, admittedly difficult questions
that we in the campus community will have to address.
About the Author
Kenneth C. Green is the founding director of The Campus Computing Project (campuscomputing.net), the largest continuing study of the role of computing, eLearning, and information technology in American higher education. Launched in 1990, Campus Computing is widely cited by both campus officials and corporate executives in the college publishing and technology industries as a definitive source for data, information, and insight about a wide range of online education and information technology planning and policy issues that affect U.S. colleges and universities. Green is also a senior research consultant at Inside Higher Ed, which publishes his Digital Tweed blog, and he is the author/co-author or editor of a dozen books and published research reports and more than 90 articles and commentaries that have appeared in academic journals and professional publications. He is often quoted on higher education, information technology, and labor market issues in The New York Times, The Washington Post, The Los Angeles Times, The Wall Street Journal, The Chronicle of Higher Education, Inside Higher Education, and other print and broadcast media. In October 2002, Green received the first EDUCAUSE Award for Leadership in Public Policy and Practice. The award cites his work in creating The Campus Computing Project and recognizes his “prominence in the arena of national and international technology agendas, and the linking of higher education to those agendas.”