
Viewpoint

Innovation in Higher Education: It’s Not the Technology

The real innovation in higher education IT is not the technology itself. This may seem obvious now, but it wasn’t always. It’s a recent revelation, one that accompanies changes both in the roles IT staff and faculty play in innovating with technology for teaching and learning and in how IT organizations and departments are structured on campus.

When you walk into a playground with your child or grandchild--especially the new imaginative, soft-surfaced, wonderland playgrounds now sprouting up--your kid wants to try everything at once. If they are like my own grandkids, they’ll run to a slide and slide down it, race to the swings and swing a bit, try out one of the little vehicles, and generally do “the grand tour” in a few minutes. That running from apparatus to apparatus is analogous to the grand tour education has been on for thirty years: the technology-rapture grand tour. Faculty and technology support staff, academic leaders, CIOs, even presidents have tried out one technology whiz-bang after another, believing or hoping each one would make enough of a difference to justify the cost, or would make parents happy when they tour the campus with prospective students, or would keep the institution current, or--the real chimera--would lower the cost of teaching.

Many technology implementations did have a positive impact, of course--sometimes not the expected impact but another worthy one, such as faculty members using a course management system to post a syllabus on the Web before registration opens so students can make wiser decisions as they register for courses.

During the thirty years from 1980, when microcomputers first became available, until just recently, the common rhetoric in higher education ran along the lines of “This technology will do this and that, will bring about a complete change, will revolutionize this or that...” The technology was the active agent. Colleges and universities purchased technology after watching an expert demonstrate the wonders of this or that application. IT leaders were involved in many of these decisions since the purchases affected them directly: Applications had to be installed on campus servers and then maintained.

To keep the number of applications manageable, IT offices made lists of the applications they supported. In building those lists, IT offices took the lead in innovation on campus, since it was IT staff who acquired the applications, installed them, trained users, offered help-desk support, and kept the licenses active from year to year. There was, consequently, a natural limit to how many applications could be supported, and so, by the late 1990s and early 2000s, IT leaders were less innovators than limiters. They made the reasonable case that they were maxed out and that their budgets had gone flat or into decline. If technology innovation was to continue, two things had to happen. First, academic leaders needed to help build consensus among faculty as to which technology applications were most valuable for the institution, delivering the biggest bang for the buck; second, the applications could not add any significant new responsibilities for the IT staff.

But a third change had to occur as well. In the new climate of perspicacious choices about new technologies on campus, the rhetoric had to change: No longer could we live in la-la land believing the technology had magic. We had to become responsible. We had to recognize that watching an expert demo a technology did not in any way address the real strategic issues, the hard questions of who will use the technology, how they will use it, for what purpose, with what support, guided by what assessment process, with what expected outcomes, and with what plan for sustainability. The real innovation, we painfully discovered, is not the technology but the change in the behavior of the humans using it.

Very fortunately, just as flat IT budgets and disappointment with fantasy purchases of technological magic bullets coincided to end one era of technology in higher education, Web 2.0 technologies suddenly became widely available and wholly new patterns of technology use became possible. These applications lived in the “cloud” (on the Web), were often free, and had wonderfully intuitive interfaces, and from 2004 on both faculty members and students found a whole new path for technology innovation. That year, when the phrase “Web 2.0” was coined and the first Web 2.0 conference was held, marked the end of campus IT staff leading technology innovation for educators. IT staff on most campuses now lead innovation in operating efficiencies and in improving connectivity--expanding the capabilities of the infrastructure--but in most cases cannot be looked to for leadership regarding change in the academic enterprise.

Now we find new patterns of technology innovation on campuses:

· CIOs and IT staff keeping the infrastructure running with zero downtime and continually adding capabilities as core technologies improve; fail-over plans, mirror sites, and the like to guarantee the institution can keep running no matter what; a trend toward virtualization and cloud computing

· Provosts--and especially vice provosts, assistant provosts, or deans--leading technology initiatives with heavy faculty involvement at all stages; CIOs may or may not be involved in these initiatives

· Faculty choosing alternate Web 2.0 tools over similar features in the campus LMS; the LMS slipping from its “be-all and end-all” status as the most important academic application

· Social software intended primarily for student use “bleeding into” courses

· Offices for educational innovation and technology being established outside the IT hierarchy; a technology alliance forming on the academic side among such innovation offices, faculty development centers, and/or centers for teaching and learning, reinforcing the emerging faculty leadership in technology innovation; departments, schools, and colleges within universities, in many cases, now having their own IT leaders or point persons who join this new academic technology alliance

Clearly, the myth that the technology itself does something to bring about significant human change in teaching/learning/assessment practices has been “busted.” Campuses are instead accepting the obvious truth that some human change must come first, that time and human commitment to a sustainable support system must precede technology adoption, and that educators themselves must lead technology initiatives. We are now fully into the millennial re-tailoring of our academic garb (cf. Thomas Carlyle’s Sartor Resartus), moving beyond the playground approach to technology adoption and seeing all about us actual changes in teacher and learner behavior.

