Command, Control, and Curriculum
Between 1943 and 1961, Dwight David Eisenhower had five jobs: Supreme Commander
of the Allied Forces in Europe (1943-45), Army Chief of Staff (1945-48), Supreme
Commander of NATO forces (1950-52), and president and commander in chief of
the United States (1953-61). He also served, briefly, as president of Columbia
University (1948-50).
One story about Ike’s months at Columbia highlights the difference between
academic and military organizations: walking across the campus just prior to
Eisenhower’s assuming the Columbia presidency, the provost was briefing the
General about an important issue on which the faculty and administration held
differing opinions. Eisenhower, ever the military man, commented that the University
should simply tell the faculty what to do. The provost responded, “General,
the faculty are the university.”
Generals enjoy command and control; college presidents suggest, encourage, and
cajole.
The Eisenhower story comes to mind as we enter the third decade of the so-called
“computer revolution in higher education.” In January 1984, concurrent
with Apple’s launch of the Macintosh, a small number of colleges and universities
began the first computer resale programs as a way to provide discounted microcomputers
to students and faculty.
Campus officials viewed the resale programs as an extension of the instructional
mission of their institutions—providing students with access to technology.
This new link between instructional mission and information technology manifested
itself in other ways. For example, during the mid-1980s, colleges and universities,
along with Apple and IBM, encouraged faculty to create courseware—sometimes
simple, sometimes sophisticated homegrown curricular resources designed for “microcomputers.”
The excitement of the times and technology is reflected in a 1984 Drexel University
documentary, “Going National: The Drexel Microcomputer Project.”
We see Drexel students, faculty, and administrators getting their new computers,
learning to use a Mac, using computers in class and for course assignments,
and talking about their great aspirations for the role of computers and information
technology as part of the college experience.
Two decades later, what have we wrought? Without question, the technology that
was new and unique in 1984 has become ubiquitous. Twenty years ago students
came to campus to learn about computers; today students come to campus (and
to online courses) to learn about and to learn with technology.
And yet, there is continuing concern about the disappointing levels of instructional
integration. The 2002 Campus Computing Survey reveals that senior campus IT
officials continue to identify the “instructional integration of information
technology” as the “single most important technology issue”
confronting their institutions over the next two to three years. Moreover, fewer
than a fifth of institutions participating in the 2002 survey report that their
campuses consider faculty IT efforts as part of review and promotion.
Even as institutions spend increasingly scarce dollars to license course
management systems, upgrade labs, install wireless networks, provide faculty
training and support, and create presentation classrooms, campus and conference
conversations suggest that many senior officials feel their institution is not
progressing: “Maybe a third or 40 percent, maybe even half of our faculty
are doing something with technology in their courses. But we seem stuck, as
the numbers have not been rising the past two to three years.”
Is this simply a matter of biblical metaphors? An old joke in the software industry
notes that God could create the world in seven days because there were no legacy
systems or legacy users. Clearly we’ve spent far more than seven years
(and some multiple of $7 billion) to build the IT infrastructure for American
colleges and universities.
Or does the too-slow migration of IT into the syllabus reflect the Marx Brothers’
theory of academic culture? At his inauguration as president of Huxley College
in the 1932 movie Horse Feathers, Groucho Marx, as Prof. Quincy Adams Wagstaff,
proclaims in song and dance that “Whatever it is, I’m against it.”
Admittedly, these are (comic) extremes in the serious conversation about technology
and instructional integration. But the questions linger: why, after so much
time, effort, and money, is there still much to be done? Why do many faculty
continue to avoid technology as an instructional resource?
The core infrastructure is in place; the student interest (read: expectation)
is there. But what seems missing for many faculty is a compelling sense that
technology makes a difference in student learning and educational outcomes.
Moreover, given that many students (aged 18-68) may have better IT skills than their
professors, there is also the professorial concern: “How do I do
this—use this technology stuff—without looking foolish in front
of my students?”
Ah, it would all be so simple if college presidents and provosts had the power
and authority of generals: send the troops (faculty) to training, give them
the resources, and set hard deadlines for implementation.
Yet as Eisenhower learned from his provost, faculty are the university. The
atoms of academic organizations are individual professors who operate as free
agents rather than as employees of the university.
In the closing chapter of War and Peace, Tolstoy reminds us, “For a command
to be carried out to the letter, it has to be a command actually capable of
fulfillment.” From “user-friendly DOS” to the Web and beyond,
an important lesson of the past two decades has been that infrastructure, not
just technology, fosters adaptation and innovation. More than just “Build
it and they will come,” curricular deployment and technology integration
also depend on sustained support and some significant nurturing to attain
“fulfillment,” to experience broader instructional integration.