
Unintended Consequences Can Follow IT Policy Implementation

Earlier this week I was one of about 300 people attending the National Learning Infrastructure Initiative (NLII) conference in New Orleans, Louisiana. (I can truthfully say that I have eaten more crawfish--not crayfish--in the past few days than in the entire previous 57 years of my life.) On Tuesday afternoon I was drawn to a presentation titled "Security, Privacy, Copyright, and Other Institutional Policy Implications of Online Learning," given by Rodney J. Peterson, policy analyst and security task force coordinator at EDUCAUSE. I have corresponded with Rodney over the years; I know he is an expert in this realm, and I also wanted a chance to meet him in person, which I did.

His session was a fast and furious 50,000-foot view of higher education's relationship to the legal issues named in the title, and I enjoyed it very much. However, it was a comment from another attendee--and a conversation I had with him afterwards--that gave me the topic for this column. Peter Chen, of Stanford University, brought up an issue that had been alluded to in other presentations, but he presented it as a real set of circumstances: one (of many) unintended consequences that can arise from institutional information technology policies.

Right after Rodney had talked about institutions having "turned the corner on using automated means to put security messages in front of students," I commented that the motivation for that may have been the events of August and September, 2003--a time, full of worms and viruses, that none of us will forget. He agreed, saying that "Had we not [made those and related changes], this past fall would have been 20 times worse than 2003."

Then Peter noted that an unintended consequence of some of those automated messages and systems, and implementation of some of the policies that were created in the past year and a half, might actually fly in the face of our institutions' educational missions. How is that? Well, I don't want to provide any details specifically from Stanford, but here's a hypothetical set of circumstances that are similar but not identical. (I've "hyped" it up a bit to make a point, so this is not what has really happened anywhere . . . yet.)

Suppose Student A leaves his laptop on in his dorm room and Student B, perhaps a roommate or just a good friend, comes along and illegally downloads some files. Suppose that your network not only recognizes the illegal download, but the download also brings along some malware that triggers an automated system: the system shuts down that computer's access to the network and, because of the illegal download, also blocks Student A's authentication to university resources.
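The automated policy in this hypothetical could be sketched as a simple rule. Everything below--the class names, the fields, the actions--is an assumption invented for illustration, not a description of any real campus system:

```python
from dataclasses import dataclass

# A minimal sketch of the hypothetical automated quarantine policy
# described above. All names and rules here are illustrative
# assumptions, not any actual institution's implementation.

@dataclass
class Incident:
    host: str                # the machine that triggered the alert
    user: str                # the account logged in on that machine
    malware_detected: bool
    illegal_download: bool

def quarantine_actions(incident: Incident) -> list[str]:
    """Return the actions this hypothetical policy takes automatically."""
    actions = []
    if incident.malware_detected or incident.illegal_download:
        # Cutting the machine off protects the rest of the network.
        actions.append(f"block network access for {incident.host}")
    if incident.illegal_download:
        # This is the step with the unintended consequence: it locks
        # the *account*, not just the machine, so the student cannot
        # log in anywhere else on campus either -- and no human
        # reviews the decision before it takes effect.
        actions.append(f"suspend authentication for {incident.user}")
    return actions

incident = Incident(host="dorm-laptop-42", user="student_a",
                    malware_detected=True, illegal_download=True)
for action in quarantine_actions(incident):
    print(action)
```

Note that the rule has no notion of business hours or of who actually performed the download; those gaps are exactly what the rest of this scenario turns on.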

Not a problem, right? He can go to some office, explain the situation--once he understands what happened--and get access to what he needs. But consider that this happens late on a Friday afternoon, finals begin on Monday, and Student A has a very important final exam Monday morning.

Through no fault of his own, he can't get into the network from his dorm room and thus he cannot access the resources of the class for which he has an exam on Monday morning. He tries and tries, but he also can't get adequate IT support for his laptop because by the time he learned he had a problem it was already the weekend, and his dorm doesn't have 24x7 resident experts.

So, he goes to a computer lab, determined to log in and access the course from there, only to find that his access to all university online resources is invalid--but he doesn't even know why. And it's still the weekend. Imagine trying to find someone on campus on a Saturday evening to make complicated decisions about whether or not to turn on a student's authentication due to automatically detected improper network usage.

Since Student A can't reach his professor on the weekend, and it's a large class in which he is relatively anonymous and doesn't know any of the other students, he stresses all weekend without access to study materials and blows the final exam. Ouch! That's not the kind of thing we're supposed to do to our students.

In many respects, this type of "punishment" is a lot like suspension for a K-12 student. And many respected authorities will agree that "punishing" a student by removing his access to learning isn't exactly appropriate. (Protecting other students is another story, and in this case the dis-authentication was for network protection, but the result was the same.) In earlier days, that removal from learning meant banning the student from the classroom or the school. Nowadays, with so many parts of so many classes located on the network, it may just be an automated shutdown of network access, like what happened to Student A.

Poor Student A. He may not get accepted to the law school of his choice in two years due to a 'C' grade instead of a 'B' grade in that class. If he's got good lawyers, your institution may be hearing about that some day.

What's the point? Policies have consequences, not all of which may be intended. Creation and implementation of IT policy, even if well thought-out, may lead to unintended consequences that are detrimental not only to individual students but to the university's mission. Just as what the IT staff intentionally does should support the mission of the institution, even if that makes their jobs tougher sometimes, what they (or their software programs) do unintentionally matters, too. Something to think about, eh?
