How to Learn From IT Failure

For the University of Notre Dame, analyzing failed technology projects has led to a more efficient, successful IT operation.

Many people were shocked that the federal government's Healthcare.gov project stumbled so badly out of the gate. Yet IT executives and application developers were probably less surprised. In 2012, a study by McKinsey estimated that more than 40 percent of IT projects — across both public and private sectors — fail.

What can be done to mitigate the risk and reduce the number of projects that fail to deliver? One approach is to zero in on common problems across many types of projects and make organizational changes to address the core issues. A few years ago, executives in the Office of Information Technologies (OIT) at the University of Notre Dame noticed something odd: Their project backlog kept increasing even as they made efforts to boost their throughput. They also realized that the growing backlog was a symptom of a much larger problem: Too many projects were failing to deliver business value. Since January 2013, OIT has used metrics from its project management office and a Six Sigma methodology to revamp project prioritization and implementation.

The first order of business was to cut down on the project backlog. One problem was that internal customers would propose lots of new projects, and IT developers rarely said no. "Our business leaders who were proposing these projects didn't have a good governance model for prioritizing them," said Scott Kirner, director of enterprise solutions.

To keep project requests more focused and manageable, OIT put together steering committees called guidance councils. "We said to those guidance councils, 'Give us one to two projects at a time. We will get those done,'" explained Kirner. OIT also worked on identifying the issues that were keeping projects from reaching completion and on removing those roadblocks. "After that happened, we knew we were being more efficient because our project throughput went up by 30 percent," he said.
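As a rough sketch of the mechanism, a guidance council's intake can be modeled as a work-in-progress-limited queue like the one below; the class and project names are hypothetical illustrations, not Notre Dame's actual tooling:

```python
from collections import deque

class GuidanceCouncilQueue:
    """Hypothetical model of a guidance council's project intake:
    at most wip_limit projects are active at once; everything
    else waits in a prioritized backlog."""

    def __init__(self, wip_limit: int = 2):
        self.wip_limit = wip_limit
        self.backlog: deque[str] = deque()
        self.active: list[str] = []

    def propose(self, project: str) -> None:
        # New requests go to the backlog, not straight to IT.
        self.backlog.append(project)

    def pull_next(self) -> str | None:
        # IT pulls work only when under the WIP limit.
        if len(self.active) < self.wip_limit and self.backlog:
            project = self.backlog.popleft()
            self.active.append(project)
            return project
        return None

    def complete(self, project: str) -> None:
        # Finishing a project frees capacity for the next one.
        self.active.remove(project)

# The council proposes three projects, but only two run at once.
queue = GuidanceCouncilQueue(wip_limit=2)
for name in ["registrar workflow", "document imaging", "portal refresh"]:
    queue.propose(name)
print(queue.pull_next())  # registrar workflow
print(queue.pull_next())  # document imaging
print(queue.pull_next())  # None: WIP limit reached
```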

Learning From Project Backlog
Yet research into the backlog, based on data collected by Notre Dame's project management office over three years, revealed an inconvenient truth: 49 percent of IT hours were being spent on projects that did not deliver the intended business value.

"We lived through a lot of these projects," said Tracy Weber, manager of digital document management and workflow. "Some of these were really big projects we were pouring hours into and they weren't worthwhile."

Here is how OIT defined business value: Did the project meet its stated deliverables? That is, if IT projected financial savings or efficiency gains, were those realized? And was the system still in use after five years?

Whether the system was still in use was a big indicator, Weber added. "When we spend 1,000 hours on a project and replace the entire solution two years later, that's a problem," she said. "In higher education, we don't move so fast that we can afford to be switching out systems that quickly."
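Applied to historical project data, that two-part test might look like the sketch below. The record fields and hours are illustrative assumptions, not OIT's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    it_hours: float
    deliverables_met: bool  # were the stated deliverables realized?
    years_in_use: float     # how long the system stayed in production

def delivered_business_value(p: Project, min_years: float = 5.0) -> bool:
    """OIT's two-part test: deliverables realized AND still in use."""
    return p.deliverables_met and p.years_in_use >= min_years

def wasted_hours_pct(projects: list[Project]) -> float:
    """Share of total IT hours spent on projects that failed the test."""
    total = sum(p.it_hours for p in projects)
    wasted = sum(p.it_hours for p in projects
                 if not delivered_business_value(p))
    return 100.0 * wasted / total if total else 0.0

# Example: 1,000 hours on a system replaced after two years counts as waste.
portfolio = [
    Project("document workflow", it_hours=1000, deliverables_met=True,
            years_in_use=2.0),
    Project("degree audit", it_hours=800, deliverables_met=True,
            years_in_use=6.0),
]
print(f"{wasted_hours_pct(portfolio):.0f}% of IT hours wasted")  # 56%
```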

Although they didn't want to detail specific project failures, Weber and Kirner said they could think of several clear examples. "We implemented multiple modules of one system, but we were using a new system within three years," Kirner recalled, "and another never got off the ground."

Understanding Failure
Needless to say, calling attention to the project failure rate could be politically sensitive on campus. To help put everyone on the same page, Notre Dame has made a strong commitment to streamlining processes in administrative departments. It has created an Office of Continuous Improvement responsible for designing, leading and fostering a culture that embraces improvement efforts.

"We put the projects we felt hadn't delivered business value in front of OIT leadership and business leaders, and we had validation from them that this is true," Weber said. "If there was any controversy about whether a project did or didn't deliver business value, we took it off our list."

Because they had quantified the scope of the problem, they had good support from senior IT leadership, Weber said. In fact, Ron Kraemer, Notre Dame's vice president and chief information and digital officer, encouraged the pair to do a presentation about their efforts at an Educause meeting.

In 2012, they began working with the guidance councils to understand why so many projects failed. Using Six Sigma methodology for articulating business processes, they developed root cause analysis diagrams.

Three top root causes that surfaced were:

  • Resources not fully committed;
  • Requirements not fully defined; and
  • No formal business case.

One warning sign is when there is no clear business sponsor, Kirner noted. "We don't want to do projects because IT thinks it's a good idea. We have to be sure it's the sponsor that wants to do this."

Another question to ask is how clearly the sponsor understands alignment with other departments and systems. Has the sponsor thought through the downstream effects of the changes he or she is making? For instance, if you change the registrar's office system, how does that affect other systems and workflows?

In addition, any change in departmental leadership is a reason to step back and reevaluate the commitment to a project, Kirner and Weber said. If someone leaves a high-level position in a business area, the new person may not be committed to a system being implemented.
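Those root causes and warning signs lend themselves to a simple intake checklist. The sketch below is one hypothetical way to encode them as an automated gate; the field names are assumptions for illustration, not OIT's process:

```python
from dataclasses import dataclass

@dataclass
class ProjectProposal:
    name: str
    has_business_sponsor: bool      # a named non-IT sponsor, not just IT
    resources_committed: bool       # staff and budget formally allocated
    requirements_defined: bool      # scope and requirements written down
    has_business_case: bool         # formal business case on file
    sponsor_recently_changed: bool  # leadership turnover in sponsor's area

def intake_warnings(p: ProjectProposal) -> list[str]:
    """Return the warning signs that should pause or block a project."""
    warnings = []
    if not p.has_business_sponsor:
        warnings.append("no clear business sponsor")
    if not p.resources_committed:
        warnings.append("resources not fully committed")
    if not p.requirements_defined:
        warnings.append("requirements not fully defined")
    if not p.has_business_case:
        warnings.append("no formal business case")
    if p.sponsor_recently_changed:
        warnings.append("sponsor turnover: reconfirm commitment")
    return warnings

proposal = ProjectProposal("new imaging system",
                           has_business_sponsor=True,
                           resources_committed=False,
                           requirements_defined=True,
                           has_business_case=False,
                           sponsor_recently_changed=False)
for w in intake_warnings(proposal):
    print("WARNING:", w)
# WARNING: resources not fully committed
# WARNING: no formal business case
```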

"We are getting better at seeing warning signs and stopping projects early on, stepping back and analyzing whether it makes sense to proceed," Weber said.

For example, a project may have a four-month timeframe, Kirner explained. "If halfway through that time, we realize there is no way we are going to meet the deadline, perhaps we didn't understand the complexity. We have to hit the pause button and reconsider how we are going to proceed."

"Sometimes even in software selection," Weber added, "we have to work more closely with the customer to make sure the business process analysis has been done."

Measuring Success
In January 2013, OIT started with a clean slate in measuring project success. Since then, by the definition the team created, wasted IT hours have dropped from 49 percent to 3 percent. But that dramatic decrease doesn't tell the whole story yet: "It is in the nature of tracking project failures that you need at least a year's worth of data to show you are making improvements," Weber said. "Previously, we looked at three years of data. We won't know for two more years how we compare, but so far all indicators are good and we are now tracking the metrics quarterly."

Indeed, Kirner fully expects that 3 percent failure rate to increase. "We implemented a lot of systems in 2013 and the longevity of those projects is one thing that will determine success rates," he conceded.
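Quarterly tracking of that figure could look something like the sketch below, which groups completed projects by quarter and recomputes the wasted-hours share; the grouping approach and the numbers are assumptions, not OIT's published method:

```python
from collections import defaultdict

# (quarter, it_hours, delivered_value) tuples; figures are illustrative.
completed = [
    ("2013-Q1", 400, True),
    ("2013-Q1", 150, False),
    ("2013-Q2", 600, True),
    ("2013-Q2", 20, False),
]

# Per quarter, accumulate [wasted hours, total hours].
totals: dict[str, list[float]] = defaultdict(lambda: [0.0, 0.0])
for quarter, hours, delivered in completed:
    totals[quarter][1] += hours
    if not delivered:
        totals[quarter][0] += hours

for quarter in sorted(totals):
    wasted, total = totals[quarter]
    print(f"{quarter}: {100 * wasted / total:.0f}% of IT hours wasted")
# 2013-Q1: 27% of IT hours wasted
# 2013-Q2: 3% of IT hours wasted
```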

Both Kirner and Weber expressed satisfaction that they are addressing this challenging project management issue head-on. "We want to avoid wasting IT resources," Weber said, "because they are a precious commodity on campus."
