
Data Security | Feature

A Security Framework Tailor-Made for Higher Ed

This free spreadsheet tool from Educause can help identify gaps in a college or university's security profile and kick-start a security conversation on campus.

When the Heartbleed security vulnerability made major headlines, higher ed leaders across the nation were reminded that security must be continuously managed as a program. Dartmouth, Bryn Mawr, Cal Poly, Lafayette, the College of Charleston, U Hawaii, U Delaware, U Michigan, Durham U, U California Los Angeles and many, many others posted notices to their campus communities about brief outages in computing services while they patched their servers, and asked students and staff to change their network passwords.

No doubt, some schools knew this day — or one like it — was coming and simply put into play the plans they'd already prepared; others were caught off guard and had to scramble. The differences in response relate to each college or university's level of security maturity.

Maturity models abound: IT has ITIL, Carnegie Mellon's CMMI or ISACA's COBIT. Security has the Department of Energy's recently updated Cybersecurity Capability Maturity Model (C2M2) and ISO's 27002 IT security standard. However, until recently no framework existed specifically for assessing higher ed security maturity. That void was filled last year when the Higher Education Information Security Council (HEISC) launched its freely available Information Security Program Assessment Tool.

A Tool for Higher Ed
This project was developed by a group made up primarily of higher ed chief information security officers (CISOs) and others from the Department of Education and Educause. A major goal was to develop a framework that would align with recognized industry standards but with a higher ed spin. The framework also links back to the "Information Security Guide," a joint project of Educause, Internet2 and HEISC, which provides practitioners with resources to kick-start their program.

The result is a straightforward Excel spreadsheet with 103 questions in 12 broad areas:

  • Risk management;
  • Security policy;
  • Organization of information security;
  • Asset management;
  • Human resource security;
  • Physical and environmental security;
  • Communications and operations management;
  • Access control;
  • Information systems acquisition, development and maintenance;
  • Information security incident management;
  • Business continuity management; and
  • Compliance.

Each area aligns with a specific section within the ISO 27002 security standard. For example, the questions posed in the area of access control tie to section 11 in the ISO standard on the same topic. (If a user finds a question confusing, he or she can click on a small question mark icon to pop up a brief explanation.) The spreadsheet tool also points to related controls in the NIST standard. And clicking on each subject's header links the user to content in the Information Security Guide that provides an overview, an explanation, guidance on best practices and links to other resources for deeper dives.

In other words, the HEISC tool has no intention of stepping on anybody's preferred framework; it simply provides colleges and universities with a way to assess their security programs and get a baseline sense of where their biggest gaps are. In a vast security environment where attacks are on the rise, the assessment tool aims to narrow the scope of the problem and help focus efforts on an institution's highest-risk areas.

How the Assessment Tool Works
Currently, the tool is intended for self-assessment. Users answer each question with a score, and the tool tallies the institution's level of maturity in a given area and overall. Questions can be answered by individuals or by a security team working through the list collaboratively. Or, in a distributed environment, the questions can be doled out to the people best suited to answer on specific topics.

According to Cathy Hubbs, CISO at American University, the first time she sat down to do the assessment — by herself — it took about 90 minutes. Generally, it's expected to take a couple of hours. However, when she sat down with her security team to work through the questions, they spent nearly eight hours, "because we were really talking through all of the questions and thinking it through. It was a much better exercise and I really enjoyed it."

The tool uses the ISO framework for scoring maturity, which runs from zero (the least mature) to five (the most mature):

  • 0 = Not performed;
  • 1 = Performed informally;
  • 2 = Planned;
  • 3 = Well defined;
  • 4 = Quantitatively controlled; and
  • 5 = Continuously improving.
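
To make the tally concrete, here is a minimal Python sketch of the kind of calculation the spreadsheet performs: average the scores within each area, average across all questions, and read the result against the scale above. The area names, per-question scores and rounding-to-label step are invented for illustration; they are not the tool's actual contents or formulas.

    # A sketch of the spreadsheet's tally, under assumed inputs.
    # Area names and scores below are hypothetical.
    MATURITY_LABELS = {
        0: "Not performed",
        1: "Performed informally",
        2: "Planned",
        3: "Well defined",
        4: "Quantitatively controlled",
        5: "Continuously improving",
    }

    scores_by_area = {
        "Access control": [3, 2, 4, 3],
        "Risk management": [1, 2, 2],
        "Compliance": [4, 3, 3, 4, 3],
    }

    # Per-area maturity: a simple average of that area's question scores.
    for area, scores in scores_by_area.items():
        avg = sum(scores) / len(scores)
        print(f"{area}: {avg:.1f} ({MATURITY_LABELS[round(avg)]})")

    # Overall maturity: a simple average across every question.
    all_scores = [s for scores in scores_by_area.values() for s in scores]
    print(f"Overall maturity: {sum(all_scores) / len(all_scores):.1f}")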

But that's just the start of the process — next comes the assessment of the results.

What the Numbers Mean
In an Educause conference presentation, Hubbs and David Escalante, director of computer security at Boston College, made several points about how best to use the tool's findings.

First, each question is given equal consideration in the assessment, which means any raw score is a simple average, whether measured within a single security area or taken across all of the questions. The tool does no weighting of one area over another. However, because the tool comes in the form of a spreadsheet, the user can assign his or her own weights and rework the calculations.
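
Because the raw score is an unweighted mean, reworking it takes only a little spreadsheet math. Here is a minimal sketch in Python, assuming invented per-area averages and institution-chosen weights; the tool itself applies none:

    # A sketch of custom weighting with hypothetical numbers: scale each
    # area's average by an assigned weight, then normalize by total weight.
    area_averages = {
        "Risk management": 1.7,
        "Access control": 3.0,
        "Compliance": 3.4,
    }

    weights = {  # invented importance weights, chosen by the institution
        "Risk management": 3.0,
        "Access control": 2.0,
        "Compliance": 1.0,
    }

    weighted_sum = sum(area_averages[a] * weights[a] for a in area_averages)
    print(f"Weighted overall maturity: {weighted_sum / sum(weights.values()):.2f}")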

Escalante noted in the talk (which is available on video) that the members of the HEISC subcommittee "cheated the weighting" by putting more questions into the categories they thought were more important. "Some categories have a lot of questions and some have a few." That question distribution takes the HEISC Information Security Program Assessment Tool beyond a simple rework of what ISO has already done; it's infused with a college flavor.

Second, the scale of maturity can be misunderstood. Whereas zero as a score for a given area isn't good — that activity hasn't been performed at all — a five is nearly impossible to achieve, according to Escalante.

"Fives tend to be something that is like a Six Sigma/Total Quality Management thing, where you have a committee of people looking at it, and they're doing metrics and doing continuous improvement," he explained. "Most people aren't going to do that in a security program." The major issue: Achieving a five would probably require a lot of funding. "You should be pretty suspicious of somebody scoring themselves as a five. Most people don't do that in higher education."

Third, users need to balance best practice against risk management. "It could be that just because your score in some area is low, it doesn't mean you need to bump that score up. It could be that your risk in that area is low and that's why you're not spending a lot of attention on it," said Escalante.

He and Hubbs recommended creating an overlay chart of risk and maturity. Areas with high risk and low maturity may need to go to the top of the priority list, while other areas with a low maturity score may require further discussion. Take question 28: "Does your institution have a process for issuing keys, codes and/or cards that require appropriate authorization and background checks for access to these sensitive facilities?" While that may be an ISO best practice, Escalante pointed out, "it's contentious with HR departments at various institutions."
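
One simple way to build that overlay is to rank each area by how far its risk outruns its maturity. The sketch below assumes hypothetical risk ratings on the same zero-to-five footing, supplied by the institution; neither the ratings nor the gap formula comes from the tool itself.

    # A sketch of the risk/maturity overlay: sort areas by risk minus
    # maturity so high-risk, low-maturity areas rise to the top.
    # All values below are invented for illustration.
    areas = {
        # area: (risk 0-5, maturity 0-5)
        "Incident management": (5, 1.5),
        "Physical security": (2, 2.0),
        "Access control": (4, 3.0),
    }

    ranked = sorted(areas.items(), key=lambda kv: kv[1][0] - kv[1][1], reverse=True)
    for area, (risk, maturity) in ranked:
        print(f"{area}: risk {risk}, maturity {maturity:.1f}, gap {risk - maturity:.1f}")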

Ultimately, the results of the assessment can be used to build conversations with other parts of the campus in order to gain broader institutional cooperation on aspects of security.

Future Plans
The original intent for the HEISC work was to create a benchmarking tool that would let schools assess their security profiles and compare findings with one another through a central system. Because of the sensitivity of sharing security-related information, the project team has deferred that work to phase two. Those discussions have begun, and Hubbs suggested in a post-conference interview that the group may begin testing a benchmarking solution later this year.

In the meantime, to emulate benchmarking, Hubbs suggested persuading fellow CISOs at "market basket" institutions to take the assessment and perform the comparison that way.

At American University she sat down with peers from other IT domains within the institution to find out how they would rate security. Her goal was to compare the security team's appraisal against how others on campus scored security efforts, in order to uncover gaps in perception regarding IT security maturity.

To do the assessment, 10 people representing multiple areas — customer service, business intelligence, development and so on — sat in a room and went through every question, answering them with Poll Everywhere. "It was a good experience," she recalled. "We began with, 'We're not going to talk about this question. If you don't know what it means, I don't want to influence you and I don't want you to influence each other.'" As it was, the exercise took about three hours.

What did she learn? Although the analysis is ongoing, a couple of "outliers" stood out:

  • Question 77: Does your institution have a process for validating the security of purchased software products and services?
  • Question 96: Does your institution have a records management or data governance policy that addresses the life cycle of both paper and electronic records at your institution?

"The first has been in place for a couple years, but in the last four months became even more defined," Hubbs noted. "My team knows the process, but we have not communicated — beyond the staff involved. It's a matter of spreading the word beyond those doing the validation. The purchasing department is revising its procurement policy and plans to send out a campus memo, and that will be a second way to raise the awareness of our community that IT security partners with procurement to assess the security of services."

For the second item, she said, "The records retention policy is not maintained by IT security, and though the security staff knew it exists, the larger group was less aware. Again, an awareness and communication opportunity."

Continued Hubbs, "All in all, I was very happy to see that by and large the group of IT directors not directly involved with IT security knows what the group and the university are up to and assessed the program's maturity quite similarly."

More broadly, Hubbs emphasized that information security professionals in higher ed "should select some sort of framework" to translate IT security risks into clear business decisions. "Universities are like a city with a multitude of businesses happening on campus," she said: academics, research, payroll, public safety, hospitals and healthcare. With competing priorities, she added, comes "competition for resources."

The HEISC Information Security Program Assessment Tool can produce results that are "meaningful for business officers and CIOs and useful for making business decisions, useful to make risk decisions and useful to us as practitioners to measure our progress and be able to prioritize [our] efforts."
