Get Ready to Quantify Your Computing and Network Capacity

Every so often, the National Science Foundation (NSF) conducts a survey of science and engineering research infrastructure at research-performing colleges and universities. The project is typically biennial, and it is a critically important and, to some, forbidding task. Those responsible for managing space on campuses are now getting ready for the call this coming October, when the NSF will once again unveil its redesigned survey and begin collecting data.

But this time there’s a surprise! In recognition of the fact that “infrastructure” now means more than bricks and mortar, the survey has a new Part 2, which will call for institutions to report on their computing and networking capacity. So in late October, CIOs and other IT staff can expect to hear from whomever their president appoints as “institutional coordinator” for collecting and providing the data, with a request for some pretty interesting statistics. If you’re not already expecting it, this NSF request will consume some as-yet-unbudgeted staff time.

There’s not yet much on the NSF website about the details of this year’s infrastructure survey. If you’re inclined, you can review the 2001 survey’s methodology and the report based on that survey. Even though the 2001 survey collected only physical space data, it is worth a look. And for some early thoughts on adding computing and networking capacity to the survey, you can read the 2000 article “Science Board Mulls Cyber Infrastructure.”

We were privileged to attend a recent conference session that presented the methodology behind the new survey and described the data to be collected and the collection process. We can’t share specific survey questions yet, but you’ll be asked to provide information such as the number of networks, the nodes on those networks, wireless hot spots, connection speeds, and the like. Items such as “advanced computing resources; digital libraries; shared data and information bases; research and education networks; distributed user facilities; and standards and protocols” are the focus of the survey’s new Part 2.

For some, collecting these data will be quite a challenge, and IT staff will likely have to work with other campus departments more closely than they do now. Although we can’t tell you the details of what you’ll be asked for, we can describe a bit of the process.

* Institutional presidents will receive a letter, probably in October, from the NSF providing detailed survey information and asking them to designate an “institutional coordinator.”

* Campuses will be expected to designate that person and to register with NSF within two weeks. Then a familiarization program will begin.

* Coordinators will designate someone to respond to the computing and networking capacity section, a request that will likely come through the CIO’s office. However, the appointed coordinator, not the IT staff, will be the person who actually holds the “keys” to the Web-based survey and submits the data to NSF.

* It is expected that reported data will be available through the NSF by as early as February 2004.

In the past, confidentiality rules kept most reported data from being attributed to specific institutions. The intent for the 2003 survey is to make much more of the data available in an institution-specific form, so it can be used for inter-institutional peer comparisons and the like.

So . . . we don’t know the details of the data that will be requested, just some generalities. But it is clear that some fairly senior IT staff on your campus will find a “new” project on their plates by November.

It might be a good idea to query your president’s office or your space management or institutional research staff to see who the institutional coordinator was in 2001, since that person, or someone in their office, may well be designated again in 2003. If you get those lines of communication open now, you’ll likely save some time and stress later.
