Time To Discover NFM

Network file management (NFM) can help your institution better manage data resources campuswide—markedly so.

Virtualization has been a buzzword in IT for the last couple of years. From servers (e.g., VMware) and networks (network address translation [NAT] and virtual private networks [VPNs]) to application availability and performance (load balancing), managing resource usage and data delivery with virtualization devices is a staple of many of today’s data infrastructures. By replacing traditional direct physical access with an abstraction layer, what you see is what you get, but the mechanics of delivery may be quite different.

Why Virtualization Is Worth It

The reason for the increase in virtualization deployment is simple: The return on investment (ROI) is significant. Load balancers, for example, present a simplified interface to the user that usually does not directly reflect the underlying resource. What may seem like a single application server to the end user may actually be several dozen servers, each handling a share of the load. Members of the application server pool may then be taken offline for maintenance without affecting application delivery. Clearly, the savings involved in not interrupting the business process make the expenditures (capital and administration) for a load balancer solution thoroughly worthwhile.
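To make the pool idea concrete, here is a minimal sketch (in Python, with hypothetical server names) of how a load balancer can keep delivering an application while one pool member is drained for maintenance:

```python
# Minimal sketch of the pool idea behind a load balancer: clients see one
# "virtual" service, requests are spread across pool members, and a member
# can be drained for maintenance without interrupting delivery.
# Server names are hypothetical.
from itertools import cycle


class ServerPool:
    def __init__(self, members):
        self.members = set(members)                 # servers eligible for traffic
        self._rotation = cycle(sorted(self.members))

    def drain(self, member):
        """Take a member out of rotation (e.g., for maintenance)."""
        self.members.discard(member)
        self._rotation = cycle(sorted(self.members))

    def next_server(self):
        """Pick the next in-service member, round-robin."""
        return next(self._rotation)


pool = ServerPool(["app01", "app02", "app03"])
pool.drain("app02")                                  # offline for patching
print([pool.next_server() for _ in range(4)])        # traffic continues on app01/app03
```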

Network file management is, in many ways, a load balancing methodology for file access. It is sometimes referred to as file virtualization because it acts as a proxy between clients’ file requests and the file server resources that fulfill them. NFM is sometimes confused with storage virtualization (used in SANs—storage area networks—to virtualize the storage media itself), although a complete storage methodology may include both file and storage virtualization. As with other virtualization schemes, NFM can provide a significant reduction in the total cost of ownership (TCO) of systems by reducing the storage and administration costs of networked data.
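As an illustration of that proxy role, the sketch below (with hypothetical paths and server names) shows the basic indirection: clients ask for a logical path, and the virtualization layer decides which physical file server actually answers.

```python
# Sketch of the file-virtualization idea: clients address one logical tree,
# and the NFM layer resolves each request to whichever physical file server
# currently holds the data. Paths and server names are hypothetical.

class FileVirtualizer:
    def __init__(self):
        # logical prefix -> physical export actually serving it
        self.routes = {
            "/depts/physics": "filer-a:/export/physics",
            "/depts/history": "filer-b:/export/history",
        }

    def resolve(self, logical_path):
        for prefix, backend in self.routes.items():
            if logical_path.startswith(prefix):
                return backend + logical_path[len(prefix):]
        raise FileNotFoundError(logical_path)


nfm = FileVirtualizer()
print(nfm.resolve("/depts/physics/data/run42.dat"))
# -> filer-a:/export/physics/data/run42.dat
```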

Better (Much Better) Storage Management

Many campuses have dozens of servers, and each typically has underutilized storage capacity. Some departments may require more storage than others, yet if a standardized server solution is applied to departmental servers campuswide, some servers may see disk utilization of 30 percent or less.

“Underutilized resources represent a capital expenditure that has a negative ROI,” points out Clay Ryder, president of The Sageza Group, a California-based market research firm. “If storage is utilized at 50 percent, for example, in some respects that is the same as having paid twice the purchase price for the resource.”

Whereas storage virtualization is confined to a SAN, NFM can be applied to reclaim this unused storage across the enterprise by creating dynamic pools of available storage. If, for example, a research project has large storage needs for a short period of time, space can be allocated on other servers and managed by NFM, says David Kim, president and CSO of secure networking and security solution provider Security Evolutions. “Once the project ends, the storage can be returned to the allocation pool for the next project,” he says.
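A rough sketch of that pooling idea, with hypothetical servers and capacities, might look like the following; a real NFM product does this bookkeeping itself and simply presents the pooled space as ordinary shares.

```python
# Sketch of pooling unused capacity across departmental servers: a project
# borrows space wherever it is free and returns it when the project ends.
# Server names and sizes are hypothetical.

class StoragePool:
    def __init__(self, free_gb):
        self.free_gb = dict(free_gb)      # server -> unused capacity in GB
        self.grants = {}                  # project -> list of (server, GB)

    def allocate(self, project, needed_gb):
        grant = []
        for server, free in sorted(self.free_gb.items(), key=lambda kv: -kv[1]):
            if needed_gb <= 0:
                break
            take = min(free, needed_gb)
            if take:
                self.free_gb[server] -= take
                grant.append((server, take))
                needed_gb -= take
        if needed_gb > 0:                 # roll back if the pool cannot satisfy it
            self.release_grant(grant)
            raise RuntimeError("insufficient pooled capacity")
        self.grants[project] = grant
        return grant

    def release(self, project):
        self.release_grant(self.grants.pop(project, []))

    def release_grant(self, grant):
        for server, gb in grant:
            self.free_gb[server] += gb


pool = StoragePool({"eng-srv1": 300, "lib-srv1": 500, "admin-srv1": 200})
print(pool.allocate("genomics-run", 600))   # spread across the emptiest servers
pool.release("genomics-run")                # capacity goes back into the pool
```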

Storage resources also may be maximized by taking into account access frequency, size, type, and other attributes of files. While frequently accessed files may be stored on a high-performance SAN, there is little need to use these more expensive systems for files kept mainly for historical purposes; NFM administrators can create policies to route file storage to the most cost-effective storage media.
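For example, a placement policy of the kind described here might boil down to a few rules like the following (tier names and thresholds are hypothetical illustrations):

```python
# Sketch of a placement policy: route each file to the most cost-effective
# tier based on how recently it was accessed and how large it is.
import time

DAY = 86400

def choose_tier(last_access_epoch, size_bytes, now=None):
    """Return a storage tier name for a file, by simple policy rules."""
    now = now or time.time()
    idle_days = (now - last_access_epoch) / DAY
    if idle_days > 365:
        return "archive-nas"        # historical data, cheap media, infrequent backup
    if idle_days > 30 or size_bytes > 10 * 1024**3:
        return "bulk-nas"           # infrequently used or very large files
    return "primary-san"            # hot data on the high-performance SAN


print(choose_tier(time.time() - 3 * DAY, 200 * 1024))     # -> primary-san
print(choose_tier(time.time() - 400 * DAY, 200 * 1024))   # -> archive-nas
```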

“The cost of maintaining a large storage pool requires that storage be allocated based upon a policy of demonstrated need,” Ryder adds. Shared storage capacity can also be used to optimize backup systems: policies may assign all files not accessed within a certain period to a storage medium whose backup schedule is consistent with the policy, reducing the backup resources spent on multiple copies of the same file version. But which kinds of products are campus systems architects turning to?

NFM Products: A Sampling

Joe Little, principal systems architect of Stanford University’s (CA) Electrical Engineering department, turned to NeoPath Networks’ File Director product (recently acquired by Cisco Systems) when he was faced with optimizing storage across many file servers. The virtualization technology acquired by Cisco manages the combined storage capacity of the department’s NAS and, in the future, its Linux and Solaris servers (an impressive 14 terabytes), and serves as the primary entry point for its NFS resources. In addition, NFM allows Stanford to migrate data when performing hardware maintenance, all without the end user even noticing. A server may be taken offline for upgrades, but because its storage is temporarily—and transparently—migrated to another server, access downtime is eliminated. “It’s nice to have a highly available system for all of our mount points,” Little offers.

Namespace and System Administration

For Stanford, the benefits go beyond storage management to namespace aggregation and control. Little notes that “We can advertise namespace based on the logical namespace,” as opposed to relying on physical servers to dictate namespace schemes. In many environments, migrating away from a traditional namespace means unacceptable changes in the business process, and maintaining namespace tied to physical servers means constantly upgrading the servers. But NFM reduces the expense of purchasing hardware (and the software to operate it) to satisfy historical namespaces.
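In code terms, the idea is simply that the advertised namespace is a logical table that can be repointed at new hardware without the client-visible path ever changing (the names below are hypothetical):

```python
# Sketch of namespace aggregation: client-visible mount points are defined
# logically, and the backing server can be changed (for maintenance or a
# migration) without the advertised path changing. Names are hypothetical.

namespace = {
    "/ee/projects": "old-filer:/vol/projects",
    "/ee/home":     "nas-1:/vol/home",
}

def migrate(logical_path, new_backend):
    """Point a logical mount at a new physical location; clients keep the same path."""
    namespace[logical_path] = new_backend

migrate("/ee/projects", "new-filer:/vol/projects")   # hardware refresh
print(namespace["/ee/projects"])                      # /ee/projects is unchanged for users
```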

The greatest savings, however, may come not from hardware but from system administration. “A [significant] ROI can be achieved particularly because storage administrators are expensive resources,” Kim believes. Ryder concurs, stating that the highest costs “are not for the equipment, but for its operation and the personnel required to operate it.” NFM, like storage virtualization, can reduce the number of FTEs required to administer enterprise storage resources.

Yet there are other possible cost savings to consider. Since NFM can reduce the number of servers needed, other downstream benefits follow, according to Kim, such as freed-up rack space, lower power consumption, a smaller backup power requirement, and less heat generation.

The Right Solution?

Even with all of the potential benefits, examining the current infrastructure is essential when determining whether NFM is an appropriate solution. For example, if the amount of wasted storage space that could be recovered is minimal, adding an abstraction layer may not produce much benefit. In other cases, migrating many file servers to one or two large servers may be appropriate, or implementing a storage virtualization solution may be preferred.

NFM is a relatively new approach to solving some common issues in information technology management. Whether it becomes a mainstay offering in the IT infrastructure, or is replaced by a different methodology, remains to be seen. One thing is certain, however: With online learning demands, complex ERP systems, and research and collaboration requirements, the need for and complexity of file storage resources at higher ed institutions will continue to increase.
