
5 Tech Trends that Will Drive IT Decision-Making for the Next 5 Years

Data growth is forcing IT departments to adopt new forms of operation and reset their expectations of work.


Today's IT organizations face a big data challenge: unstoppable growth. Global data center IP traffic equaled 255 exabytes per month in 2013, according to the Cisco Global Cloud Index — and by 2018, traffic is predicted to nearly triple. Fortunately, not all of that traffic will land in the data center, but it does call for a response from the IT organization, said Gartner Research Vice President David Cappuccio in a recent webinar. "The real questions that IT centers need to ask themselves are: How much of that traffic is important, how does that data need to be acted upon, and how do we do it?"

IT has to rethink how it will address infrastructure and operations planning in three important areas, he noted: demand, technologies and the organization itself. Here's how trends and technologies will impact IT over the next five years.

1) Reaction to "Nonstop Demand"

Server loads are growing 10 percent every year; network bandwidth is going up by 35 percent; and storage capacity is expanding by 50 percent, reported Cappuccio. Business is "pushing IT to go faster and faster and faster" even as IT is struggling to keep the lights on, he said.
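Those annual rates compound quickly. A back-of-the-envelope sketch (using only the growth figures quoted above) shows where five years of compounding leads:

```python
# Compound the quoted annual growth rates over five years.
ANNUAL_GROWTH = {
    "server load": 0.10,
    "network bandwidth": 0.35,
    "storage capacity": 0.50,
}

YEARS = 5
for resource, rate in ANNUAL_GROWTH.items():
    multiple = (1 + rate) ** YEARS
    print(f"{resource}: {multiple:.1f}x today's level after {YEARS} years")
```

At those rates, server load grows to roughly 1.6 times today's level in five years, network bandwidth to about 4.5 times, and storage capacity to about 7.6 times — which is why "keeping the lights on" keeps getting harder.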

IT is responding with two distinct ways of operating, he noted. "One is to react as fast as possible. The other is steady pace, doing things as logically and as risk-free as you can to protect the business." All the while, IT's domains of responsibility are being whittled away by other parts of the organization.

Gartner asked C-level executives around the world where they allocate responsibilities for digital business and emerging trends. "The CIO is not the one getting the most responsibility," explained Cappuccio. "They're focused on tactical, day-to-day, keeping things running."

Meanwhile, the envelope for trying new things is being pushed by business unit heads, marketing and sales, CFOs and COOs. Business "is taking control of what was considered an IT architecture in the past and now doing things their own way," Cappuccio said. That can be "chaotic," he added, "but some say it also makes the business more agile."

In fact, every business unit has become "like a little technology start-up," said Cappuccio. "If central IT does not get involved, does not begin to work with those business units — not to control what they're doing but to understand what they're doing — and learn how they can enable them to do things even faster, we're going to lose control."

In this dynamic environment, there is no place for an IT department with a lingering reputation for saying no and putting up blockades to new requests, because the business unit "is going to go someplace else."

The sting when that happens is this, Cappuccio added: No matter who provides the IT services required, IT is still held responsible for the outcome. "If the environment becomes incredibly complex and we lose control, we still own that customer experience, which means we need to understand how all the pieces tie together."

2) The Internet of Things

Gartner characterizes Internet of Things devices as extremely small, self-aware and self-discovering. These sensors may support their own mesh networks, so that as devices are deployed, they find each other and "report back," said Cappuccio. They're also often location-aware and in some cases don't require batteries.

IoT is sparking a new way of "looking at business and capturing information about clients and looking at creating a new level of automation to make the business more efficient," he noted. He pointed to the use of sensors attached to hand-cleaning stations in hospitals to scan badges as nurses and doctors wash their hands. "All that data is collected. If anytime down the road there's a lawsuit, and somebody gets infected and they're blaming the hospital, they can go back and track the sequence of every motion and use that as defense against those lawsuits. That's happening today," he said.
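The hospital example boils down to an append-only event log that can be replayed later. A minimal sketch of that idea follows; the field names, badge IDs and station IDs are hypothetical illustrations, not a real hospital system:

```python
# Hypothetical sketch of an IoT hand-hygiene audit trail: each badge scan at a
# cleaning station is logged, and the sequence can be reconstructed per caregiver.
from datetime import datetime

events = [
    {"badge": "RN-1042", "station": "ICU-3", "time": datetime(2015, 4, 1, 9, 14)},
    {"badge": "MD-0077", "station": "ICU-3", "time": datetime(2015, 4, 1, 9, 20)},
    {"badge": "RN-1042", "station": "ER-1",  "time": datetime(2015, 4, 1, 11, 2)},
]

def wash_history(badge, log):
    """Reconstruct the time-ordered sequence of hand-wash events for one badge."""
    return sorted((e for e in log if e["badge"] == badge), key=lambda e: e["time"])

for e in wash_history("RN-1042", events):
    print(e["time"], e["station"])
```

The point of the design is that the sensors only ever append records; the "defense against lawsuits" value comes entirely from being able to filter and order that log after the fact.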

Most organizations are in a "look-see mode" right now, Cappuccio added. "IT needs to be aware of it. If they're not, it's something that the business will do anyway outside of IT control, which is not long-term good."

3) Software-Defined Infrastructure

"Software-defined" is creeping into the data center. Software-defined networks, storage and even power are making up what Gartner refers to as "software defined infrastructure." As Cappuccio explained, "Conceptually, what they're all about is this: creating a new way to operate, orchestrate and automate — putting configuration control at a higher plane than it was. Rather than having individuals go out there and optimize at the device level to get the best performance or the best use of that resource, if I can do it at a control plane, I can enhance workload and traffic flow and automation and eventually enhance the overall efficiency of the operation."

Right now, organizations are testing the potential of this technology in "small labs," but the promise is there. "Eventually, I'd be able to manage these environments whether they were on premise or off-premise. It truly becomes a virtual environment," said Cappuccio. "I could begin to move workloads based on actual business needs and performance needs and time of day, and I could move them wherever they need to be."

A related trend is "proactive infrastructures." In this scenario, data centers are beginning to use predictive and prescriptive analytics to help IT staff gain a sense of what will happen in real time as the machines are running or what would happen if a particular system change were made.

The example offered by Cappuccio was this: If you know there's a storm coming and you might lose power to the data center, you could move key applications somewhere else to keep the organization running. "Suddenly I've got an environment that's a lot more risk averse without having to build out a massive Tier 4 data center," he said.
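The storm scenario is, at its core, a threshold rule: when predicted outage risk crosses some line, relocate the most critical workloads first. A sketch of that decision logic, with illustrative names, priorities and thresholds that are assumptions rather than anything from the webinar:

```python
# Hypothetical sketch of a proactive-infrastructure rule: if a predicted outage
# risk (0.0-1.0) exceeds a threshold, return the critical workloads to move
# off-site, most important first. All names and values are illustrative.
def plan_failover(workloads, outage_risk, threshold=0.7):
    """Return the workloads to relocate, ordered by priority (1 = most critical)."""
    if outage_risk < threshold:
        return []  # Predicted risk is tolerable; leave everything in place.
    return sorted(
        (w for w in workloads if w["critical"]),
        key=lambda w: w["priority"],
    )

apps = [
    {"name": "payroll",     "critical": True,  "priority": 1},
    {"name": "order-entry", "critical": True,  "priority": 2},
    {"name": "test-lab",    "critical": False, "priority": 9},
]

print([w["name"] for w in plan_failover(apps, outage_risk=0.85)])
```

In a real proactive infrastructure, the risk score would come from predictive analytics and the move would be automated — which is exactly the hand-off of control that, as the next paragraph notes, IT still hesitates over.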

One obstacle to ready adoption of proactive infrastructures is the human one, he noted. Whereas this kind of response takes human effort now, "as you move down the food chain toward predictive and prescriptive, the amount of human effort gets reduced, the amount of automation gets increased and the amount of efficiency of the environments goes up." But IT still shies away from the idea of software deciding when to shut down software.

4) The Evolution of Systems

Two contrary evolutions are going on right now. One is the integrated system, and the other is the disaggregated system. In the first, vendors pull together multiple devices — servers, storage, memory, etc. — to solve a problem and then sell it as a system. As these evolve, Cappuccio said, they become globally shared. "You can plug and play as you need."

As a result, he observed, something interesting has happened. Many organizations no longer buy best-of-breed based on the technology; they choose their system based on the vendor delivering it. "Can they actually support this as they say they can? What's their history? Do I have faith and trust in them?"

Because the price point is higher for an entire system than it is for its individual components, senior administration is becoming involved. As Cappuccio pointed out, "They're more interested in the relationship and the trust of the vendor and the price [as] we start making decisions based on best-of-brand instead of best-of-breed."

Also, the choice of vendor becomes a "de facto standard," he said. "You can break out of that, but people generally tend to stay with the decision they made at least for a couple of generations of products."

Exactly opposite of that trend is another, in which an organization chooses to build its own systems. "That's the whole genesis behind the Open Compute Project," he noted. In this movement, IT can build a computing environment from many manufacturers at a very low price point with extremely high performance. "If I wanted to upgrade processors, I could do that without having to worry about upgrading the whole server."

The questions that IT needs to ask itself before undertaking this approach to systems acquisition are whether it has the contacts for buying outside of its normal channels and whether it can get the support it needs. This is not unlike the open-source software movement of the 1990s, Cappuccio said. "We're just beginning to see the advantages of this. It's something we expect IT to continue to watch."

IT service continuity is a variation on the theme of the new data center. This offshoot of disaster recovery and business continuity surfaced during Hurricane Sandy. Although plenty of organizations were ready when the storm hit and were able to keep their data centers running, a variable surfaced that they hadn't counted on: The roads were closed, which meant fuel trucks couldn't get to the data centers to deliver replacement fuel to keep generators running.

"The realization came that it's not about making sure the data center didn't go down," said Cappuccio. "It's about making sure those services that are client-facing continue to run regardless." The mission for IT became how to design an infrastructure "that could sustain service continuity regardless of whether we go down or not."

The answer is to take a hybrid approach, he said — some work on premises, some co-located and some cloud-based, and all of it backed up with the ability to shift work from one site to another as the need arises.

5) New Directions for IT

Forget about the notion of IT being either focused on maintaining services as reliably as possible or agile and constantly on the run. Organizations need both, insisted Cappuccio. "Bimodal IT," as Gartner calls it, describes the idea that IT needs to pursue both roles: traditional and exploratory.

What characterizes each? As Cappuccio laid it out, traditional IT sets reliability as the goal and price for performance as the value; the approach is plan-driven and approval-based, and the talent is good at "conventional" projects and process. Cycle times are long, measured in months or years. The exploratory mode sets agility as the goal and revenue, brand and customer experience as the value, and it requires talent that's comfortable with new and "uncertain" projects. Cycle times are short, measured in days or weeks.

The trick, he added, "is to segment applications and workloads so you know which belongs where." Also, don't assume that because something falls into one bucket one day, it will stay there the next: "Just because something is agile and mobile today, tomorrow it may not be."

Just as the IT organization has evolved, so has the need for IT skills. Among the skills most in demand are the categories you could probably guess: mobile, hybrid computing, integration and process, and big data and analytics. But what's needed above all others right now, said Cappuccio, is the skill to understand "how these pieces tie together." The team may have experts in storage, servers, virtualization and related subjects; but does the team have somebody who can understand the "cascading effects should one piece fall, fail or slow down?"

Essentially, Cappuccio noted, IT leaders need to cross-pollinate knowledge. That may require a change in how people are evaluated. Rather than putting the emphasis on "how smart you are in one space, it [needs to be] how many spaces can you tie together?" Appointing one or two members of the staff to undertake cross-training could be a real morale booster across the board, he added. What motivates highly skilled IT people isn't money or accolades. "It's learning. It's continually challenging themselves." When other people see that they're excited about learning new things, "It becomes a motivational tool for them too."

