One size does not fit all
By Raj Thakur, Director and GM, Servers & Converged Systems, Enterprise Group, HP South Pacific
Monday, 07 December, 2015
Forward thinkers are aggregating pools of end-to-end resources to power a new style of business.
If there’s a defining characteristic of business in this millennium, it’s that failure comes faster. Think about the shift from VHS and DVDs to on-demand content streaming services, and the impact this had on retail stores. Industry giants stumble when they fail to innovate: an inability to adapt quickly to a changing technology landscape, the on-demand economy and shifting consumer behaviour can be catastrophic.
However, businesses in Asia Pacific and Japan (APJ) are starting to take notice and to understand the role IT plays in innovation. In fact, according to a recent survey, 54% of firms in APJ now see their IT teams as being at the heart of their business and believe IT is fundamental to their success. In addition, 87% of business executives in APJ understand that they need to collaborate with the IT department in order to meet business objectives.
At the same time, emerging technologies such as cloud computing, advanced mobility and big data present new business opportunities. The trouble is, many IT organisations aren’t equipped to capitalise on these trends quickly enough to deliver differentiated services as they’re created. Simply put, they’re saddled with traditional IT systems that are inefficient, slow and manually driven.
A new approach is needed. Rather than seeing infrastructure as a collection of servers, storage and networking gear, forward thinkers are aggregating pools of end-to-end ‘Compute’ resources that can be applied from edge to core, up and down an integrated workload stack, and paired with flexible economics and automated operations, to power a new style of business.
Flexible consumption models
There was a time when technology needed to be a fixed point. Servers and software could be tightly configured to handle a limited number of operations, squeezing cost out of the enterprise. Automation allowed for efficient handling of processes that rarely changed, because they didn’t need to. This ‘one size fits all’ approach will no longer work.
In the Compute era, IT leaders need to offer users and departments flexible consumption models for achieving business outcomes. We’re already seeing this dynamic at work in the public cloud as online retailers scale up resources to handle the holiday shopping rush. What if this same flexibility were afforded to the business unit manager who needs to unify a distributed development team ahead of a key deadline? What if business leaders could simply define their goals and order internal IT resources to support them, on demand, like any other service?
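What might that look like in practice? As a minimal sketch, an internal broker could expose pooled resources behind a simple allocation API. Everything below (the `InternalBroker` class, its budget model and the team names) is hypothetical, a toy illustration in Python rather than any particular product:

```python
from dataclasses import dataclass

@dataclass
class ResourceRequest:
    """A business-level request for pooled Compute resources."""
    team: str
    vcpus: int
    memory_gb: int
    duration_days: int

class InternalBroker:
    """Toy stand-in for an IT-run broker over a shared resource pool."""

    def __init__(self, pool_vcpus, pool_memory_gb, budgets):
        self.pool_vcpus = pool_vcpus
        self.pool_memory_gb = pool_memory_gb
        self.budgets = budgets  # remaining vCPU budget per team

    def allocate(self, req):
        """Carve an allocation out of the pool, within the team's budget."""
        if req.vcpus > self.budgets.get(req.team, 0):
            raise RuntimeError(f"{req.team} has exhausted its vCPU budget")
        if req.vcpus > self.pool_vcpus or req.memory_gb > self.pool_memory_gb:
            raise RuntimeError("shared pool cannot satisfy this request")
        self.pool_vcpus -= req.vcpus
        self.pool_memory_gb -= req.memory_gb
        self.budgets[req.team] -= req.vcpus
        return f"allocation/{req.team}/{req.vcpus}cpu-{req.memory_gb}gb"

# A business unit manager unifying a distributed dev team ahead of a deadline:
broker = InternalBroker(pool_vcpus=512, pool_memory_gb=2048,
                        budgets={"payments-dev": 128})
print(broker.allocate(ResourceRequest("payments-dev", 64, 256, 14)))
```

The point of the sketch is the shape of the interaction: the business states an outcome-sized request, and IT brokers it against shared pools and budgets rather than procuring new hardware.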
Financing should be just as flexible. Traditional, top-down IT may work for some companies. Others may prefer a managed hosting model in which owned resources are governed and apportioned by a third party, or may choose to rely on the public cloud. A growing number are pooling all their in-house gear and software for use as a service that IT leaders broker and that departments consume according to budgetary limits.
This is not a nice-to-have; it is a strategic imperative. Business moves too fast, especially when so much of it is governed by systems of engagement. Adapting to the users who engage with these systems, from mobile banking and e-commerce to online music stores, is no longer optional. Systems in the Compute era are designed with this sort of flexibility in mind, breaking the fixed, brittle moulds created by their predecessors. They are built with three distinct characteristics for serving business needs:
- Converged. Discrete servers are ineffective for serving ever-changing markets. Instead, we need pools of resources, virtualised and converged with networking, storage and management, that can be shared by many applications as well as managed and delivered as a service.
- Composable. In the Compute era, infrastructure isn’t metal; it’s fluid. Pools of processing power and storage are captured in a networked fabric and disaggregated so they can be quickly composed to service workloads, then decomposed back into the pool for others to use as the occasion calls for it. Importantly, this work is performed entirely in software and therefore requires a new architecture to implement (see the sketch after this list).
- Workload optimised. There’s a reason why legacy IT systems are rigidly implemented: rigidity, applied to a specific problem, puts the right resources to work in the right place. Flexible, assemble-on-demand Compute infrastructures confer the same level of customised performance, but without calcifying the underlying system.
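To make the composable idea concrete, here is a minimal sketch of the compose/decompose cycle described above; the `FabricPool` class and its resource model are invented for illustration, not an actual fabric API:

```python
class FabricPool:
    """Toy model of a disaggregated resource pool behind a software fabric."""

    def __init__(self, vcpus, storage_tb):
        self.vcpus = vcpus
        self.storage_tb = storage_tb

    def compose(self, vcpus, storage_tb):
        """Assemble a logical system for a workload from pooled resources."""
        if vcpus > self.vcpus or storage_tb > self.storage_tb:
            raise RuntimeError("insufficient free resources in the pool")
        self.vcpus -= vcpus
        self.storage_tb -= storage_tb
        return {"vcpus": vcpus, "storage_tb": storage_tb}

    def decompose(self, system):
        """Return a composed system's resources to the pool."""
        self.vcpus += system.pop("vcpus")
        self.storage_tb += system.pop("storage_tb")

pool = FabricPool(vcpus=1024, storage_tb=500)
reporting = pool.compose(vcpus=256, storage_tb=100)  # stand up a workload
# ... the workload runs ...
pool.decompose(reporting)                            # release back to the pool
print(pool.vcpus, pool.storage_tb)                   # 1024 500
```

Because composition is purely a software operation, the same physical gear can serve as a reporting cluster in the morning and a build farm in the afternoon.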
The evolving enterprise
How far can we push the Compute model? That remains to be seen, but there’s no doubt we’ve come a long way already. Organisations that used to spend thousands of dollars on licensing to slice up inefficient servers to get more value from them are now instructing their IT chiefs to build service bureaus that collect and distribute precious Compute resources where they’re needed, just in time.
We’ve already added analytics capabilities that allow systems to pre-emptively add Compute power to departments known to need it at certain times of the day or year, like when an e-tailer needs extra processing to handle traffic during flash sales or new product launches. Longer term, we’ll have autonomic systems that mirror the human immune system, applying software patches as if they were white blood cells dispatched to heal a wound, such as a cybersecurity breach.
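As a rough illustration of that pre-emptive, analytics-driven behaviour, capacity can be scheduled against known demand peaks; the calendar and numbers below are invented:

```python
from datetime import datetime

# Hypothetical demand calendar: (month, day) -> extra vCPUs predicted by analytics.
PEAK_CALENDAR = {
    (11, 27): 512,   # flash sale
    (12, 24): 256,   # holiday rush
}

def planned_capacity(now: datetime, baseline_vcpus: int) -> int:
    """Baseline capacity plus pre-emptive headroom on predicted peak dates."""
    return baseline_vcpus + PEAK_CALENDAR.get((now.month, now.day), 0)

print(planned_capacity(datetime(2015, 11, 27), baseline_vcpus=128))  # 640
```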
In that sense, Compute isn’t so much a technology model as an approach that’s flexible, service-oriented and designed to capture opportunities as they happen, and to head off disasters before it’s too late. Don’t let your company miss out when the technology is at your doorstep. Get ready to accelerate Compute.