The data centre of the next decade
We are in the middle of the third great revolution in technology delivery. The first was the mainframe era - computing power was centralised and end-user devices were little more than dumb terminals. Then came the PC era: massive increases in client-side computing power swung the pendulum completely the other way, and servers were relegated in importance.
We are now in the third wave. Services are again centralised, as data centres have grown in computing power and capacity, while end users enjoy an enormous variety of devices and the freedom to use them almost anywhere. What does all this mean if we are planning a data centre strategy that will see us through the next decade?
The nature of business, and what CIOs need to deliver to it, is changing faster than most IT organisations can react. CIOs and decision makers are working at a time when business cycles are contracting and change is accelerating.
Robert Le Busque, Area VP of Strategy and Development in Asia Pacific for Verizon, explains: “When the curtain was falling on the Beijing Olympics the iPad didn’t exist, the iPhone was only an infant and the digital universe was five times smaller than it is today.”
The way businesses look at delivering applications and other services must adapt. During the mainframe and client-server eras, the IT department had control of the technology supply chain from infrastructure to applications. According to Trevor A Bunker, a Vice President with CA Technologies, CIOs will be designing and managing data centres that bear little resemblance to those of today.
“The data centre of the future, from the CIO’s view, will not include the infrastructure. The infrastructure will be completely decoupled. I don’t think that when CIOs think about the data centre that they’ll even concern themselves with the infrastructure.”
This raises the question - what is a data centre?
In our view, the data centre is where applications, business communications and business logic reside. Typically, the data centre has also included physical assets like servers, storage and networking equipment, but those have always been there to serve the business.
Bunker believes that the physical infrastructure will be less important as time goes on. “Infrastructure provides limited competitive advantage. Infrastructure is ubiquitous. Everyone has pretty much equal access. The real competitive advantage is going to emerge from how we use the IT services running on the infrastructure regardless of where the infrastructure is.”
Still, what about the data centre of 2025? What will it look like? Will we still be constructing large rooms filled with hermetically sealed hallways and aisles of blinking lights? It’s hard to see a future where data centres aren’t part of the picture.
According to Gartner Research VP Phillip R Sargeant, data centres are becoming far denser: more power is required in a smaller space, more heat is generated and even the physical weight of equipment is increasing. All of this means that the choices CIOs need to make when building a data centre aren’t the same now as they were even five years ago.
There is no data to suggest energy prices will fall in the foreseeable future. Every year the cost of electricity rises. The federal government’s carbon tax hasn’t yet had a significant impact on prices, but there’s little doubt that large users of power will continue to feel it on the bottom line. Although there’s no universally agreed forecast, it’s reasonable to plan for bills increasing by between 5% and 15% per year over the next decade.
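To see what that planning range means in practice, it helps to compound it over the decade. The short sketch below is illustrative only: the 5% and 15% figures are the range above, and the starting bill is an invented number, not drawn from any real facility.

```python
# Illustrative only: compound the 5-15% annual increase range over a decade.
# The starting bill is a hypothetical figure chosen for the example.
starting_annual_bill = 500_000  # dollars per year (hypothetical)

for annual_increase in (0.05, 0.15):
    bill = starting_annual_bill * (1 + annual_increase) ** 10
    print(f"{annual_increase:.0%} p.a. -> ${bill:,.0f} after 10 years "
          f"({bill / starting_annual_bill:.1f}x today's bill)")
```

Even at the bottom of that range the bill grows by roughly two-thirds over the decade; at the top it roughly quadruples. That is why power costs now sit alongside capital costs in any data centre business case.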
Google’s answer to this is to build data centres where there’s access to cheap, reliable power. The locations Google chooses for new data centres show that access to cheap cooling, in order to reduce power costs, is as significant a consideration as proximity to communications.
Google’s data centre in Hamina, Finland, is able to take advantage of local seawater for its cooling system. Phillip Sargeant of Gartner says: “There are a lot of providers of data centres today building data centres in areas that they perhaps haven’t thought about before. They want to make use of the characteristics of the location. With cold locations they can use outside air for cooling for example.”
A critical measure of data centre power use is power usage effectiveness (PUE). PUE is the ratio of the total power drawn by a data centre to the power delivered to the IT equipment itself. The aim is to achieve a PUE of 1.0 - where all of the power being used by the data centre goes directly to computing. The further the ratio rises above 1.0, the more power is being consumed by supporting functions such as cooling, lighting and power distribution.
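To make the ratio concrete, here is a minimal sketch of the calculation. The power figures are invented for illustration and are not taken from any facility mentioned in this article.

```python
# Illustrative PUE calculation - the metered figures below are hypothetical.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

it_load = 1000.0        # kW drawn by servers, storage and networking (hypothetical)
facility_load = 1700.0  # kW drawn by the whole site, including cooling (hypothetical)

print(f"PUE = {pue(facility_load, it_load):.2f}")  # 1.70
```

A result of 1.7 - the target Fujitsu set for the Noble Park upgrade discussed below - means roughly 0.7 kW of cooling and other overhead for every kilowatt that reaches the IT equipment.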
Fujitsu recently upgraded a data centre in Noble Park, a suburb of Melbourne, with the aim of reducing its PUE to 1.7. The 6700 m² data centre was built in 1988. Built to Tier III standards, it incorporates four main data halls for cabinet and cage installations. The company reports the greenhouse gas emissions of Noble Park, along with those of every other facility in its Australian data centre network, to the National Greenhouse and Energy Reporting System.
Location, location, location
It’s interesting to look back at past research and see what the issues were. In 2005, Cisco’s advice focused on protection from hazards, easy accessibility and features that accommodate future growth and change. A significant part of one research paper described how to plan for natural disasters and even listed earthquake statistics.
Today, the location requirements are quite different. The accelerating density of computing power and rapidly increasing reliance on an ‘always on’ infrastructure means that our expectations of data centres have changed. While the considerations highlighted in Cisco’s report are still important, there are new things to consider for the data centre of the next decade and beyond.
In order to minimise the operational costs associated with running a data centre, businesses may need to reconsider locations. In the past, it made sense to put the business and data centre close together. However, given that connections across and between continents are getting faster and more reliable, it’s possible to choose locations with access to cheaper power and better cooling.
In the aftermath of the earthquakes that devastated Christchurch in early 2011, we toured the region and visited a new data centre operated by Computer Concepts Limited. CCL’s facility escaped damage - although the placard on the door telling us that the building had been checked was a poignant reminder of the destruction not far away - and it highlighted some key considerations. CCL was planning to secure its own water supply for cooling so that it wasn’t solely dependent on power-hungry refrigeration.
Phillip Sargeant of Gartner highlighted to us that some data centre operators are now looking to go it alone when it comes to power as well. “There are two or three data centres using trigeneration where people have their own power plants to power data centres. Typically, some use natural gas to provide power into their own data centres,” he said.
One of the challenges of any power generation technology is the inherent loss that occurs. For example, when electricity is produced at a gas-fired power station, significant amounts of energy are lost in the form of heat. Trigeneration seeks to avoid that waste by generating on site: natural gas produces electricity, while the heat from generation is captured for heating and can drive absorption chillers for cooling.
In addition, natural gas produces far lower carbon emissions than coal and many other alternatives. As carbon emissions become a greater impost on the bottom line, being able to produce energy with lower emissions can make a real financial difference.
A recent trigeneration implementation by the National Australia Bank cost $6.5m but was expected to deliver $1m per annum in savings.
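On those figures the investment case is easy to sketch. The calculation below uses only the cost and annual saving quoted above and deliberately ignores financing, maintenance and future energy price movements, all of which a real business case would need to include.

```python
# Simple payback on the trigeneration figures quoted above.
# Ignores financing costs, maintenance and energy price changes.
capital_cost = 6_500_000   # dollars
annual_saving = 1_000_000  # dollars per year

print(f"Simple payback: {capital_cost / annual_saving:.1f} years")  # 6.5 years
```

A simple payback of six and a half years is long by IT hardware standards, but modest against the working life of a data centre facility, which is why the numbers can stack up.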
What about the cloud?
There’s no doubt that decisions about your business’s data centre needs will turn to the elephant in the room - the cloud. Past decisions were driven by different needs. As Trevor Bunker of CA Technologies puts it: “Whether it was the purely centralised model years ago, then client-server, each of the evolutions we’ve done for business applications has relied on one thing. That’s LAN-speed network connections. That’s how we built our enterprise apps. We assumed that they would run in the enterprise for the enterprise.”
But those assumptions have been superseded. Bunker adds: “Anyone who’s thinking about building a data centre - I would really have to ask them why. Why would you make that capital investment today? Is it really that strategic and that valuable to your business? Is it a competitive advantage for you? Many answer with, frankly, it probably isn’t. But it’s how we’ve done things in the past.”
Issues of data sovereignty, confidentiality, reliability, connectivity and commercial arrangements dominate any discussion of cloud services. It’s interesting that service providers are starting to take a more active role in our region, with Rackspace opening a new data centre in Australia and making specific mention of how it won’t be subject to the Patriot Act - although there’s considerable debate about the veracity of that claim.
Both IDC and Gartner have recently published research suggesting that a hybrid approach will be a viable option. So it’s likely that your data centre in 2025 will combine some local services with others hosted externally or delivered as cloud applications. Your data centre will no longer be bounded by the physical footprint of your premises.
What will your data centre look like in the next decade and beyond?
It will be denser with more computing power per square metre than today. But it will also require more power and generate more heat. You’ll be a lot smarter about where you build the data centre - if you build one at all - and you’ll probably start by looking at the energy and carbon footprint as closely as the physical specifications of the equipment.
You’ll consider making it either energy self-sufficient or less dependent on power from the grid.
Where running your own infrastructure offers no competitive advantage or clear cost benefit, you’ll probably use cloud services from providers that can deliver on your operational needs and energy management goals.
What is clear is that the days of companies building large rooms with raised floors, expensive temperature management and large capital investments are fading because the criteria for making the investment decisions are changing.