Why saving money on data protection is a false economy
By Dave Russell, Vice President of Enterprise Strategy, Veeam
Tuesday, 27 September, 2022
With the looming threat of a recession, many businesses will be scaling back budgets, adjusting their plans and looking for new ways to cut costs in preparation for the economic downturn.
As IT budgets may be squeezed in the near future, teams will be re-evaluating priorities — balancing CapEx/OpEx against transformation plans and security considerations, among others. But what happens to data protection budgets? With three in four organisations suffering a ransomware attack last year alone, enterprises that cut costs on data protection should be wary.
Data protection budgets should be increasing, not shrinking
As enterprise tech stacks grow larger and more complex year-on-year, the amount of data businesses create and keep is scaling up at a similar rate. Because of this, we’re seeing an acceleration in the number of strategic workloads and applications that are considered ‘mission critical’. However, businesses’ ability to protect these workloads in the event of a disaster is not keeping pace.
The Veeam Data Protection Trends Report 2022 shows that 90% of organisations have an ‘availability gap’ between the service level agreement (SLA) expected uptime and how quickly IT teams can return to productivity. However, more concerning is that 89% of organisations have a ‘protection gap’ between the data they can afford to lose without affecting service and what data is actually backed up and protected.
Unfortunately, this gap is only getting wider. While data protection budgets have been increasing to improve system availability and enable faster disaster recovery, they’re still not increasing enough to keep up with accelerating strategic workloads. The impact of a recession and tighter IT budgets on this situation is difficult to predict. While decelerating digital expansion would theoretically give data protection strategies a chance to catch up, ‘through-crisis innovation’ has often been the secret to surviving an economic downturn, meaning applications and workloads may continue to scale. If protection budgets don’t rise alongside this, the gap will only grow wider.
Outages, whether caused by internal error or external breaches, cost an estimated $2000+ per minute and last an average of 78 minutes, according to the report. Thankfully, in many industries, such as financial services, data protection is highly regulated and so is non-discretionary. But verticals where regulations aren’t as tight leave room for enterprises to underfund and put themselves at risk, and if the gap is left unaddressed, or worse, allowed to widen, the impact and frequency of outages will only grow.
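Put together, the report’s figures imply a substantial cost per incident. A quick back-of-the-envelope calculation (assuming, for illustration, that cost scales linearly with outage duration) looks like this:

```python
# Back-of-the-envelope outage cost, using the report's estimates.
# Assumes cost scales linearly with outage duration.
COST_PER_MINUTE = 2000   # USD, lower-bound estimate from the report
AVG_DURATION_MIN = 78    # average outage length, in minutes

avg_outage_cost = COST_PER_MINUTE * AVG_DURATION_MIN
print(f"Average cost per outage: ${avg_outage_cost:,}+")  # $156,000+
```

Over $150,000 per average outage, before accounting for reputational damage or regulatory penalties, is the scale of loss an underfunded protection strategy is gambling with.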
Not knowing what to protect is costing enterprises
Against the backdrop of an economic downturn, how are IT teams meant to address this protection and availability gap? If budgets are tight, it’s more crucial than ever to ensure that investment is going to the right places. Any savings found through optimisation should be re-invested straight back into the data protection strategy.
An all-too-common reason that data protection budgets don’t stretch as far as they need to is simply that enterprises aren’t protecting and backing up the right data to begin with.
It all comes down to knowing what to protect, and there is still a long way to go. In an ideal world we would protect and back up everything all the time, but in practice, it comes down to knowing what is mission-critical and what isn’t. In the event of an outage or attack, what are the things that need to be recovered if you are to get back up and running as quickly as possible?
This sounds simple enough, but even within an organisation there are differing priorities, so what qualifies as ‘mission-critical’ will differ from team to team. If you misclassify ‘critical’ data, you risk wasting money and resources, and prolonging downtime, by securing the wrong things in the wrong order.
This dissonance means enterprises can even ‘overprotect’ themselves, tying up resources in one area while leaving potential gaps due to underfunding elsewhere. If multiple teams, for example the storage team, the backup team and the administrators, are protecting and backing up things in multiple ways, not only is this incredibly inefficient and expensive, but it doesn’t lend itself to fast disaster recovery. To quickly recover from downtime, you need one approach and one strategy.
To solve this, data protection teams need to perform a ‘business impact assessment’ to accurately define not just which data is important, but which is most important. Prioritise the mission-critical information and build a recovery plan that protects and restores this information as fast as possible. Think of it like a lifeboat exercise: if you can only put 10 people on one boat, who do they need to be?
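In practice, the output of that assessment is a ranked recovery order. A minimal sketch of the idea, with entirely hypothetical workload names, impact scores and recovery time objectives (RTOs), might look like this:

```python
# Illustrative sketch of a business impact assessment ranking.
# Workload names, impact scores and RTO targets are hypothetical.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    impact: int       # business impact if unavailable (1 = low, 5 = critical)
    rto_minutes: int  # target recovery time objective

workloads = [
    Workload("internal wiki", impact=1, rto_minutes=1440),
    Workload("payment gateway", impact=5, rto_minutes=15),
    Workload("customer portal", impact=4, rto_minutes=60),
    Workload("reporting warehouse", impact=2, rto_minutes=480),
]

# Recover the highest-impact, tightest-RTO workloads first.
recovery_order = sorted(workloads, key=lambda w: (-w.impact, w.rto_minutes))
for rank, w in enumerate(recovery_order, start=1):
    print(f"{rank}. {w.name} (impact {w.impact}, RTO {w.rto_minutes} min)")
```

The point is not the scoring scheme itself but that the ranking is agreed once, across teams, so that everyone restores in the same order when an outage hits.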
Getting the most value out of your tech stack
Enterprises need to continue to invest in backup and disaster recovery, but with budgets tightening, what kind of solutions are the best bets? Ultimately there is no universal answer as it depends on requirements and product fit, but there are considerations to be made if budgets are stretched.
From a licensing perspective, the ability to reuse infrastructure within a new solution can decouple the additional branded hardware costs that are often bolted on to software licenses. Likewise, a universal license means your backup system is not vendor-locked, allowing you to easily transfer your backups from on-premises to the cloud or from one cloud vendor to another.
Another cost consideration for backup and disaster recovery is the complexity of the solution and the associated labour costs that can come with it. Protecting data across workloads, particularly in hybrid infrastructures or with complex architectures like Kubernetes, is a big task, but an equally complex backup solution can quickly make your cost structure unmanageable. Selecting a product that is intuitive and prioritises usability reduces the need for dedicated staff or specialised training.
Despite uncertain economic times, enterprises’ digital transformation plans will continue. Businesses cannot stand still. IT and data protection teams have a big task ahead keeping up with ramping workloads and closing the gap between technology and how well it is backed up and protected. As budgets constrict, enterprises need to make every per cent count: ensure the right workloads and applications are prioritised and protected, and that a simple, flexible, reliable and powerful backup solution is in place. Only then can enterprises ensure they’re sufficiently protected and ready for turbulent times ahead.