Consider costs and mitigate the risk of ransomware when selecting the right data protection solution
Data protection was very simple back in the ’90s when I started my tech career. Most of the time it was just a matter of scheduling a full daily backup to an attached tape drive. This worked fine for most companies because each server held a relatively small amount of data, and it was easy for backup operators to take the tapes offsite or store them in a fireproof safe.
Simple solutions such as these will not scale with the amount of data that we’re using today. The many ways that companies generate and use data require a modern data protection solution. As a result, companies need to think carefully about their data protection requirements before deciding on the best solution to fit those needs.
New challenges and considerations
Companies are also facing a landscape full of threats, regulations, and data management issues that weren’t around back in the ’90s. Here are a few common concerns:
- Protecting against ransomware and cyber-attacks that threaten your business
- A huge increase in the amount of data you need to protect
- Data located in public cloud services such as Microsoft Office 365
- Meeting GDPR compliance and ensuring personal data is safeguarded
Of all of these, the security of your backup is arguably the most important and causes the most concern right now. But costs are also increasingly a factor. What should you keep in mind when you are choosing a backup solution?
One of the things that surprises me is that customers continue to run their backups on platforms that are susceptible to cyberattacks. Due to its popularity, Windows Server is one of the most targeted platforms on the planet. Over the years, cybercriminals have found many backdoors into Windows Server, from the early days of Windows NT all the way through to Windows Server 2019.
Cyber-attacks are becoming increasingly prevalent in everyday life. One of the leading mid-market backup products that built its reputation providing strong virtual server backups has been heavily targeted by cybercriminals. That company’s own community forum site has many instances where users have documented how a ransomware or malware attack has disabled the backup service, rendering the software useless.
Attackers will target the backup software by encrypting config files, deleting registry keys, and disabling the dedupe indexes and hash files so the backup data cannot be accessed. There are even cases where replicated backups have also been taken out as they are running on the same Windows domain.
I read another article recently about how hackers will interrogate your Active Directory (AD) servers to obtain the backup service account and password, log into your backup manager (either on-premises or in the cloud), and delete the backup sets on disk, leaving the customer unable to restore their own servers and data. Any vendor that runs its backup service on a Windows Server platform is at higher risk, simply because the underlying platform remains a priority target for cybercriminals.
Another key factor in protecting your business is ensuring you have the correct backup architecture in place. This will help to mitigate against the risk of your backup data being compromised by cybercriminals or a true DR event.
There is a well-known industry practice called the 3-2-1 backup strategy, which ensures that data is protected and backup copies are available when needed. The basic concept is that three copies are made of the data to be protected, the copies are stored on two different types of storage media, and one copy is kept offsite or offline. Every backup vendor promotes this methodology because you do not want all your eggs in one basket if the backup server gets destroyed or compromised. Without a usable backup, you are left completely at the mercy of the cybercriminal when you try to recover from an attack.
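To make the rule concrete, here is a minimal sketch of how you might check a backup plan against 3-2-1. The `BackupCopy` type, its fields, and the example media types are hypothetical illustrations, not any vendor's API:

```python
# Illustrative sketch of the 3-2-1 rule: 3 copies of the data,
# on 2 different media types, with 1 copy offsite or offline.
# BackupCopy and its example values are hypothetical.
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str       # e.g. "disk", "tape", "cloud"
    offsite: bool    # stored offsite or offline?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """Return True if this set of copies meets the 3-2-1 rule."""
    enough_copies = len(copies) >= 3
    two_media = len({c.media for c in copies}) >= 2
    one_offsite = any(c.offsite for c in copies)
    return enough_copies and two_media and one_offsite

# Example: primary on disk, second copy on tape, third in the cloud
plan = [
    BackupCopy("disk", offsite=False),
    BackupCopy("tape", offsite=False),
    BackupCopy("cloud", offsite=True),
]
print(satisfies_3_2_1(plan))  # True
```

Note that two copies on the same backup server would fail all three checks, which is exactly the "all your eggs in one basket" scenario described above.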
Businesses worldwide need more storage every year. IDC estimates there are 33 zettabytes of storage in the world today (a zettabyte is 1,000 exabytes, 1 million petabytes, or 1 billion terabytes) and expects this to rise to 175 ZB by 2025. Much of this growth will be fueled by IoT devices collecting data, which will drive a huge surge in cloud storage consumption.
Even if your business is not based around IoT devices, your storage requirements will continue to grow. It doesn’t matter what line of business you are in or if your data is on-premises or in the cloud or both — businesses are creating more data and keeping data longer.
When sizing a data protection solution for the next three to five years, it’s important to factor anticipated growth into the calculation. Even customers who do not anticipate much growth in their production storage will almost certainly grow faster than planned. Most vendors have CAGR (compound annual growth rate) measurements built into their storage calculators to ensure you purchase a solution with enough backup storage for your needs. I recommend that you overestimate rather than underestimate projected storage growth.
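The underlying arithmetic is simple compound growth. As a rough sketch (the starting size and growth rate below are made-up examples, not vendor figures):

```python
# Hedged sketch: project backup storage needs with compound annual
# growth. The 50 TB starting point and 20% rate are hypothetical.

def projected_storage_tb(current_tb: float, annual_growth: float, years: int) -> float:
    """Storage needed after `years` of compound growth.
    annual_growth is a fraction, e.g. 0.20 for 20% CAGR."""
    return current_tb * (1 + annual_growth) ** years

# 50 TB today, growing 20% per year, sized for a 5-year term:
needed = projected_storage_tb(50, 0.20, 5)
print(round(needed, 1))  # 124.4
```

Even a modest 20 percent annual growth rate more than doubles the storage requirement over a five-year contract, which is why underestimating growth is so costly.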
Office 365 market adoption
Microsoft Office 365 adoption has accelerated in the past two years. In 2018, market adoption was around 56 percent, and in 2019 this grew to 79 percent. The use of Microsoft Office 365 is standard practice in most businesses today. There has also been rapid growth in Microsoft Teams use since the COVID-19 pandemic started earlier this year and more people began working from home.
You need to keep in mind that the data sitting in the Microsoft cloud is yours, not Microsoft’s. It is your responsibility to protect this data, and backup should not be overlooked, so factor Office 365 into your data protection plan. When reviewing compliance mandates such as GDPR, it can be easy to overlook your data sitting in the Microsoft cloud. Even Microsoft states in the Microsoft Services Agreement (Section 6b) that you should use a third-party application to back up your data and configuration, as Microsoft cannot guarantee recovery in the event of an outage.
Consider the costs
I also want to discuss my concerns about purchasing a software-only solution. At first glance, software-only offerings seem cost-effective and flexible. However, I have seen firsthand customers who experienced the hidden costs of such a solution; TCO (total cost of ownership) can be high with software-only solutions.
You have to purchase:
- Multiple high-spec Intel servers with multiple CPU cores and lots of RAM
- Windows Server licenses
- SSD volumes for dedupe data/indexes
- SAS/SATA storage for your data
- Multiple high-speed networking cards, etc. for protecting your on-premises environment
- Replacement servers (they typically must be upgraded every four to five years)
- More of all of the above as you scale
You can reduce some of these costs by using virtual servers at remote sites where the data footprint is smaller. But you will still face the challenge of retaining long-term backups for compliance reasons. This means you either have to purchase a lot more storage or go back to using tape technologies to archive backups for long periods of time. Do you really want to do that?
In addition to cost considerations, you have the software-only implementation project to consider. To ensure the solution is architected properly, it should ideally be installed and configured by an expert. Project management may be required to get the solution working quickly so you can start realizing the ROI (return on investment) you expect. These professional services costs can be high; in some cases, they can exceed the cost of the data protection software itself.
The costs can go even higher when you want to leverage the cloud with a software-only solution. Public cloud can make a lot of sense as an offsite copy of your data for disaster recovery (DR) or as the primary backup target for your Office 365 backups.
If you have to set up cloud storage, there are some factors to keep in mind:
- You have to choose and sign up with a public cloud provider, e.g. Azure or AWS.
- You will need to build powerful VMs with fast storage.
- Storage in the cloud can be a bit of a minefield. You have to decide whether you need hot storage, cold storage, zone-redundant storage, or geo-redundant storage, and all of these may have different pricing models depending on how often you access the data. That can be tough to figure out, as you might not know how often you will need to access the cloud storage.
- You also need to build and configure the networking and firewalls to ensure your cloud environment is secure.
- You have to install your backup software in the cloud and join the dots with your on-premises backup infrastructure, which takes a lot of work and effort to keep running.
- This workload has to be managed by the IT department, which may not be familiar with the public cloud.
Managing and monitoring infrastructure as a service (IaaS) could require a steep learning curve by your IT staff.
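To see why the tiering decision is such a minefield, consider a back-of-the-envelope cost model. All tier names used generically and all per-GB prices below are hypothetical placeholders, not real cloud provider rates:

```python
# Illustrative sketch: cloud storage cost depends on both capacity
# stored AND how often you read the data back. All rates below are
# hypothetical placeholders, not actual Azure/AWS pricing.

TIERS = {
    #           ($/GB-month stored, $/GB retrieved)
    "hot":     (0.020, 0.000),
    "cool":    (0.010, 0.010),
    "archive": (0.002, 0.050),
}

def monthly_cost(tier: str, stored_gb: float, retrieved_gb: float) -> float:
    """Estimated monthly bill for one tier: storage plus retrieval fees."""
    store_rate, retrieve_rate = TIERS[tier]
    return stored_gb * store_rate + retrieved_gb * retrieve_rate

# 10 TB of backups; restores read back 500 GB in a typical month
for tier in TIERS:
    print(tier, round(monthly_cost(tier, 10_000, 500), 2))
```

With light restore activity the archive tier looks cheapest, but as retrieval volume grows its per-GB restore fees can overtake the hot tier's flat storage price. That crossover is exactly why it is hard to pick the right tier when you don't yet know your access patterns.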
I have seen customers learn hard lessons from making the wrong choices, which wound up being far more expensive than originally anticipated. When selecting the ideal backup solution, there are several key factors that will make your life much easier:
- Identify data protection solutions that do not run on targeted platforms. This will remove a huge amount of risk to you and your business.
- Plan for storage growth. Many customers get this wrong and end up having to purchase more storage or larger backup appliances halfway through the term of their contract.
- Be wary of the hidden costs of buying a product that is software only. The purchase price may tempt you, but the TCO can be high. These projects put a huge amount of extra work on your IT team, who are already very busy. Therefore, the implementation cost needs to be factored into the plan.
- Public cloud can be a fantastic platform for off-site data protection and a great platform for Microsoft Office 365 protection. Be wary of doing public cloud DIY. Building an IaaS platform to host your data protection software is complex. The recurring costs can be very high and can increase over time.
- A solution that includes a hardware refresh helps keep costs predictable over time. It can also minimize ongoing costs, especially in years four and five.
- Choosing a managed service cloud or SaaS will reduce your headaches. It will be someone else’s problem to manage the infrastructure — and those people are experts in that field. SLAs can ensure the platform is always available to support your business.
- Look for a single offering that can provide on-premises backup and cloud-based backups, such as Microsoft Office 365. It can make your life a lot easier if you have one throat to choke, so to speak, and one support team to work with if something goes wrong.
Barracuda can help
Barracuda has a great range of data protection solutions that address all these challenges. We offer:
- On-premises data protection using physical and virtual backup appliances running on a hardened Linux OS
- Point-in-time configuration rollback to ensure backups are safeguarded from ransomware and malware attacks
- Multi-site protection/replication and off-site replication into a managed cloud DR service (SaaS)
- Unlimited cloud storage so you can control your costs over time and vault monthly and yearly backups for up to seven years for long-term data protection
- Instant Replacement, which ships you a replacement appliance the next business day if your hardware fails and includes a brand-new backup appliance every four years to manage your TCO more effectively

Barracuda Cloud-to-Cloud Backup protects your Microsoft Office 365 data and backs it up directly to the Barracuda Cloud:

- Unlimited cloud storage so backups can be retained forever if required
- No infrastructure to worry about (on-premises or in the cloud); your Microsoft Office 365 data stays in the cloud, and you simply manage the process