
Software-defined storage: A modern approach to backup

by Eric Carter on January 6, 2017

Last weekend I was going through a stack of old boxes as part of my ongoing quest to clean out the garage and came across my old EMC Data Domain badge. It made me think about just how far we haven’t come in backup. Not Data Domain in particular; they did a lot of innovative things back in the day. I’m thinking principally of how most businesses today, in an age of distributed computing, various permutations of cloud, and digital transformation, still rely on traditional backup.

Today’s businesses generate seemingly countless mountains of data at a pace that is only accelerating. The amount of data stored by businesses, for example, grows by roughly 60 percent each year. The digital universe more than doubles every two years and is on track to balloon from 4.4 trillion gigabytes in 2013 to 44 trillion gigabytes by 2020. That’s an astounding amount of data that must be backed up and protected against hardware and software failure, malicious actors, human error, natural disasters, and the like.

Some 45 percent of overall storage capacity is dedicated to backup and archive data, and 82 percent of companies have at least 10 (yes, TEN) copies of their data backed up or archived.

[Infographic: backup and archive statistics]

No doubt most of you in IT can relate: backup (or some form of data protection) is necessary, but it can be the bane of our existence. Traditionally, you’d use your backup software, on a per-server basis every day, to copy all or some of the newly stored data to some sort of tape library or storage array. Then along came some clever companies with technologies like deduplication, which made things more efficient by eliminating the redundant data that can account for upwards of 95 percent of a backup stream. This all worked well enough, but these purpose-built backup appliances were pretty costly, and when you ran out of space, you were left to guesstimate how much storage you’d need for the next 3 to 5 years so you could purchase and roll in a new device to meet your foreseeable future needs.
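To make the dedup idea concrete, here is a minimal sketch of the general technique in Python, using fixed-size chunks and SHA-256 fingerprints. It illustrates the concept only, not how Data Domain or Hedvig actually implement it, and the chunk size and in-memory store are arbitrary choices for the example.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real systems often use variable-size chunking

def dedup_store(stream: bytes, store: dict) -> list:
    """Split a backup stream into chunks, keep each unique chunk once,
    and return the list of fingerprints that describes the stream."""
    recipe = []
    for i in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[i:i + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint not in store:   # only new, never-before-seen data consumes space
            store[fingerprint] = chunk
        recipe.append(fingerprint)
    return recipe

# Two nightly "full" backups that are ~95% identical share most of their chunks.
store = {}
night_1 = dedup_store(b"A" * 1_000_000, store)
night_2 = dedup_store(b"A" * 950_000 + b"B" * 50_000, store)
stored = sum(len(c) for c in store.values())
print(f"logical data: 2,000,000 bytes; physically stored: {stored} bytes")
```

Because the second night’s data mostly repeats the first, only a handful of chunks are physically written, which is the same effect the backup appliances delivered.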

Surely by now there must be a better, easier, and less expensive way, right? Thankfully, there is, and it’s a combination of a bit of the old with ample helpings of the new. That better way to deal with the onslaught of data we need to protect is to take advantage of modern software-defined storage techniques, epitomized by the Hedvig Distributed Storage Platform. Now, I’m not proposing that you ditch all the software and backup schedules you currently have in place. Instead, I’m suggesting that you investigate how you might replace where you’re storing all that backup data. Specifically, you can easily point your existing backup software (whether it’s Veritas NetBackup & Backup Exec, Commvault Simpana, Veeam Backup & Replication, EMC NetWorker, IBM Spectrum Protect, HP Data Protector, Arcserve UDP, or something else) at a scale-out, software-defined storage solution. For our solution, this means using a Hedvig Virtual Disk as a backup-to-disk target.

[Diagram: existing backup software writing to a Hedvig Virtual Disk as a backup-to-disk target]
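The practical point of the diagram is that a virtual disk presented over a block or file protocol looks to the backup application like any other disk path. As a rough sketch, assuming the virtual disk has been exported and mounted at a hypothetical path such as /mnt/hedvig-vdisk01 (not a path from this post), redirecting a scripted backup job is little more than a one-line target change:

```python
import shutil
import tarfile
from pathlib import Path

# Hypothetical paths for illustration: the old target might be staging space in
# front of a backup appliance; the new one is a mounted virtual disk exported by
# the software-defined storage cluster (e.g., over NFS or iSCSI plus a filesystem).
OLD_TARGET = Path("/backup/appliance-staging")
NEW_TARGET = Path("/mnt/hedvig-vdisk01")      # <- the only line that changes

def backup_to_disk(source_dir: str, target: Path, job_name: str) -> Path:
    """Write a compressed archive of source_dir to the given disk target."""
    target.mkdir(parents=True, exist_ok=True)
    archive = target / f"{job_name}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=job_name)
    return archive

if __name__ == "__main__":
    free = shutil.disk_usage(NEW_TARGET).free if NEW_TARGET.exists() else 0
    print(f"free space on target: {free / 1e9:.1f} GB")
    # backup_to_disk("/var/lib/app-data", NEW_TARGET, "nightly-app-data")
```

Commercial backup suites work the same way at a higher level: you define a new disk pool or storage unit that points at the mounted target and leave the rest of the job configuration alone.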

Hedvig storage software runs on commodity x86 or ARM-based servers or cloud instances to form a dynamically scalable, multi-site storage resource pool. This not only lowers your overall cost of backup storage by taking advantage of off-the-shelf hardware, it also means you can flexibly accommodate storage needs, including the unexpected spikes that inevitably occur, simply by adding resources incrementally as needed (no more guesswork capacity exercises and unnecessary purchases).
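To see why incremental scaling beats forward-buying, here is a back-of-the-envelope comparison in Python; the growth rate, starting capacity, node size, and utilization threshold are made-up example figures, not Hedvig sizing guidance.

```python
# Back-of-the-envelope comparison: buy 5 years of capacity up front vs.
# add commodity nodes only when cluster utilization crosses a threshold.
GROWTH_PER_YEAR = 0.60        # assumed 60% annual data growth (example figure)
START_TB = 100.0              # current backup data set (example)
NODE_TB = 50.0                # usable capacity added per commodity node (example)
YEARS = 5

forward_buy_tb = START_TB * (1 + GROWTH_PER_YEAR) ** YEARS   # purchased on day one

data_tb, cluster_tb, nodes = START_TB, 0.0, 0
for year in range(YEARS + 1):
    while cluster_tb < data_tb / 0.80:     # keep utilization under ~80%
        cluster_tb += NODE_TB
        nodes += 1
    print(f"year {year}: data {data_tb:7.1f} TB, cluster {cluster_tb:7.1f} TB ({nodes} nodes)")
    data_tb *= 1 + GROWTH_PER_YEAR

print(f"forward buy on day one: {forward_buy_tb:.0f} TB, mostly idle for years")
```

The point isn’t the specific numbers; it’s that capacity tracks actual demand instead of a five-year guess.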

The Hedvig solution delivers a comprehensive suite of enterprise storage capabilities, such as inline global deduplication, compression, snapshots, clones, and replication, to help meet any protection, disaster recovery, and availability requirement. A single logical storage cluster can span two or more data centers and clouds, giving you the flexibility to set policies and locate data copies where you need them, so your business is protected while you eliminate the cost and complexity of managing disparate solutions.
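As a rough illustration of what per-volume policy selection might look like, the sketch below models a backup volume’s settings as a simple data structure; every field name here is hypothetical and chosen for readability, not taken from the Hedvig API.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualDiskPolicy:
    """Hypothetical per-volume settings; field names are illustrative only."""
    name: str
    size_tb: int
    deduplication: bool = True      # inline, global dedup suits highly redundant backup data
    compression: bool = True
    replication_factor: int = 3     # number of copies kept across the cluster
    residence: list = field(default_factory=lambda: ["dc-west"])  # sites/clouds holding copies

# A backup-to-disk target replicated to a second data center and a public cloud region.
backup_target = VirtualDiskPolicy(
    name="nbu-backup-target-01",
    size_tb=200,
    replication_factor=3,
    residence=["dc-west", "dc-east", "aws-us-east-1"],
)
print(backup_target)
```

The idea is that protection and placement become attributes of the volume rather than properties of a particular box in a particular rack.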

Here are some more advantages of doing backup with Hedvig:

  • Customize storage to fit your service levels. Set features on a per-volume basis to best fit your protection and disaster recovery requirements.
  • Integrate with any backup and archiving app. Block, file, and object storage interfaces give maximum flexibility for using a Hedvig cluster with your existing backup software.
  • Deliver predictable, high-speed ingest rates. Ensure data is protected within backup windows (see the quick sizing sketch after this list).
  • Improve RPO and RTO service level agreements. Protect data more frequently and speed recovery to eliminate downtime and data loss.
  • Protect data across sites and clouds. Automatically replicate data to offsite data centers and clouds like AWS, Azure, and GCP for disaster avoidance and high availability.
  • Create point-in-time snapshots and clones. Support off-host, application-consistent backups, and rollback volumes for quick recovery.
  • Scale seamlessly with an elastic cluster. Scale capacity on the fly with your choice of standard commodity servers.
  • Eliminate forklift upgrades. Refresh hardware without disruption by adding new nodes and removing old nodes from the cluster.
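
On the ingest-rate bullet above, a quick sizing calculation shows why predictable throughput matters for meeting a backup window; all figures here are illustrative assumptions, not benchmarks.

```python
# Quick check: what aggregate ingest rate does a nightly job need to finish in its window?
DATA_TO_PROTECT_TB = 40        # nightly backed-up data (example)
BACKUP_WINDOW_HOURS = 8        # allowed window (example)

required_gb_per_s = DATA_TO_PROTECT_TB * 1000 / (BACKUP_WINDOW_HOURS * 3600)
print(f"required sustained ingest: {required_gb_per_s:.2f} GB/s")

# With a scale-out cluster, aggregate ingest grows with node count, so the same
# window can be met by adding nodes rather than replacing an appliance.
PER_NODE_GB_PER_S = 0.5        # assumed per-node ingest rate (example figure)
nodes_needed = -(-required_gb_per_s // PER_NODE_GB_PER_S)   # ceiling division
print(f"nodes needed at {PER_NODE_GB_PER_S} GB/s each: {int(nodes_needed)}")
```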

Most of the businesses we work with today are focusing on a bimodal IT approach to technology: stewarding legacy solutions while implementing modern IT practices. Backup is one of those areas where organizations can blend investments they’ve already made with newer ways of managing systems and data. I anticipate bimodal is the way things are going to be for some time. As long as companies still need to make discrete copies of their data, why not take advantage of software-defined storage to help?

I would be remiss, of course, not to mention the innovation taking place across the entire data protection stack. For instance, much work is being done to leverage modern software and architectures, such as those emerging in hyperconverged backup and copy data management. Who knew that the often less-than-exciting domain of backup could be ripe for so much innovation?

By focusing on an approach that is simpler, more modern, flexible, scalable, and cost-effective, customers are figuring out how to elevate backup and data protection from the drag it used to be to something far more efficient and automated.

Now, if only finally cleaning out my garage were this easy!

To learn more about using software-defined storage for backups (and archives, too!), check out our infographic. Just click below.

View Infographic

Eric Carter

Eric is Senior Director of Marketing at Hedvig. He joins Hedvig from Riverbed, EMC, and Legato Systems and enjoys rocking the guitar along with his acoustic cover band in his free time. Eric has a BA in Public Relations from Pepperdine University.
See all authored articles