MDP – Yeah You Know Me

Data protection is a funny thing. Much like insurance, most folks understand that it’s important, normally dread having to use it, and dislike the fact that it costs money “just in case something goes wrong”. But then they get hit by ransomware, or Judy in Accounting absolutely destroys a critical spreadsheet, and they realise it’s probably not such a bad thing to have this “data protection”.

Books are weird too. Not the idea that we’ll put a whole bunch of information in a file and make it accessible to people. Rather, that sometimes that information is given context and then printed out, sold, read, and stuck on a shelf somewhere for future reference. Indeed, I was a voracious consumer of technical books early in my career, particularly when many vendors were insisting that this was the way to share knowledge with end users. YouTube wasn’t a thing, and access to manuals and reference guides was limited to partners or the vendors themselves. The problem with technical books, however, is that if they cover a specific version of software (or hardware or whatever), they very quickly become outdated in potentially significant ways. As enjoyable as some of those books about Windows NT 4.0 might have been for us all, they quickly became monitor stands when Windows 2000 was released. The more useful books were the ones that shared more of the how, what, when, and why of the topic, rather than digging into specific guidance on how to do an activity with a particular solution. Particularly when that solution was rewritten by the vendor between major versions.

Early on in my career, I got involved in my employer’s backup and recovery solution. At the time it was all about GFS backup schemes and DDS-2 drives and per-server protection schemes that mostly worked. It was viewed as an unnecessary expense and given to junior staff to look after. There was a feeling, at least with some of the Windows stuff, that if anything went wrong it would likely go wrong in a big way. I generally felt ill at ease when recovery requests would hit the service desk queue. As a result, my interest in being able to bring data back from human error, disaster, or other kinds of failure was piqued, and I went out and bought a copy of Unix Backup and Recovery. As a system administrator, it was a great book to have at hand. It offered a nice combination of understandable examples and practical application of backup and recovery principles throughout. I used to joke that it even had a happy ending, and everyone got their data back. As I moved through my career, I maintained an interest in data protection (it seemed, at one stage, to go hand in hand with storage for whatever reason), and I’ve often wondered what people do when they aren’t given the appropriate guidance on how best to do data protection to meet their needs.

All of this is an extremely long-winded way of saying that my friend W. Curtis Preston has released his fourth book, the snappily titled “Modern Data Protection”, and it makes for some excellent reading. If you listen to him talk about why he wrote another book on his podcast, you’ll appreciate that this thing was over 10 years in the making, had an extensive outline developed for it, and really took a lot of effort to get done. As Curtis points out, he goes out of his way not to name vendors or solutions in the book (he works for Druva). Instead, he spends time on the basics (why backup?), what you should back up, how to back up, and even when you should be backing things up.

This one doesn’t just cover off the traditional centralised server / tape library combo so common for many years in enterprise shops. It also goes into more modern on-premises solutions (I think the kids call them hyper-converged) and cloud-native solutions of all different shapes and sizes. He talks about how to protect a wide variety of workloads and solution architectures, drills into the importance of recovery testing, and even covers off the difference between backup and archive. Yes, they are different, and I’m not just saying that because I contributed that particular chapter. There’s talk of traditional data sources, deduplication technologies, and more fashionable stuff like Docker and Kubernetes.

The book comes in at a svelte 350ish pages, and you know that each chapter could almost have been a book on its own (or at least a very long whitepaper). That said, Preston does a great job of sticking to the topic at hand, and of breaking down potentially complex scenarios in a concise and easy-to-digest fashion. As I like to say to anyone who’ll listen, this stuff can be hard to get right, and you want to get it right, so it helps if the book you’re using gets it right too.

Should you read this book? Yes. Particularly if you have data or know someone who has data. You may be a seasoned industry veteran or new to the game. It doesn’t matter. You might be a consultant, an architect, or an end user. You might even work at a data protection vendor. There’s something in this for everyone. I was one of the technical editors on this book, fancy myself as knowing a bit about data protection, and I still learnt a lot of stuff. Even if you’re not directly in charge of data protection for your own data or your organisation’s data, this is an extremely useful guide that covers off the things you should be looking at with your existing solution or with a new solution. You can buy it directly from O’Reilly, or from big book sellers. It comes in electronic and physical versions and is well worth checking out. If you don’t believe me, ask Mellor, or Leib – they’ll tell you the same thing.

  • Publisher: O’Reilly
  • ISBN: 9781492094050

Finally, thanks to Preston for getting me involved in this project, for putting up with my English (AU) spelling, and for signing my copy of Unix Backup and Recovery.

Random Short Take #53

Welcome to Random Short Take #53. A few players have worn 53 in the NBA, including Mark Eaton, James Edwards, and Artis Gilmore. My favourite, though, was Chocolate Thunder, Darryl Dawkins. Let’s get random.

  • I love Preston’s series of articles covering the basics of backup and recovery, and this one on backup lifecycle is no exception.
  • Speaking of data protection, Druva has secured another round of funding. You can read Mellor’s thoughts here, and the press release is here.
  • More data protection press releases? I’ve got you covered. Zerto released one recently about cloud data protection. Turns out folks like cloud when it comes to data protection. But I don’t know that everyone has realised that there’s some work still to do in that space.
  • In other press release news, Cloud Propeller and Violin Systems have teamed up. Things seem to have changed a bit at Violin Systems since StorCentric’s acquisition, and I’m interested to see how things progress.
  • This article by Anthony Vanderwerdt on some of the peculiarities associated with mainframe deployments in the old days was the most entertaining thing I’ve read in a while.
  • Alastair has been pumping out a series of articles around AWS principles, and this one on understanding your single points of failure is spot on.
  • Get excited! VMware Cloud Director 10.2.2 is out now. Read more about that here.
  • A lot of people seem to think it’s no big thing to stretch Layer 2 networks. I don’t like it, and this article from Ethan Banks covers a good number of reasons why you should think again if you’re that way inclined.

Druva Update – Q3 2020

I caught up with my friend W. Curtis Preston from Druva a little while ago to talk about what the company has been up to. It seems like quite a bit, so I thought I’d share some notes here.


DXP and Company Update

Firstly, Druva’s first conference, DXP, is coming up shortly. There’s an interesting range of topics and speakers, and it looks to be jam-packed with useful info. You can find out more and register for that here. The company seems to be going from strength to strength, enjoying 50% year-on-year growth, and 70% for Phoenix (its data centre product) in particular.

If you’re into Gartner Peer Insights, Druva has taken out the top award in 3 categories: file analysis, DRaaS, and data centre backup. Preston also tells me Druva is handling around 5 million backups a day, for what it’s worth. Finally, if you’re into super fluffy customer satisfaction metrics, Druva is reporting an “industry-leading NPS score of 88” that has been third-party verified.


Product News

It’s Fun To Read The CCPA

If you’re unfamiliar, California has released its version of the GDPR, known as the California Consumer Privacy Act. Druva has created a template for data types that shouldn’t be stored in plain text and can flag them as they’re backed up. It can also do the same thing in email, and you can now do a federated search against both of these things. If anything turns up that shouldn’t be there, you can go and remove problematic files.
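
To make the idea a bit more concrete, here’s a minimal sketch of the general approach – scan files for patterns that look like the data types you’ve said shouldn’t live in plain text, and flag them for review. This is purely illustrative rather than anything Druva-specific, and the patterns, file extension, and paths are assumptions for the example.

```python
import re
from pathlib import Path

# Illustrative patterns only - a real compliance template would be far more thorough.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def flag_sensitive_files(root: str) -> list[tuple[str, str]]:
    """Scan plain-text files under root and return (path, data_type) hits."""
    hits = []
    for path in Path(root).rglob("*.txt"):  # assumption: only looking at .txt files here
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file - skip it
        for data_type, pattern in SENSITIVE_PATTERNS.items():
            if pattern.search(text):
                hits.append((str(path), data_type))
    return hits

if __name__ == "__main__":
    for path, data_type in flag_sensitive_files("/data/to/backup"):  # placeholder path
        print(f"Possible {data_type} in {path} - review before this lands in a backup")
```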

ServiceNow Automation

Druva now has support for automated SNOW ticket creation. It’s based on some advanced logic, too. For example, if a backup fails 3 times, a ticket will be created and can be routed to the people who should be caring about such things.
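
For a rough sense of what that kind of logic looks like (a hedged sketch only – not Druva’s actual integration), here’s the consecutive-failure-counting idea paired with an incident raised via ServiceNow’s standard Table API. The instance URL, credentials, and field values are all placeholders.

```python
import requests
from collections import defaultdict

SNOW_INSTANCE = "https://example.service-now.com"  # placeholder instance
FAILURE_THRESHOLD = 3

failure_counts = defaultdict(int)  # backup job name -> consecutive failures

def create_incident(short_description: str, assignment_group: str) -> None:
    """Raise a ticket via the ServiceNow Table API (POST /api/now/table/incident)."""
    response = requests.post(
        f"{SNOW_INSTANCE}/api/now/table/incident",
        auth=("integration_user", "password"),  # placeholder credentials
        json={
            "short_description": short_description,
            "assignment_group": assignment_group,
            "urgency": "2",
        },
        timeout=30,
    )
    response.raise_for_status()

def record_backup_result(job: str, succeeded: bool) -> None:
    """Track consecutive failures and open a ticket once the threshold is hit."""
    if succeeded:
        failure_counts[job] = 0  # a good backup resets the count
        return
    failure_counts[job] += 1
    if failure_counts[job] == FAILURE_THRESHOLD:  # only fire once per streak
        create_incident(
            short_description=f"Backup job '{job}' has failed {FAILURE_THRESHOLD} times in a row",
            assignment_group="Backup Operations",  # route to the folks who should be caring
        )
```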

More APIs

There’s been a lot of work done to deliver more APIs, as well as a more robust RBAC implementation.

DRaaS

DRaaS is currently only available for VMware, VMC, and AWS-based workloads. Preston tells me that users are getting an RTO of 15-20 minutes, and an RPO of 1 hour. Druva added failback support a little while ago (one VM at a time). That feature has now been enhanced, and you can fail back as many workloads as you want. You can also add a prefix or suffix to a VM name, and Druva has added a failover prerequisite check as well.


Other Notes

In other news, Druva is now certified on VMC on Dell. It’s also added support for Microsoft Teams and for Slack. Both are useful if you’ve stopped storing your critical data in email and started storing it in collaboration apps instead.

Storage Insights and Recommendations

There’s also a storage insights feature that is particularly good for unstructured data. Say, for example, that 30% of your backups are media files; you might not want to back them up (unless you’re in the media streaming business, I guess). You can delete bad files from backups, and automatically create an exclusion for those file types.
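
A hedged sketch of the underlying idea – working out what share of a data set is media and suggesting an exclusion by file extension – might look something like this. The extension list, path, and 30% threshold are assumptions for illustration, not anything Druva-specific.

```python
from collections import defaultdict
from pathlib import Path

MEDIA_EXTENSIONS = {".mp4", ".mov", ".mkv", ".avi", ".mp3"}  # assumption: what counts as "media"

def media_share(root: str) -> tuple[float, set[str]]:
    """Return the fraction of bytes under root held in media files, plus the media extensions seen."""
    total_bytes = 0
    bytes_by_ext = defaultdict(int)
    for path in Path(root).rglob("*"):
        if path.is_file():
            size = path.stat().st_size
            total_bytes += size
            bytes_by_ext[path.suffix.lower()] += size
    media_bytes = sum(size for ext, size in bytes_by_ext.items() if ext in MEDIA_EXTENSIONS)
    share = media_bytes / total_bytes if total_bytes else 0.0
    return share, {ext for ext in bytes_by_ext if ext in MEDIA_EXTENSIONS}

share, extensions = media_share("/data/to/backup")  # placeholder path
if share > 0.3:  # i.e. media makes up more than 30% of the backup set
    print(f"Media is {share:.0%} of this data set - consider excluding {sorted(extensions)}")
```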

Support for K8s

Support for everyone’s favourite container orchestration system has been announced, though not yet released. Read about that here. You’ll be able to do a full backup of an entire K8s environment (AWS only in v1). This includes Docker containers, mounted volumes, and DBs referenced in those containers.

NAS Backup

Druva has enhanced its NAS backup in two ways, the first of which is performance. Preston tells me the current product is at least 10X faster than it was a year ago. The second is cost: for customers already using a native recovery mechanism like snapshots, Druva has added the option to back up directly to Glacier, which cuts your cost in half.

Oracle Support

For Oracle, Druva has what Preston describes as “two solid options”. Right now there’s an OVA that provides a ready-to-go, appliance-like experience and uses the image copy format (supporting block-level incrementals and incremental merge). The other option will be announced next week at DXP.


Thoughts and Further Reading

Some of these features seem like incremental improvements, but when you put it all together, it makes for some impressive reading. Druva has done a really good job, in my opinion, of sticking with the “built in the cloud, for the cloud” mantra that dominates much of its product design. The big news is the support for K8s, but things like multi-VM failback with the DRaaS solution are nothing to sneeze at. There’s more news coming shortly, and I look forward to covering that. In the meantime, if you have the time, be sure to check out DXP – I think it will be quite an informative event.


Random Short Take #40

Welcome to Random Short Take #40. Quite a few players have worn 40 in the NBA, including the flat-top king Frank Brickowski. But my favourite player to wear number 40 was the Reign Man – Shawn Kemp. So let’s get random.

  • Dell EMC PowerProtect Data Manager 19.5 was released in early July and Preston covered it pretty comprehensively here.
  • Speaking of data protection software releases and enhancements, we’ve barely recovered from the excitement of Veeam v10 being released and Anthony is already talking about v11. More on that here.
  • Speaking of Veeam, Rhys posted a very detailed article on setting up a Veeam backup repository on NFS using a Pure Storage FlashBlade environment.
  • Sticking with the data protection theme, I penned a piece over at Gestalt IT for Druva talking about OneDrive protection and why it’s important.
  • OpenDrives has some new gear available – you can read more about that here.
  • The nice folks at Spectro Cloud recently announced that its first product is generally available. You can read the press release here.
  • William Lam put out a great article on passing through the integrated GPU on Apple Mac minis with ESXi 7.
  • Time passes on, and Christian recently celebrated 10 years on his blog, which I think is a worthy achievement.

Happy Friday!

Random Short Take #33

Welcome to Random Short Take #33. Some terrific players have worn 33 in the NBA, including Keith Closs and Stephon Marbury. This one, though, goes out to the “hick from French Lick” Larry Joe Bird. You might see the frequency of these posts ramp up a bit over the next little while. Because everything feels a little random at the moment.

  • I recently wrote about what Scale Computing has been up to with Leostream. It’s also done a bit with Acronis in the past, and it recently announced it’s now offering Acronis Cloud Storage. You can read more on that here.
  • The good folks at Druva are offering 6 months of free subscription for Office 365 and Endpoint protection (up to 300 seats) to help businesses adjust to these modern ways of working. You can find out more about that here.
  • Speaking of cloud backup, Backblaze recently surpassed the exabyte mark in terms of stored customer data.
  • I’ve been wanting to write about Panzura for a while, and I’ve been terribly slack. It’s enjoying some amount of momentum at the moment though, and is reporting revenue growth that looks the goods. Speaking of Panzura, if you haven’t heard of its Vizion.AI offshoot – it’s well worth checking out.
  • Zerto recently announced Zerto 8. Lots of cool features have been made available, including support for VMware on Google Cloud, and improved VMware integration.
  • There’s a metric shedload of “how best to work from home” posts doing the rounds at the moment. I found this one from Russ White to be both comprehensive and readable. That’s not as frequent a combination as you might expect.
  • World Backup Day was yesterday. I’ll be writing more on that this week, but in the meantime this article from Anthony Spiteri on data displacement was pretty interesting.
  • Speaking of backup and Veeam things, this article on installing Veeam PN from Andre Atkinson was very useful.

And that’s it for now. Stay safe folks.


Random Short Take #30

Welcome to Random Short Take #30. You’d think 30 would be an easy choice, given how much I like Wardell Curry II, but for this one I’m giving a shout out to Rasheed Wallace instead. I’m a big fan of ‘Sheed. I hope you all enjoy these little trips down NBA memory lane. Here we go.

  • Veeam 10’s release is imminent. Anthony has been doing a bang up job covering some of the enhancements in the product. This article was particularly interesting because I work in a company selling Veeam and using vCloud Director.
  • Sticking with data protection, Curtis wrote an insightful article on backups and frequency.
  • If you’re in Europe or parts of the US (or can get there easily), like writing about technology, and you’re into cars and stuff, this offer from Cohesity could be right up your alley.
  • I was lucky enough to have a chat with Sheng Liang from Rancher Labs a few weeks ago about how it’s going in the market. I’m relatively Kubernetes illiterate, but it sounds like there’s a bit going on.
  • For something completely different, this article from Christian on Raspberry Pi, volumio and HiFiBerry was great. Thanks for the tip!
  • Spinning disk may be as dead as tape, if these numbers are anything to go by.
  • This was a great article from Matt Crape on home lab planning.
  • Speaking of home labs, Shanks posted an interesting article on what he has running. The custom-built rack is inspired.

Independent Research Firm Cites Druva As A Strong Performer In Latest Data Resiliency Solutions Wave

Disclaimer: This is a sponsored post and you’ll probably see the content elsewhere on the Internet. Druva provided no editorial input and the words and opinions in this post are my own.

Druva was among the select companies that Forrester invited to participate in their latest Data Resiliency Solutions Wave, for Q3 2019. In its debut for this report, Druva was cited as a Strong Performer in Data Resilience. I recently had an opportunity to speak to W. Curtis Preston, Druva’s Chief Technologist, about the report, and thought I’d share some of my thoughts here.


Let’s Get SaaS-y

Druva was the only company listed in the Forrester Wave™ Data Resiliency Solutions whose products are only offered as a service. One of the great things about Software-as-a-Service (SaaS) is that the vendor takes care of everything for you. Other models of solution delivery require hardware, software (or both) to be installed on-premises or close to the workload you want to protect. The beauty of a SaaS delivery model is that Druva can provide you with a data protection solution that they manage from end to end. If you’re hoping that there’ll be some new feature delivered as part of the solution, you don’t have to worry about planning the upgrade to the latest version; Druva takes care of that for you. There’s no need for you to submit change management documentation or negotiate infrastructure outages with key stakeholders. And if something goes wrong with the platform upgrade, the vendor will take care of it. All you need to worry about is ensuring that your network access is maintained and you’re paying the bills. If your capacity is growing out of line with your expectations, it’s a problem for Druva, not you. And, as I alluded to earlier, you get access to features in a timely fashion. Druva can push those out when they’ve tested them, and everyone gets access to them without having to wait. Their time to market is great, and there aren’t a lot of really long release cycles involved.


Management

The report also called out how easy it was to manage Druva, as Forrester gave them the highest possible score, 5 (out of 5), in this category. All of their services are available via a single management interface. I don’t recall at what point in my career I started to pay attention to vendors talking to me about managing everything from a single pane of glass. I think that the nature of enterprise infrastructure operations dictates that we look for unified management solutions wherever we can. Enterprise infrastructure is invariably complicated, and we want simplicity wherever we can get it. Having everything on one screen doesn’t always mean that things will be simple, but Druva has focused on ensuring that the management experience delivers on the promise of simplified operations. The simplified operations are also comprehensive, and there’s support for cloud-native / AWS resources (with CloudRanger), data centre workloads (with Druva Phoenix) and SaaS workloads (with Druva inSync) via a single pane of glass. Although not included in the report, Druva also supports backing up endpoints, such as laptops and mobile devices.


Deduplication Is No Joke

One of Forrester’s criteria was whether or not a product offered deduplication. Deduplication has radically transformed the data protection storage market. Prior to the widespread adoption of deduplication and compression technologies in data protection storage, tape provided the best value in terms of price and capacity. This all changed when enterprises were able to store many copies of their data in the space required by one copy. Druva uses deduplication effectively in its solution, and has a patent on its implementation of the technology. They also leverage global deduplication in their solution, providing enterprises with an efficient use of protection data storage. Note that this capability needs to be in a single AWS region, as you wouldn’t want it running across regions. The key to Druva’s success with deduplication has also been its use of DynamoDB to support deduplication operations at scale.
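
If you haven’t come across the technique before, the general shape of deduplication is straightforward enough: split the data into chunks, fingerprint each chunk, and consult a shared index before storing anything, so a given chunk is only ever stored once. The sketch below is just that general shape, with a plain dictionary standing in for the index (Druva’s patented implementation, and its use of DynamoDB, is obviously rather more sophisticated than this).

```python
import hashlib

chunk_index: dict[str, bytes] = {}  # fingerprint -> stored chunk (stand-in for a DynamoDB-style index)

def store(data: bytes, chunk_size: int = 4096) -> list[str]:
    """Split data into fixed-size chunks, storing only chunks not already in the index.

    Returns the ordered list of fingerprints needed to reassemble the data.
    Real systems generally use variable-length (content-defined) chunking instead.
    """
    recipe = []
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint not in chunk_index:
            chunk_index[fingerprint] = chunk  # new chunk: store it
        recipe.append(fingerprint)            # either way, just reference it
    return recipe

def restore(recipe: list[str]) -> bytes:
    """Reassemble the original data from its chunk fingerprints."""
    return b"".join(chunk_index[fp] for fp in recipe)

recipe_one = store(b"hello data protection " * 1000)
recipe_two = store(b"hello data protection " * 1000)  # identical data: no new chunks stored
assert restore(recipe_two) == b"hello data protection " * 1000
print(f"{len(recipe_one) + len(recipe_two)} chunk references, {len(chunk_index)} unique chunks stored")
```

The second copy costs almost nothing to store, which is the whole point – and why global deduplication (a single index shared across backup sets within a region) makes for such an efficient use of protection storage.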


Your Security Is Their Concern

Security was a key criterion in Forrester’s evaluation, and Druva received another 5 – the highest score possible – in that category as well. One of the big concerns for enterprises is the security of protection data being stored in cloud platforms. There’s no point spending a lot of money trying to protect your critical information assets if a copy of those same assets has been left exposed on the Internet for all to see. With Druva’s solution, everything stored in S3 is sharded and stored as separate objects. They’re not just taking big chunks of your protection data and storing them in buckets for everyone to see. Even if someone were able to access the storage, and put all of the pieces back together, it would be useless because all of these shards are also encrypted.  In addition, the metadata needed to re-assemble the shards is stored separately in DynamoDB and is also encrypted.
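
The general pattern being described – shard the data, encrypt every shard, and keep the reassembly metadata somewhere else entirely – looks roughly like the toy sketch below. It’s an illustration of the concept rather than Druva’s implementation, and it leans on the cryptography library’s Fernet recipe purely for brevity; the stores and key handling are stand-ins.

```python
import uuid
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice keys live in a KMS, not next to the data
cipher = Fernet(key)

object_store = {}    # stand-in for S3: shard id -> encrypted shard
metadata_store = {}  # stand-in for DynamoDB: backup id -> ordered shard ids (also encrypted in real life)

def protect(backup_id: str, data: bytes, shard_size: int = 1024) -> None:
    """Split data into shards, encrypt each shard, and store the reassembly metadata separately."""
    shard_ids = []
    for offset in range(0, len(data), shard_size):
        shard_id = str(uuid.uuid4())  # random ids: nothing about the object hints at its contents
        object_store[shard_id] = cipher.encrypt(data[offset:offset + shard_size])
        shard_ids.append(shard_id)
    metadata_store[backup_id] = shard_ids

def recover(backup_id: str) -> bytes:
    """Fetch the metadata, then decrypt and reassemble the shards in order."""
    return b"".join(cipher.decrypt(object_store[s]) for s in metadata_store[backup_id])

protect("demo-backup", b"important corporate data " * 200)
assert recover("demo-backup") == b"important corporate data " * 200
```

The point of the pattern is that neither store is much use on its own: the objects are encrypted fragments with meaningless names, and the map that puts them back together lives (encrypted) somewhere else.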


Thoughts

I believe being named a Strong Performer in the Forrester Wave™ Data Resiliency Solutions validates what Druva’s been telling me when it comes to their ability to protect workloads in the data centre, the cloud, and in SaaS environments. Their strength seems to lie in their ability to leverage native cloud tools effectively to provide their customers with a solution that is simple to operate and consume. If you have petabytes of seismic data you need to protect, Druva (and the laws of physics) may not be a great fit for you. But if you have less esoteric requirements and a desire to reduce your on-premises footprint and protect workloads across a number of environments, then Druva is worthy of further consideration. If you wanted to take a look at the report yourself, you can do so here (registration required).

VMware – VMworld 2019 – HBI3487BUS – Rethink Data Protection & Management for VMware

Disclaimer: I recently attended VMworld 2019 – US.  My flights and accommodation were paid for by Digital Sense, and VMware provided me with a free pass to the conference and various bits of swag. There is no requirement for me to blog about any of the content presented and I am not compensated by VMware for my time at the event.  Some materials presented were discussed under NDA and don’t form part of my blog posts, but could influence future discussions.

Here are my rough notes from “HBI3487BUS – Rethink Data Protection & Management for VMware”, presented by Curt Hayes (Cloud and Data Center Engineer, Regeneron) and Mike Palmer (Chief Product Officer, Druva). You can grab a PDF copy of my notes from here.


The World is Changing

Cloud Storage Costs Continue To Decline

  • 67 price decreases in AWS storage with CAGR of (60%) – AWS
  • 68% of countries (110+) have data protection and privacy legislation – United Nations
  • 40% of IT will be “Versatilists” by 2021 – Gartner
  • 54% of CIOs believe streamlining storage is the best opportunity for cost optimisation – ESG
  • 80% of enterprises will migrate away from and close their on-premises DCs by 2025 – Gartner
  • 256% increase in demand for data scientists in the last 5 years – Indeed

Druva’s 4 Pillars of Value

  • Costs Decrease – storage designed to optimise performance and cost reduces per TB costs, leaving more money for innovation
  • Eliminate Effort – Capacity management, patching, upgrades, certification, training, professional services gone.
  • Retire HW/SW silos – Druva builds in data services: DR, Archive, eDiscovery and more
  • Put Data to work – eliminating silos allows global tagging, searchability, access, and governance.

The best work you can do is when you don’t have to do it.

Curt (customer) says “[d]ata is our greatest asset”.

Regeneron’s Drivers to Move to Cloud

| Challenges | Opportunities |
|---|---|
| Ireland backup platform is nearing end-of-life | Regeneron has a perfect opportunity to consider cloud as an alternative solution for backup and DR |
| 3 distinct tools for managing backups | Harmonize backup tool set |
| Expansion and upgrades are costly and time-consuming | Minimize operational overhead |
| Need to improve business continuity posture | Instantly enable offsite backups & disaster recovery requirement |
| Scientists have a tough time accessing the data they need | Advanced search capabilities to offer greater value added data services |

Regeneron’s TCO Analysis

Druva Enables Intelligent Tiering in the Cloud

Traditional, expensive, and inflexible on-premises storage

  • Limited and expensive to scale and store
  • Complex administration
  • Lack of visibility and data silos
  • Tradeoff between cost and visibility for Long Term Retention requirements

Modern, scalable and cost-effective multi-tier storage

  • Scalable, efficient cloud story
  • Intelligent progressive tiering of data for maximum cost efficiency with minimum effort (see the sketch after this list)
  • Support cloud bursting, hot/cold data
  • Cost-efficient storage on the most innovative AWS tiers
  • Enable reporting / audit on historical data
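
For a feel for the sort of AWS machinery that sits underneath this kind of tiering, here’s a hedged boto3 sketch of a generic S3 lifecycle rule that transitions objects to colder storage classes as they age. To be clear, this is not Druva’s tiering engine (the pitch is precisely that Druva manages this for you), and the bucket name, prefix, and day thresholds are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Example rule: data moves to Infrequent Access after 30 days, then to Glacier after 90 days.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",  # placeholder bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-ageing-backup-data",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},  # placeholder prefix
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```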

Regeneron’s Adoption of Cloud Journey

  • DC modernisation / consolidation
  • Workload migration to the cloud – Amazon EC2
  • Simplify and streamline backup / recovery and DR
  • Longer-term retention for advanced data mining
  • Protecting cloud applications – Sharepoint, O365, etc
  • Future – do more with data


How Did Druva help?

Basics

  • Cheaper
  • Simpler
  • Faster
  • Unified protection

Future Proof

  • Scalable
  • Ease of integration
  • No training
  • Business continuity

Data Value

  • Search
  • Data Mining
  • Analytics

Looking Beyond Data Protection …


Thoughts and Further Reading

I think the folks at Druva have been doing some cool stuff lately, and chances are quite high that I’ll be writing more about them in the future. There’s a good story with their cloud-native architecture, and it was nice to hear how a customer leveraged them to do things better than they had been doing previously.

Two things really stood out to me during this session. The first was the statement “[t]he best work you can do is when you don’t have to do it”. I’ve heard it said before that the best storage operation is one you don’t have to do, and I think we sometimes lose sight of how this approach can help us get stuff done in a more efficient fashion, ultimately letting us focus our constrained resources elsewhere.

The second was the idea of looking beyond data protection. The “secondary storage” market is riding something of a gravy train at the moment, with big investment from the VC funds in current and next-generation data protection (management?) solutions. There’s been some debate over how effective these solutions are at actually deriving value from that secondary data, but you’d have to think they’re in a prime position to succeed. I’m curious to see just what shape that value takes when we all start to agree on the basic premise.

Sponsored sessions aren’t everyone’s cup of tea, but I like hearing from customers about how it’s worked out well for them. And the cool thing about VMworld is that there’s a broader ecosystem supporting VMware across a number of different technology stacks. This makes for a diverse bunch of sessions, and I think it makes for an extremely interesting vendor conference. If you want to learn a bit more about what Druva have been up to, check out my post from Tech Field Day 19 here, and you can also find a useful overview of the product here. Good session. 3.5 stars.

Druva – In The Cloud, Of The Cloud, Protecting The Cloud

Disclaimer: I recently attended Tech Field Day 19.  My flights, accommodation and other expenses were paid for by Tech Field Day. There is no requirement for me to blog about any of the content presented and I am not compensated in any way for my time at the event.  Some materials presented were discussed under NDA and don’t form part of my blog posts, but could influence future discussions.


Druva recently presented at Tech Field Day 19. You can see videos of their presentation here, and download my rough notes from here. Here’s a photo of Jaspreet Singh kicking things off.


Let’s Talk About You

What do people want in a backup system?

I’ll tell you what we want. What we really, really want. Fewer Spice Girls earworms. And a data protection service. It seems simplistic, but it’s true. A lot of organisations are tired of being IT organisations and they just want to consume services from companies that are IT organisations. That’s not a copout. They want to make money doing the things they’re good at. It’s one of the big reasons public cloud has proven so popular. Druva offers a service, and is positioning itself as being to backups what Salesforce is to CRM. The key selling point is that it can do data protection simpler, faster, cheaper, and safer. And you get the two big benefits of SaaS:

  • There’s nothing to maintain; and
  • New features are made available immediately.

Am I The Ideal Druva Customer?

Are you a good fit though? If you’re running modern / virtualised workloads, Druva want to talk to you. To wit, if you find yourself in one of these categories you should be okay:

  • “Versatilist” Users;
  • Cloud focus or initiative;
  • Hybrid cloud environment;
  • Distributed workloads, including laptops;
  • SaaS adopter (SFDC, O365, G Suite); and
  • Moving away from legacy Unix and apps.

The more distributed your company is, the better Druva looks.

Who’s not a good fit for Druva though? Enterprises that:

  • Must have an on-premises backup system;
  • Have no desire to leverage cloud; and
  • Want a backup system for legacy OS / apps.

Poor enterprises, missing out again.


Challenges Solved by Druva

Curtis knows a bit about data protection, and he’s been around for a while now, so he remembers when not everything was peaches and cream in the data protection world. He talked about the various trends in data protection over the years and used the table below as an anchor point. The gist of it is that a solution such as the one Druva has doesn’t have quite as many challenges as the more “traditional” data protection systems we’ve been using for the last 20-plus years (yes, and longer still, I know).

| ! | $ | ? | Challenges |
|---|---|---|---|
|   | $ | ? | Design, maintain, refresh physical backup server & storage |
| ! | $ | ? | Patch & upgrade backup server OS |
| ! | $ | ? | Patch & upgrade backup server software |
| ! | $ | ? | Manage multiple vendors (server, backup sw, tape, disk) |
| ! |   |   | Tape can be lost or stolen ??? |
|   | $ | ? | Tape requires constant performance tweaking |
|   | $ |   | Tape requires offsite vaulting vendor ??? |
|   | $ |   | Hardware typically bought in advance |
|   | $ | ? | Over-provision compute / storage (growth and variable load) |
|   | $ | ? | Not easy to scale |
|   | $ |   | Unexpected / variable costs |
|   | $ |   | Massive capital expenditures |
| ! |   |   | First large backup |
| ! |   |   | Any large restore |
Every vendor can look good when you take tape out of consideration. It has an awful lot of advantages in terms of capacity and economy, but the execution can often be a real pain. Druva also compete pretty well with the “hyper-converged” backup vendors, although I think those vendors get a bad rap for a focus on hardware that isn’t necessarily as much of a problem as some people think. The real killer feature for Druva is the cloud-native architecture, and the SaaS story in general.


Thoughts and Further Reading

It’s no secret that I’ve been a fan of Curtis for years, so when he moved to Druva I was intrigued and wanted to hear more. But Druva isn’t just Curtis. There are a whole bunch of people at the company who know cloud, and data protection, and have managed to put them together into a solution that makes a lot of sense. And I like what I’ve seen thus far. There’s a really good story here, particularly if you’re all in on cloud, and running relatively modern applications. The heritage in endpoint protection has helped them overcome some obstacles that other vendors haven’t had to deal with yet. They’re also willing to admit that not everything is perfect, particularly when it comes to getting that first large backup done. They also believe that “[w]ithin the limits of physics they can scale to meet the needs of most customers”. You’re not going to be able to achieve RPO 0 and RTO 0 with Druva. But that’s what things like replication are for. What they do offer, however, is an RTO of minutes, not hours. A few other things they don’t do include VM live mount and native support for Azure and GCP.

What Druva do do well is understand that customers have requirements that can be satisfied through the use of protection data. They also understand the real operational value (in terms of resiliency and reduced spend) that can be had with SaaS-based offerings. We all talk a tough game when it comes to buying what we think is the absolute best solution to protect our data, and rightly so. A business’s data is (hopefully) one of its most critical assets, and we should do anything we can to protect it. Druva are as dedicated as the next company to that philosophy, but they’ve also realised that the average business is under constant pressure to reduce costs wherever possible. Now you don’t just get to access the benefits of running your applications in the cloud – you can also get the benefit of protecting them in the cloud too.

Tape was hard to do well, and many of us have horror stories about things going wrong. Cloud can be hard to do well too, and there are plenty of stories of cloud going horribly wrong. Druva isn’t magic, but it does help take away a lot of the complexity that’s frequently attached to protecting cloud-native workloads.

Random Short Take #18

Here are some links to some random news items and other content that I recently found interesting. You might find them interesting too. Episode 18 – buckle up kids! It’s all happening.

  • Cohesity added support for Active Directory protection with version 6.3 of the DataPlatform. Matt covered it pretty comprehensively here.
  • Speaking of Cohesity, Alastair wrote this article on getting started with the Cohesity PowerShell Module.
  • In keeping with the data protection theme (hey, it’s what I’m into), here’s a great article from W. Curtis Preston on SaaS data protection, and what you need to consider to not become another cautionary tale on the Internet. Curtis has written a lot about data protection over the years, and you could do a lot worse than reading what he has to say. And that’s not just because he signed a book for me.
  • Did you ever stop and think just how insecure some of the things that you put your money into are? It’s a little scary. Shell are doing some stuff with Cybera to improve things. Read more about that here.
  • I used to work with Vincent, and he’s a super smart guy. I’ve been at him for years to start blogging, and he’s started to put out some articles. He’s very good at taking complex topics and distilling them down to something that’s easy to understand. Here’s his summary of VMware vRealize Automation configuration.
  • Tom’s take on some recent CloudFlare outages makes for good reading.
  • Google Cloud has announced it’s acquiring Elastifile. That part of the business doesn’t seem to be as brutal as the broader Alphabet group when it comes to acquiring and discarding companies, and I’m hoping that the good folks at Elastifile are looked after. You can read more on that here.
  • A lot of people are getting upset with terms like “disaggregated HCI”. Chris Mellor does a bang up job explaining the differences between the various architectures here. It’s my belief that there’s a place for all of this, and assuming that one architecture will suit every situation is a little naive. But what do I know?