I miss Tru64, and Solaris for that matter. I don’t miss HP-UX. And I definitely won’t miss AIX. Read about the death of Unix over at El Reg – Unix is dead. Long live Unix!
The I3.metal is going away very soon. Remember, this is from a sales perspective: VMware is still supporting the I3.metal in the wild, and you’ll still have access to deploy on-demand if required (up to a point).
Welcome to Random Short Take #81. Last one for the year, because who really wants to read this stuff over the holiday season? Let’s get random.
Curtis did a podcast on archive and retrieve as part of his “Backup to Basics” series. It’s something I feel pretty strongly about, so much so that I wrote a chapter in his book about it. You can listen to it here.
I love Backblaze. Not in the sense that I want to marry the company, but I really like what the folks there do. And I really like the transparency with which they operate. This article giving a behind-the-scenes look at its US East Data Center is a fantastic example of that.
And, to “celebrate” 81 Random Short Takes (remember when I used to list my favourite NBA players and the numbers they wore?), let’s take a stroll down memory lane with two of my all-time, top 5, favourite NBA players – Kobe Bryant and Jalen Rose. The background for this video is explained by Jalen here.
Take care of yourselves and each other, and I’ll hopefully see you all on the line or in person next year.
Speaking of streaming, this article covered some of the best mechanisms to purchase digital content with. I still prefer buying discs, but I’m a bit weird too.
Finally, I’ve been a fan of John Birmingham’s writing since I was misspending my youth at university in the 90s, so it makes sense that I’d enjoy his food reviews too (mainly because it’s not just about food). It should come as no surprise that I, too, love pork rillette.
Welcome to Random Short Take #76. Summer’s almost here. Let’s get random.
The nice folks at StorPool have announced StorPool Storage v20. I was lucky enough to catch up with Boyan and the team recently, and they told me about their work on supporting NVMe/TCP, StorPool on Amazon AWS, and NFS File Storage. It’s great stuff and worth checking out.
Long term retention – all the kids are doing it, but there are some things you need to think about. Preston has posted a great article on it here.
USB-C? Thunderbolt? Whatever it’s called, getting stuff to connect properly to your shiny computers with very few useful ports built-in can be a problem. This article had me giggling and crying at the same time.
Backblaze has come through with the goods again, with this article titled “How to Talk to Your Family About Backups”. I talk to my family all the time about backups (and recovery), and it drives them nuts.
I loved this article from Preston on death in the digital age. It’s a thorough exploration not only of what happens to your data when you shuffle off, but also some of the challenges associated with keeping your digital footprint around after you go.
Finally, if you’re looking at all-flash as an option for your backup infrastructure, it’s worth checking out Chris’s article here. The performance benefits (particularly with recovery) are hard to argue with, but at scale the economics may still be problematic.
Retrospect recently announced an update to its Backup (18.5) product. I had the opportunity to speak to JG Heithcock (GM, Retrospect) about the announcement and thought I’d briefly share some thoughts here.
You can now detect anomalies in systems based on customisable filters and thresholds tailored to individual environments. It still relies on someone doing something about it, but it’s definitely a positive step forward. You can also configure the anomaly detection to work with Retrospect’s scripting / orchestration engine, kicking off various processes when something has gone wrong.
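To make the idea concrete, here’s a minimal sketch of how filter-and-threshold anomaly detection with an orchestration hook might hang together. The metric names, thresholds, and hook script below are all hypothetical illustrations, not Retrospect’s actual API.

```python
# Hypothetical sketch of threshold-based backup anomaly detection.
# The filters, thresholds, and hook command are illustrative assumptions.
import subprocess

# Per-environment filters: metric name -> threshold (tailor to taste)
FILTERS = {
    "files_changed_pct": 40.0,   # % of files changed since the last backup
    "new_encrypted_files": 500,  # count of newly unreadable/encrypted files
}

def detect_anomalies(job_stats):
    """Return the names of metrics that breached their thresholds."""
    return [name for name, limit in FILTERS.items()
            if job_stats.get(name, 0) > limit]

def on_backup_complete(job_stats):
    anomalies = detect_anomalies(job_stats)
    if anomalies:
        # Kick off an orchestration script when something looks wrong,
        # e.g. pause replication or page an operator.
        subprocess.run(["./alert-operator.sh", ",".join(anomalies)], check=False)

# Example: a job where 62% of files changed overnight trips the filter.
print(detect_anomalies({"files_changed_pct": 62.0, "new_encrypted_files": 3}))
```

The key point is the last step: detection alone still relies on someone (or something) doing something about it, hence wiring the result into a script.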
Retrospect Management Console Integration
This capability has been integrated with the Management Console, and you can now view anomalies across a business or partner’s entire client base in a single pane of glass.
[image courtesy of Retrospect]
Improved Microsoft Azure Blob Integration
You can now set individual immutable retention policies for different backup sets within the same Azure Storage Container. This capability was already available with Retrospect’s AWS S3 integration.
Streamlined Immutable Backup User Experience
Automatically create cloud buckets with immutable backups supported by default. There’s also support for StorCentric’s Unity S3 capability out of the box.
Is tape dead? Maybe. But there are still people using it, and this release includes support for LTO-9, with capacities up to 18TB (45TB compressed).
Retrospect Backup 18.5 is a free upgrade to Retrospect Backup 18. While it doesn’t set the world on fire in terms of a broad range of features, there’s some stuff in here that should get existing users excited, and give those considering the product a little more to mull over. Retrospect has been chipping away slowly but surely over the years, and I think it provides the traditional SME market with something that’s been difficult to get until recently: a solid data protection solution, with modern capabilities such as ransomware detection and object storage support, for a price that won’t send customers in that segment packing. I think that’s pretty good, and I look forward to seeing how things progress over the next 6 – 12 months.
Speaking of Pure Storage, my friend Jon wrote about his experience with ActiveCluster in the field recently. You can find that here. I always find these articles to be invaluable, if only because they demonstrate what’s happening out there in the real world.
Want some press releases? Here’s one from Datadobi announcing it has released new Starter Packs for DobiMigrate ranging from 1PB up to 7PB.
Data protection isn’t just something you do at the office – it’s a problem for home too. I’m always interested to hear how other people tackle the problem. This article from Jeff Geerling (and the associated documentation on Github) was great.
John Nicholson is a smart guy, so I think you should check out his articles on benchmarking (and what folks are getting wrong). At the moment this is a 2-part series, but I suspect that could be expanded. You can find Part 1 here and Part 2 here. He makes a great point that benchmarking can be valuable, but benchmarking like it’s 1999 may not be the best thing to do (I’m paraphrasing).
Speaking of smart people, Tom Andry put together a great article recently on dispelling myths around subwoofers. If you or a loved one are getting worked up about subwoofers, check out this article.
I had people ask me if I was doing a predictions post this year. I’m not crazy enough to do that, but Mellor is. You can read his article here.
In some personal news (and it’s not LinkedIn official yet) I recently quit my job and will be taking up a new role in the new year. I’m not shutting the blog down, but you might see a bit of a change in the content. I can’t see myself stopping these articles, but it’s likely there’ll be less of the data protection howto articles being published. But we’ll see. In any case, wherever you are, stay safe, happy holidays, and see you on the line next year.
Welcome to Random Short Take #64. It’s the start of the last month of the year. We’re almost there.
Want to read an article that’s both funny and informative? Look no further than this beginner’s guide to subnetting. I did Elizabethan literature at uni, so it was good to get a reminder on Shakespeare’s involvement in IP addressing.
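If you want to check your subnetting homework without counting bits by hand, Python’s standard library can do it for you. The example network below is arbitrary, just to show the mechanics.

```python
# A quick subnetting refresher using Python's stdlib ipaddress module.
import ipaddress

net = ipaddress.ip_network("192.168.10.0/26")
print(net.netmask)            # 255.255.255.192
print(net.num_addresses)      # 64 addresses in the block
print(net.num_addresses - 2)  # 62 usable hosts (minus network and broadcast)
print(net.broadcast_address)  # 192.168.10.63

# Does a host fall inside this subnet?
print(ipaddress.ip_address("192.168.10.45") in net)  # True
```

No Elizabethan literature degree required.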
On a more serious note, data hoarding is a problem (I know this because I’ve been guilty of it), and this article from Preston outlines some of the reasons why it can be a bad thing for business.
Still on data protection, Howard Oakley looks at checking the integrity of Time Machine backups in this post. I’ve probably mentioned this a few times previously, but if you find macOS behaviour baffling at times, Howard likely has an article that can explain why you’re seeing what you’re seeing.
Zerto recently announced Zerto In-Cloud for AWS – you read more about that here. Zerto is really starting to put together a comprehensive suite of DR solutions. Worth checking out.
Finally, this article over at Blocks and Files on what constitutes a startup made for some interesting reading. Some companies truly are Peter Pans at this point, whilst others are holding on to the idea that they’re still in startup mode.
Talk to people in the tech sector today, and you’ll possibly hear a fair bit about how ransomware is a real problem for them, and a scary one at that. Almost all of the data protection solution vendors are talking about how they can help customers quickly recover from ransomware events, and some are particularly excited about how they can let you know you’ve been hit in a timely fashion. Which is great. A good data protection solution is definitely important to an organisation’s ability to rapidly recover when things go pop. But what about those software-based solutions that themselves have become targets of the ransomware gangs? What do you do when someone goes after both your primary and secondary storage solution? It costs a lot of money to deliver immutable solutions that are resilient to the nastiness associated with ransomware. Unfortunately, most organisations continue to treat data protection as an overpriced insurance policy and are reluctant to spend more than the bare minimum to keep these types of solutions going. It’s alarming the number of times I’ve spoken to customers using software-based data protection solutions that are out of support with the vendor just to save a few thousand dollars a year in maintenance costs.
The StorONE Solution
So what do you get with S1:Backup? Quite a bit, as it happens.
[image courtesy of StorONE]
You get Flash-based data ingestion in an immutable format, with snapshots being taken every 30 seconds.
[image courtesy of StorONE]
You also get fast consolidation of multiple incremental backup jobs (think synthetic fulls, etc.), thanks to the high performance of the StorONE platform. Speaking of performance, you also get quick recovery capabilities, and the other benefits of the StorONE platform (namely high availability and high performance).
And if you’re looking for long term retention that’s affordable, you can take advantage of StorONE’s ability to cope well with 90% capacity utilisation, rapid RAID rebuild times, and the ability to start small and grow.
Thoughts and Further Reading
Ransomware is a big problem, particularly when it hits you across both primary and secondary storage platforms. Storage immutability has become a super important piece of the puzzle that vendors are trying to solve. Like many things though, it does require some level of co-operation to make sure non-integrated systems are functioning across the stack in an integrated fashion. There are all kinds of ways to attack this issue, with some hardware vendors insisting that their particular interpretation of immutability is the only way to go, while some software vendors are quite keen on architecting air gaps into solutions to get around the problem. And I’m sure there’s a tape guy sitting up the back muttering about how tape is the ultimate air gap. Whichever way you want to look at it, I don’t think any one vendor has the solution that is 100% guaranteed to keep you safe from the folks in hoodies intent on trashing your data. So I’m pleased that StorONE is looking at this problem and wanting to work with the major vendors to develop a cost-effective solution to the issue. It may not be right for everyone, and that’s fine. But on the face of it, it certainly looks like a compelling solution when compared to rolling your own storage platforms and hoping that you don’t get hit.
Doing data protection well is hard, and made harder by virtue of the fact that many organisations treat it as a necessary evil. Sadly, it seems that CxOs only really start to listen after they’ve been rolled, not beforehand. Sometimes the best you can do is be prepared for when disaster strikes. If something like the StorONE solution is going to be the difference between losing the whole lot, or coming back from an attack quickly, it seems like it’s worth checking out. I can assure you that ignoring the problem will only end in tears. It’s also important to remember that a robust data protection solution is just another piece of the puzzle. You still need to look at your overall security posture, including securing your assets and teaching your staff good habits. Finally, if it seems like I’m taking aim at software-based solutions, I’m not. I’m the first to acknowledge that any system is susceptible if it isn’t architected and deployed in a secure fashion – regardless of whether it’s integrated or not. Anyway, if you’d like another take on the announcement, Mellor covered it here.
If you’re paying attention to any data protection solution vendors at the moment, you’re no doubt hearing about ransomware attacks. These are considered to be Very Bad Things (™).
Ransomware comes in through zero-day exploit or email attachments
Local drive content encrypted
Network shares encrypted – might be fast, might be slow
Encrypted file accessed and ransom message appears
How It Happens
Ransomware attacks are executed via many means, including social engineering, software exploits, and “malvertising” (my second favourite non-word next to performant). The timing of these attacks is important to note as well, as some ransomware will lay dormant and launch during a specific time period (a public holiday, for example). Sometimes ransomware will slowly and periodically encrypt content, but generally speaking it will begin encrypting files as quickly as possible. It might not encrypt everything either, but you can bet that it will be a pain regardless.
Defence In Depth
Ransomware protection isn’t just about data protection though. There are many layers you need to consider (and protect), including:
Human – hard to control, not very good at doing what they’re told.
Physical – securing the locations where data is stored is important.
End Points – BYOD can be a pain to manage effectively, and keeping stuff up to date seems to be challenging for even the most mature organisations.
Networks – there’s a lot of work that needs to go into making sure workloads are both secure and accessible.
Application – sometimes they’re just slapped in there and we’re happy they run.
Data – It’s everything, but super exposed if you don’t get the rest of this right.
The folks at Datadobi tell me DobiProtect is the ideal solution for protecting the data layer as part of your defence in depth strategy as it is:
Designed for the scale and complexity of file and / or object datasets
A solution that complements existing capabilities such as storage system snapshots
Easy to deploy and does not impact existing configurations
A solution that is cost effective and flexible
Where Does It Fit?
DobiProtect plays to the strength of Datadobi – file and object storage. As such, it’s not designed to handle your traditional VM and DB protection; that remains the domain of the usual suspects.
[image courtesy of Datadobi]
The software-only nature of the solution, and the flexibility of going between file and object, means that it’s pretty easy to deploy as well.
[image courtesy of Datadobi]
From an architecture perspective, it’s pretty straightforward as well, with the Core handling the orchestration and monitoring, and software proxies used for data movement.
[image courtesy of Datadobi]
I’ve been involved in the data protection business in some form or another for over two decades now. As you can imagine, I’ve seen a whole bunch of different ways to solve problems. In my day job I generally promote modern approaches to solving the challenge of protecting data in an efficient and cost-effective fashion. It can be hard to do this well, at scale, across the variety of workloads that you find in the modern enterprise. It’s not just some home directories, a file server, and one database that you have to protect. Now there’s SaaS workloads, 5000 different database options, containers, endpoints, and all kinds of other crazy stuff. The thing linking that all together is data, and the requirement to protect that data in order for the business to do its business – whether that’s selling widgets or providing services to the general public.
Protecting file and object workloads can be a pain. But why not just use a vendor that can roughly do the job rather than using a very specific solution like DobiProtect? I asked D’Halluin the same question, and his response was along the following lines. The kind of customers Datadobi is working with on a regular basis have petabytes of unstructured data they need to protect, and they absolutely need to be sure that it’s being protected properly. Not just from a quality of recovered data perspective, but also from a defensible compliance position. It’s not just about pointing out to the auditors that the data protection solution “should” be working. There’s a lot of legislation in place requiring that it be more than that. So it’s oftentimes worth investing in a solution that can reliably deliver against that compliance requirement.
Ransomware attacks can be the stuff of nightmares, particularly if you aren’t prepared. Any solution that is helping you to protect yourself (and, more importantly, recover) from attacks is a Very Good Thing™. Just be sure to check that the solution you’re looking at does what you think it will do. And then check again, because it’s not a matter of if, but when.