Primary Data Announces Survey Results (And A Very Useful Tool)

Survey Says …

Primary Data conducted a “Storage Census” at VMworld this year, and I had the opportunity to talk about the results with Lance Smith (CEO, Primary Data). You can read the press release here.


The Enterprise Responds

The majority of the 313 respondents worked at large enterprises with 1,000 to 5,000 or more employees, and their companies were mostly more than ten years old.

[image courtesy of Primary Data]


Performance and Cold Data

The survey also found that:

  • 38 percent of respondents selected performance as their biggest challenge for IT, making it the most pressing IT issue named in the survey;
  • Data migrations were the second most common headache for 36 percent of those surveyed;
  • Budget challenges (35 percent) and cloud adoption (27 percent) closely followed among the top concerns for respondents;
  • The majority of organizations estimate that at least 60 percent of their data is cold; and
  • 27 percent of organizations manage twenty or more storage systems, and 44 percent manage ten or more.

Smith said that “[p]erformance remains a noisy problem, and the lack of insight into data forces IT to overprovision and overspend even though the professionals surveyed know that the majority of their data is actually sitting idle”. This leads me to the next part of Primary Data’s announcement: Data Profiler.


Data Profiler Could Be A Very Useful Tool

The team then ran me through a demo of a new tool available via DataSphere called Data Profiler. The Data Profiler provides a number of data points that change according to the chosen objectives, including:

  • Cost per storage tier;
  • Number of files per tier;
  • Capacity per tier; and
  • Total cost of the analyzed global storage ecosystem.

Data tiers can be easily added to the Data Profiler to evaluate policies using the resources available in each customer’s unique environment.
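
To make those data points a little more concrete, here’s a minimal sketch of that kind of per-tier roll-up. The tier names, cost figures, and the `profile` helper are hypothetical illustrations of my own, not Primary Data’s API; in practice DataSphere gathers this metadata for you.

```python
# Hypothetical illustration only -- not Primary Data's API.
# Roll a set of files up into the per-tier data points listed above:
# file count, capacity, and cost per tier, plus total environment cost.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Tier:
    name: str
    cost_per_gb: float  # assumed $/GB/month for the tier


@dataclass
class FileRecord:
    path: str
    size_gb: float
    tier: str  # name of the tier the file currently lives on


def profile(tiers, files):
    capacity = defaultdict(float)
    count = defaultdict(int)
    for f in files:
        capacity[f.tier] += f.size_gb
        count[f.tier] += 1
    total = 0.0
    for t in tiers:
        tier_cost = capacity[t.name] * t.cost_per_gb
        total += tier_cost
        print(f"{t.name:>10}: {count[t.name]:>5} files, "
              f"{capacity[t.name]:>10,.0f} GB, ${tier_cost:>10,.2f}/month")
    print(f"Total cost of the analysed environment: ${total:,.2f}/month")


profile(
    tiers=[Tier("all-flash", 0.30), Tier("hybrid", 0.10), Tier("object", 0.02)],
    files=[
        FileRecord("/vol01/db01.vmdk", 500, "all-flash"),
        FileRecord("/vol02/archive01.tar", 2000, "object"),
    ],
)
```

Swap in your own tiers and numbers; the point is simply that the roll-up is straightforward once something has actually collected the metadata across your arrays for you.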

[image courtesy of Primary Data]

The cool thing about this tool is that you can do all your own modelling by plugging in various tiers of storage and their cost per MB/GB/TB, and have it give you a clear view of potential cost savings, assuming that your data is, indeed, cold. You can then build these policies (Objectives) in DataSphere and have it move everything around to align with the policy you want to use. You want your hot data to live in the performance-oriented tiers, and you want the cold stuff to live in the cheap and deep parts. This is a simple thing to achieve when you have one array, but when you have a range of arrays from different vendors it becomes a little more challenging.
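
As a back-of-the-envelope example of that sort of what-if modelling (the cost figures, capacity, and cold-data fraction below are my own assumptions for illustration, not anything DataSphere produces), here’s roughly what an Objective that moves cold data to the cheap and deep tier might save:

```python
# Rough what-if sketch: all figures below are assumptions for illustration.
PERF_COST_PER_GB = 0.30     # assumed $/GB/month on the performance tier
ARCHIVE_COST_PER_GB = 0.02  # assumed $/GB/month on the cheap and deep tier

total_capacity_gb = 500_000  # assumed data currently on performance storage
cold_fraction = 0.60         # the survey's "at least 60 percent is cold"

cold_gb = total_capacity_gb * cold_fraction
cost_today = total_capacity_gb * PERF_COST_PER_GB
cost_after = ((total_capacity_gb - cold_gb) * PERF_COST_PER_GB
              + cold_gb * ARCHIVE_COST_PER_GB)

print(f"Cost today:           ${cost_today:>12,.2f}/month")
print(f"Cost after objective: ${cost_after:>12,.2f}/month")
print(f"Potential saving:     ${cost_today - cost_after:>12,.2f}/month")
```

With those (made up) numbers the saving works out to around $84,000 a month, which is the sort of figure that gets a conversation started with whoever owns the storage budget.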


Conclusion

I wasn’t entirely surprised by the results of the survey, as a number of my enterprise customers have reported the same concerns. Enterprise storage can be hard to get right, and it’s clear that a lot of people don’t really understand the composition of their storage environment from an active data perspective.

I’ve been a fan of Pd’s approach to data virtualisation for some time, having observed the development of the company over the course of a number of Storage Field Day events. Any tool that you can use to get a better idea of how to best deploy your storage, particularly across heterogeneous environments, is a good thing in my book. I was advised by the team that this is built into DataSphere at the moment, although it’s easy enough to run a small script to gather the data to send back to their SEs for analysis. I believe there may also be an option to install the tool locally, but I’d recommend engaging with Primary Data’s team to find out more about that.

As with most things in IT, the solution isn’t for everyone. If you’re running a single, small array, you may have a good idea of what it costs to use various tiers of storage, how old your data is, and how frequently it’s being accessed. But there are plenty of enterprises out there with a painful variety of storage solutions deployed for any number of (usually) good reasons. These places could potentially benefit greatly from some improved insights into their environments.

I’ve said it before and I’ll keep saying it – Enterprise IT can be hard to do right. Any tools you can use to make things easier and potentially more cost effective should be on your list of things to investigate.