Silver Bulletin

Building Data Confidence for Cloud-Native Applications

Posted by: Steve Miller

Old Storage meets Cloud

The growth of cloud-native applications is spawning a new dawn for the data storage sector and re-igniting interest in software-defined storage.

Storage purchasing practice has not changed much in twenty years, and the big players then are still the big players today. There are a few exceptions where specialists have ridden trends to build decent positions in the market (think Veeam for virtual data protection or Pure Storage for high-speed flash storage), but the top three remain the same: Dell EMC, NetApp and HPE.

Disruption is coming, but not from any one commercial entity. Only a fundamental change to the computing model will upset the form book. The storage and backup solutions originally designed for first monolithic and then virtual applications won't fit the requirements of cloud-native applications. I'm not talking about "lift-and-shift" cloud migrations that use VMware to move existing workloads; I'm talking about the data management needs of applications newly written for containers, which run multiple isolated workloads on a single OS kernel.

Wall Street darlings still?

It's difficult to track the growth of open source software because its revenues are not easy to follow on Wall Street. Compare that to corporate tech stocks like NetApp, which supplies data storage to enterprise datacenters. On Friday 2nd August, NetApp announced revenues 17% down on the previous year, and its share price fell by 22%. Similarly, Pure Storage's stock is at 50% of its high for the year. The disruption that first hit tech giants like HP and IBM is now hurting the specialist players.

This disruption will accelerate because the open source movement is growing much more powerful. IBM's purchase of Red Hat for $34 billion is testament to that, and a good signal of further penetration of container technology into the enterprise.

Enter Containers

Despite the hype around container orchestration software like Kubernetes, corporate enterprises have yet to fully embrace the challenge of re-engineering for containers. Containers, though more efficient and more agile than virtual machines, have been considered too fragile by serious IT operations staff. Developers love the speed and simplicity of containers: the process is so easy that they can focus on writing great code and nothing else. But the ephemeral nature of containers meant that data could easily be lost, so no IT admin could sanction their use in steady-state production workloads. This situation is changing fast.

Last year, a Kubernetes release went GA with persistent volumes, which give data a place to live even when containers spin down. These volumes can reside on the usual protocols: block, file and object. Another major development, at the start of 2019, was the Kubernetes 1.13 release, which took the Container Storage Interface (CSI) to GA. This was the starting pistol for vendors to pile into developing CSI drivers, now that they have a stable development and support interface.
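To make the persistent-volume idea concrete, here is a minimal sketch of how an application claims durable storage in Kubernetes: a PersistentVolumeClaim manifest, built as a plain Python dict. The claim name, storage class and size are illustrative assumptions, not taken from the article.

```python
import json

def make_pvc(name, storage_class, size):
    """Return a minimal PersistentVolumeClaim manifest as a dict.

    The claim outlives any single container: when a pod spins down,
    the data on the bound volume remains.
    """
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],   # single-node read/write volume
            "storageClassName": storage_class,  # selects the (CSI) provisioner
            "resources": {"requests": {"storage": size}},
        },
    }

# Hypothetical claim: 10 GiB from a class called "fast-ssd".
pvc = make_pvc("app-data", "fast-ssd", "10Gi")
print(json.dumps(pvc, indent=2))
```

In practice this manifest would be applied with `kubectl apply -f`, and the CSI driver behind the named storage class would provision the underlying block, file or object storage.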

For even greater confidence, production workloads require robust data backup and recovery systems. Most major backup applications from vendors like Veritas, Dell, IBM and Commvault don't natively support containers. Though still in beta, Kubernetes has released APIs for its volume snapshot feature. Looking back, when VMware provided its vStorage APIs for Data Protection (VADP), it unleashed a wave of innovation and enabled powerhouse commercial entities such as Veeam. The imperative of digital transformation is driving cloud investment and open source development: according to Grand View Research, container adoption is expected to grow at a 26% CAGR between 2019 and 2025. Several vendors, including IBM, are working on supporting containers in their backup/recovery software later in 2019.
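The volume snapshot API mentioned above works the same declarative way: a VolumeSnapshot object points at an existing claim, and the CSI driver takes a point-in-time copy. The sketch below assumes a hypothetical snapshot class name, and the `snapshot.storage.k8s.io/v1beta1` API version shown here changed across Kubernetes releases while the feature was pre-GA.

```python
import json

def make_volume_snapshot(name, pvc_name, snapshot_class):
    """Return a minimal VolumeSnapshot manifest as a dict.

    The snapshot references a PersistentVolumeClaim as its source;
    a backup tool can then copy or restore from the snapshot rather
    than from the live (and possibly ephemeral) container.
    """
    return {
        # API group/version varied while the feature was alpha/beta.
        "apiVersion": "snapshot.storage.k8s.io/v1beta1",
        "kind": "VolumeSnapshot",
        "metadata": {"name": name},
        "spec": {
            "volumeSnapshotClassName": snapshot_class,  # maps to a CSI driver
            "source": {"persistentVolumeClaimName": pvc_name},
        },
    }

# Hypothetical snapshot of the claim "app-data".
snap = make_volume_snapshot("app-data-snap", "app-data", "csi-snapclass")
print(json.dumps(snap, indent=2))
```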

Future Storage

As applications built on container technology move from testing into production and full-scale operations, they will need a stable bedrock of enterprise-class data storage and backup infrastructure. Developers won't accept the constraints of traditional storage administration; they will expect rapid provisioning, self-service and ease of use. IT admins will still be needed to manage the overall storage stack, but they will have to get out of the developers' way for everyday use. Delivering storage in a uniform and dynamic manner across on-premises and public clouds, with less intervention by storage administrators, requires a software-defined approach.

To finish, it's not all about where your data resides. In all likelihood, data will become more portable as applications are written for containers. What's important is that if you need to retain your data, you need the security of persistent volumes and a way of protecting that data that isn't hampered by the ephemeral nature of the platform.


Posted by: Steve Miller on August 4, 2019
