At Silverstring we have some avid blog writers, from storage storyteller Anton James to our very own CEO, Alistair Mackenzie. So we thought, what better time to recap our most-loved blogs than Valentine’s Day? We didn’t like the idea of our blogs sitting there lonely, getting little attention on such a romantic “holiday”; after all, everyone deserves a little special attention now and again.
In our last few blog posts we’ve covered a number of key subjects that outline the interesting state of the current market. It’s often said that the only constant in the IT marketplace is change, and in our specialist sector that has never been more true.
On August 26th IBM announced v7.1.3 of Spectrum Protect (formerly TSM). The announcement included user-interface improvements aimed at VMware administrators and a new device class for leveraging public cloud storage.
We all know tape is the cheapest way to store large amounts of backup data. Tape requires no power, cooling or hardware maintenance, and the amount of tape you need is greatly reduced by TSM’s incremental-forever backup model (a model recently extended to VMware backup, too).
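To give a feel for why incremental forever reduces the volume of stored data, here is a toy comparison against a traditional weekly-full-plus-daily-incremental scheme. All figures (a 1 TB client, a 2% daily change rate, a 28-day window) are hypothetical illustrations, not TSM measurements.

```python
# Toy comparison of cumulative backup volume (GB) over 28 days for a
# hypothetical 1,000 GB client where roughly 20 GB changes per day.
# Illustrative numbers only, not measurements from TSM.

FULL_GB = 1000   # size of one full backup of the client
DAILY_GB = 20    # data changed (and backed up incrementally) per day
DAYS = 28

def weekly_full_daily_incremental(days):
    """Traditional scheme: a full backup every 7 days, incrementals between."""
    total = 0
    for day in range(days):
        if day % 7 == 0:
            total += FULL_GB      # weekly full
        else:
            total += DAILY_GB     # daily incremental
    return total

def incremental_forever(days):
    """TSM-style model: one initial full, then only changed data ever after."""
    return FULL_GB + (days - 1) * DAILY_GB

print(weekly_full_daily_incremental(DAYS))  # 4 fulls + 24 incrementals
print(incremental_forever(DAYS))            # 1 full + 27 incrementals
```

Under these assumed numbers the traditional scheme stores 4,480 GB over the month against 1,540 GB for incremental forever, which is why far less tape is needed.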
Storing the data is without doubt quick and cheap, but the time required to recover the numerous small backup objects from tape can often mean you miss your business-defined RTO (Recovery Time Objective) when you need to recover an entire client.
On 20th August 2013, IBM announced Tivoli Storage Manager Suite for Unified Recovery v6.4.1.
Relevant to anyone interested in data deduplication technology: for the first time, customers can use IBM ProtecTIER deduplication to lower the terabytes (TB) managed, which has significant implications when comparing cost models for TSM.
Let’s start at the beginning
In 2011, IBM introduced a new way of ordering licences for TSM based on capacity rather than processors or processor cores. Called TSM Suite for Unified Recovery (SUR), the licence consisted of a bundle of 10 different modules including database and mail agents, virtual backup and SAN-based backup.
Deduplication is a data reduction technique in which exact matches of data are stored once and linked to, rather than the same data being processed and stored more than once. It has been growing in prominence in recent years, due largely to the substantial increase in data we’re seeing across all industries.
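The linking of exact matches can be sketched in a few lines. This is a minimal illustration assuming fixed-size chunks and SHA-256 fingerprints; real products (ProtecTIER, TSM’s own dedup) differ in chunking strategy, hash choice and where the process runs.

```python
# Minimal sketch of block-level deduplication: identical chunks are stored
# once, and the data is represented as an ordered list of chunk hashes.
# Assumes fixed-size chunks and SHA-256 fingerprints for illustration.
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into chunks; store each unique chunk once and keep the
    ordered list of hashes (the 'recipe') needed to rebuild the data."""
    store = {}    # hash -> chunk bytes, stored once per unique chunk
    recipe = []   # ordered hashes referencing chunks in the store
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)   # exact matches are linked, not re-stored
        recipe.append(h)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original data by following the recipe."""
    return b"".join(store[h] for h in recipe)

# Highly repetitive data deduplicates well: 101 chunks referenced,
# but only 2 unique chunks actually stored.
data = b"A" * 4096 * 100 + b"B" * 4096
store, recipe = deduplicate(data)
print(len(recipe), "chunks referenced,", len(store), "stored uniquely")
assert reconstruct(store, recipe) == data
```

The ratio between chunks referenced and chunks stored is, in essence, the deduplication ratio that vendors advertise, and as the next paragraph notes, how that ratio is marketed is where the trouble starts.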
Whilst deduplication itself is fantastic and can save organisations a considerable amount on data storage and transfer requirements, one of its main problems lies not with the process itself, but with the way in which it’s marketed.
Just as retailers try to entice customers with advertising, deduplication is very often marketed in a way that suggests it can completely remove your data storage problems.