Is Your Data Hot or Not?
Posted in news
In a recent Forbes article titled "Is Your Data Hot or Not?", Primary Data CEO Lance Smith discussed how poor visibility into data activity affects modern business. He notes that while new uses of data are creating the need for more storage, storage inefficiency is a big reason for the $175 billion Gartner expects companies to spend on datacenter systems in 2017.
Much of this inefficiency exists because it has historically been difficult to move data, so IT has to overspend on storage that can deliver performance for data's peak expected demand. For the rest of its lifetime, most data unnecessarily eats up expensive storage capacity.
Lance explains that “not only is data hard to move because it interrupts applications, but it’s also hard for enterprises to see what’s going on -- especially the activity level of the data files stored on various storage systems. When there are performance or cost problems, it’s difficult to determine what data is hot or not so that steps can be taken to fix them. As many enterprises are now seeking to migrate to the cloud in order to save costs and increase agility, it’s also becoming critical for enterprises to identify what data can be safely archived.”
The article outlines how this problem can be solved through the use of a metadata engine, such as DataSphere, that gathers data about all data, along with knowledge of the capabilities of every storage resource in its global namespace. Armed with this visibility, a metadata engine can make all storage devices simultaneously accessible to applications.
This enables the metadata engine to automatically move data to the right place at the right time to meet data requirements as business needs evolve. When data becomes cold, it can be moved to the cloud. If it heats back up, it can be moved back to on-premises storage, without any intervention from IT and without impacting applications.
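The hot/cold tiering behavior described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the policy, not DataSphere's actual implementation or API; the names (`TieringPolicy`, `COLD_AFTER_SECONDS`) are assumptions for the sake of the example.

```python
import time

# Illustrative sketch only: data idle longer than a threshold is
# demoted to a "cloud" tier; touching it again promotes it back
# to "on-prem" storage, with no administrator involvement.

COLD_AFTER_SECONDS = 30 * 24 * 3600  # treat ~30 idle days as "cold" (assumed threshold)

class TieringPolicy:
    def __init__(self, now=time.time):
        self.now = now            # injectable clock, eases testing
        self.last_access = {}     # file path -> last access timestamp
        self.tier = {}            # file path -> "on-prem" or "cloud"

    def record_access(self, path):
        """Called whenever an application touches a file."""
        self.last_access[path] = self.now()
        if self.tier.get(path) == "cloud":
            # Data heated back up: recall it to on-premises storage.
            self.tier[path] = "on-prem"

    def sweep(self):
        """Run periodically: demote files that have gone cold."""
        cutoff = self.now() - COLD_AFTER_SECONDS
        for path, ts in self.last_access.items():
            if ts < cutoff:
                self.tier[path] = "cloud"
        return self.tier
```

A real metadata engine would trigger actual data movement here; the sketch only tracks the placement decision, which is the part the article's "hot or not" visibility enables.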