Easily Scale Your NetApp Ecosystem into the Cloud

Posted in news
As the volume of data that enterprises store increases exponentially, it’s no surprise that IT pros are eager to add cloud and object storage to their environments to rein in the costs of data growth. Even so, most enterprises aren’t making the jump as quickly as they’d like. We wanted to find out why.

A survey we conducted of nearly 375 IT professionals at NetApp Insight in Las Vegas sheds light on key challenges IT pros cited as hurdles to object storage adoption. As we look forward to NetApp Insight in Berlin, let’s take a closer look at how DataSphere solves these challenges. The survey results are depicted below.

Figure 1. Key challenges to enterprise object storage adoption.

Accelerate Cloud Adoption with Seamless Integration

Nearly 35% of survey respondents cited difficulty integrating object storage into existing infrastructure as a key challenge to adoption. Integration is an issue because each cloud or object store represents a separate storage silo that must be managed separately. DataSphere enables you to manage data across heterogeneous systems, such as NetApp FAS, NetApp StorageGRID Webscale (SGWS), and public clouds, by abstracting data management within a single global namespace. DataSphere software can then transparently move data between primary, object, and public cloud storage, according to policies you define, without disrupting applications.
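To make the idea of policy-driven placement concrete, here is a minimal sketch of how age-based tiering rules might be expressed. The policy names, thresholds, and tier labels below are purely illustrative assumptions, not DataSphere's actual policy syntax or API:

```python
# Hypothetical tiering policy: names, thresholds, and tier labels are
# illustrative only -- they are not DataSphere's real policy format.
from dataclasses import dataclass

@dataclass
class TierPolicy:
    name: str
    max_age_days: int   # data last accessed within this window stays here
    target_tier: str    # e.g. "primary", "object", "public-cloud"

POLICIES = [
    TierPolicy("hot",  max_age_days=90,     target_tier="primary"),
    TierPolicy("warm", max_age_days=365,    target_tier="object"),
    TierPolicy("cold", max_age_days=10**9,  target_tier="public-cloud"),
]

def place(age_days: int) -> str:
    """Return the tier a file belongs on, given days since last access."""
    for policy in POLICIES:
        if age_days <= policy.max_age_days:
            return policy.target_tier
    return POLICIES[-1].target_tier
```

With rules like these, a file untouched for 200 days would land on object storage, while one untouched for years would move to the public cloud, all without the application needing to know which tier holds its data.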

DataSphere makes all cloud and object archives “active” by storing data as files that can automatically be retrieved should applications need them again. The ability to keep data accessible as files also minimizes public cloud bandwidth charges, since DataSphere retrieves just the files applications need. It also means that enterprises don’t have to reconfigure or reformat applications to use object data—this was the second biggest challenge to object storage adoption, according to 30% of survey respondents.

Is Your Data Hot or Not?

We also wanted to gauge how much data enterprises might be able to archive if they adopted object/cloud storage. Accordingly, we asked what percentage of data IT pros thought was cold. 57% of respondents estimated that 70-80% of their data was cold. Asked when data could be considered “cold,” 30% said data not accessed in the last three to six months, while 60% felt data could be considered cold only if it had not been accessed in the last year. The following diagrams show the responses to these questions:

Figure 2. 57% of respondents estimate 70-80% of data in their enterprise is cold.

Figure 3. 60% of respondents said data could be considered “cold” if it had not been accessed in the last year.

DataSphere can help you easily identify and then automatically move data to object/cloud storage. Using file metadata, DataSphere can identify cold data, as defined by your criteria for aging, and transparently move that data without impacting applications. DataSphere even includes a Data Profiler tool that enables you to add tiers, such as cloud or object, to your environment and instantly calculate savings, so you can quickly determine whether an object or cloud investment is cost justified.
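Identifying cold data from file metadata can be sketched with standard filesystem calls. The snippet below is a generic illustration, not DataSphere's implementation: it walks a directory tree and flags files whose last-access time exceeds a one-year threshold, matching the definition of “cold” that most survey respondents chose:

```python
# Generic sketch of cold-data identification via file metadata.
# This is NOT DataSphere's implementation; threshold and logic are
# illustrative assumptions based on the survey's one-year definition.
import os
import time

COLD_AGE_SECONDS = 365 * 24 * 3600  # "cold" = not accessed in the last year

def find_cold_files(root, now=None, cold_age=COLD_AGE_SECONDS):
    """Walk a directory tree and return paths whose last-access
    time is older than the cold-age threshold."""
    now = now if now is not None else time.time()
    cold = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if now - os.stat(path).st_atime > cold_age:
                    cold.append(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
    return cold
```

A real data management layer would of course track access metadata continuously rather than rescanning, but the principle is the same: age criteria you define determine which files are candidates for the object or cloud tier.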

Want to learn more about how DataSphere can eliminate your biggest object and cloud adoption challenges? If you are attending NetApp Insight in Berlin, stop by stand B4 or connect with us at deepdive@primarydata.com to schedule a meeting or demo.
