DataSphere separates how an application logically views its data from where that data is physically stored. It is a completely new element in modern IT architecture, built to maximize the value of existing infrastructure and help enterprises integrate easily with the cloud.
Figure 1 The DataSphere metadata engine integrates existing infrastructure and the cloud into a global namespace, and automatically moves the right data to the right place at the right time to maintain objectives.
The storage- and vendor-agnostic DataSphere architecture unites different types of storage into a global namespace and automatically places data on the most appropriate storage resource to meet business and IT objectives across performance, protection and price. This helps enterprises overcome performance bottlenecks, integrate with the cloud for savings and active archiving, and easily adopt new resources from any vendor to achieve unprecedented improvements in performance, efficiency and scalability.
Legacy Architecture Problems Wake IT Up at Night
Typically, and historically, an IT administrator must specify where an application's data files are located when the application is configured. Once created, that data is perpetually stuck in its original storage silo, regardless of any changes to application demands, business performance needs, cost or reliability requirements. As a result, IT routinely over-provisions the assigned computational, networking and storage resources to ensure that peak demands are met over the lifetime of the application. The resulting waste consumes valuable resources and budget, yet even with significant overspending, storage still often fails to meet unforeseen business demands.
IT admins have their hands full with the constant deployment and management of both new and existing applications. When an application exceeds the architectural limits of its assigned resources (whether computational, networking or storage), IT gets that dreaded call. By the time IT is called in to fix the problem, business has already been lost. But it doesn’t have to be this way.
Your Most Valuable Asset: Metadata
As a metadata engine, DataSphere is designed to decouple the architecturally rigid relationship between applications and where their data is stored. Offloading metadata access to DataSphere delivers predictable, low-latency metadata operations by guaranteeing that metadata requests do not get “stuck” in the queue behind other data requests. Rather than waiting for sequential operations to complete, DataSphere can leverage the parallel-access optimizations of the standard NFS v4.2 protocol. NFS v4.2 significantly speeds up metadata and small-file operations, requiring fewer than half the protocol-specific network round trips of NFS v3.
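As an illustration of the protocol version in play, a Linux client can request NFS v4.2 explicitly at mount time. The server name and export path below are placeholders for illustration, not DataSphere defaults:

```shell
# Mount a shared namespace over NFS v4.2 so the client can use the
# protocol's newer, lower-round-trip metadata operations.
# "datasphere.example.com:/global" and "/mnt/global" are placeholders.
sudo mount -t nfs -o vers=4.2 datasphere.example.com:/global /mnt/global

# Confirm the negotiated protocol version for the mount point.
mount | grep /mnt/global
```

If the server cannot negotiate v4.2, the mount either fails or falls back depending on client configuration, so pinning `vers=4.2` is a simple way to verify the faster metadata path is actually in use.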
DataSphere collects metadata about each client’s data access and how that client experiences storage (IOPS, latency, bandwidth and availability). Intelligent analytics are then applied against business requirements to achieve the desired levels of performance, cost and reliability. DataSphere makes automated, real-time data-placement decisions, moves data without disruption to overcome or prevent outages, and maintains compliance with service-level agreements and objectives.
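The decision loop described above can be sketched in a few lines. Everything here is hypothetical (the class names, fields and thresholds are illustrative, not part of the DataSphere product): observed per-file telemetry is compared against a latency objective, and active files that miss it are flagged for a non-disruptive move to a faster tier.

```python
# Hypothetical sketch of SLO-driven placement; names and thresholds are
# illustrative, not the DataSphere API.
from dataclasses import dataclass


@dataclass
class FileTelemetry:
    path: str
    avg_latency_ms: float  # observed latency for this file
    iops: float            # how actively the file is accessed
    tier: str              # where the file currently lives


@dataclass
class Objective:
    max_latency_ms: float  # the service-level objective
    min_active_iops: float # ignore cold files that nobody is waiting on
    fast_tier: str         # destination for out-of-compliance data


def place(files, objective):
    """Return (path, target_tier) moves needed to restore compliance."""
    moves = []
    for f in files:
        misses_slo = f.avg_latency_ms > objective.max_latency_ms
        is_active = f.iops >= objective.min_active_iops
        if misses_slo and is_active and f.tier != objective.fast_tier:
            moves.append((f.path, objective.fast_tier))
    return moves


files = [
    FileTelemetry("/db/hot.ibd", avg_latency_ms=12.0, iops=900.0, tier="nearline"),
    FileTelemetry("/logs/archive.log", avg_latency_ms=40.0, iops=2.0, tier="cloud"),
]
slo = Objective(max_latency_ms=5.0, min_active_iops=50.0, fast_tier="flash")
print(place(files, slo))  # → [('/db/hot.ibd', 'flash')]
```

The activity threshold is the interesting design choice: a cold archive file with high latency is still in compliance in spirit, so only data that is both slow and busy earns a promotion.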
Automatically Accelerate Performance and Savings
DataSphere gives clients parallel access to billions of files across multiple storage devices. Performance is accelerated by balancing I/O load file by file across those devices and by offloading metadata tasks from the storage devices, freeing them to serve more data.
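To make "balancing I/O load file by file" concrete, here is a minimal, hypothetical sketch (not the DataSphere implementation): each file is assigned to whichever device currently carries the least accumulated load, heaviest files first, so no single device becomes a hot spot.

```python
# Minimal greedy sketch of per-file I/O balancing across devices.
# Function and device names are illustrative assumptions.
import heapq


def balance(files, devices):
    """Assign each (name, load) pair to the currently least-loaded device."""
    heap = [(0.0, d) for d in devices]  # (accumulated load, device name)
    heapq.heapify(heap)
    placement = {}
    # Placing heavier files first gives the greedy heuristic better balance.
    for name, load in sorted(files, key=lambda f: -f[1]):
        total, dev = heapq.heappop(heap)
        placement[name] = dev
        heapq.heappush(heap, (total + load, dev))
    return placement


print(balance([("a.dat", 5), ("b.dat", 3), ("c.dat", 4)], ["nas1", "nas2"]))
# → {'a.dat': 'nas1', 'c.dat': 'nas2', 'b.dat': 'nas2'}
```

A real system would rebalance continuously as telemetry changes rather than placing files once, but the core idea (spread load at file granularity, not volume granularity) is the same.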
DataSphere collects telemetry in the form of metadata to learn the IOPS, bandwidth and latency each client experiences for each file it accesses. This provides a rich understanding of how storage devices are performing, which files are active and whether application data is out of compliance. If it is, the data is automatically moved to the right storage tier without disruption to running applications.
Pay-As-You-Grow Subscription Model
Not only does DataSphere enable IT to optimize storage efficiency from flash to the cloud, but it also helps you scale easily with subscription pricing that matches what your enterprise needs to manage now. There’s no limit to the number of files or capacity managed per DataSphere instance, which means there is no need to overprovision years in advance; DataSphere can grow right along with you at the pace that you define.