Why DataSphere

What’s your problem? Fix it with automated data management.

DataSphere optimizes the economics and performance of your enterprise infrastructure. Virtualizing all your data into a global namespace, and making it mobile across tiers of storage and cloud, significantly improves the utilization of your infrastructure by enabling the placement of the right data, in the right place, at the right time.

Applications Benefit from Storage Awareness

Bringing the intelligence of machine learning to automated data management, DataSphere is a completely new element in modern IT architecture. Built to maximize the value of existing infrastructure, DataSphere automates migrations from old to new storage, automates data tiering, manages data across storage from different vendors, scales out NAS arrays, delivers storage QoS, integrates into virtualized environments, and helps enterprises easily adopt cloud storage.

When applications are unaware of the storage attributes available to them, as they have been for decades, they suffer from a multitude of issues:

  • Bottlenecks hinder performance
  • Cold data compromises hot capacity
  • Overprovisioning creates significant overspending
  • Migration headaches keep data stuck until retirement
  • Vendor lock-in limits agility and increases costs

DataSphere learns how applications experience storage, and moves data to the most appropriate resource without application interruption.

Economics

IT shouldn’t have to make sacrifices. Get performance, scale, AND savings.

DataSphere takes a whole new approach to data management, assimilating data into a global namespace that stretches across the enterprise and into the cloud. Its metadata engine applies machine learning to simplify and automate the care and feeding of your data across your storage infrastructure, delivering performance, savings, and scale without requiring you to rip and replace a single storage system.

Reduce CapEx

Defer new storage purchases by doing more with your existing investments, and lower costs of application deployment by reducing lifetime resource needs up front. A typical customer with 1PB of data can reduce storage spend over 5 years by up to 72% by automatically archiving old data to the cloud, and deferring the costs of major upgrades.

Reduce OpEx

Automate and optimize daily management. Let IT focus on work that adds value, reduce human error and labor costs, and simplify troubleshooting through machine learning. DataSphere continually monitors your environment, detecting and resolving data performance problems, often before admins can report them.

ROI Calculator

This is a typical scenario for an enterprise customer with one petabyte of data to manage, but every company is different. Input the parameters specific to your installation and see how much you can save by deploying DataSphere in your environment.

Use the Primary Data TCO Calculator to see how much DataSphere might save you.
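
The arithmetic behind such a calculator is easy to sketch. The example below is a minimal, hypothetical model written for illustration only, not the Primary Data calculator itself; every price, the 70% cold-data share, and the five-year horizon are placeholder assumptions you would replace with your own figures.

  # Minimal, hypothetical TCO model: compare keeping all data on primary
  # storage with tiering cold data to low-cost object/cloud storage.
  # All prices and percentages are placeholder assumptions.

  def five_year_spend(total_tb, cold_fraction, primary_cost_tb_yr, cloud_cost_tb_yr, years=5):
      """Return (all-primary spend, tiered spend) over the planning horizon."""
      all_primary = total_tb * primary_cost_tb_yr * years
      hot_tb = total_tb * (1 - cold_fraction)
      cold_tb = total_tb * cold_fraction
      tiered = (hot_tb * primary_cost_tb_yr + cold_tb * cloud_cost_tb_yr) * years
      return all_primary, tiered

  # Example: 1 PB (1024 TB), 70% of it cold, with illustrative $/TB/year prices.
  baseline, tiered = five_year_spend(1024, 0.70, primary_cost_tb_yr=600, cloud_cost_tb_yr=100)
  savings_pct = 100 * (baseline - tiered) / baseline
  print(f"Baseline: ${baseline:,.0f}  Tiered: ${tiered:,.0f}  Savings: {savings_pct:.0f}%")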

On Ramp to the Cloud

DataSphere seamlessly integrates your datacenter infrastructure with cloud/object storage (S3).

With support for the ubiquitous Amazon S3 API, adding cloud endpoints to your storage pool is simple and non-disruptive. You can add cloud storage buckets from any number of vendors, including on-premises object storage, with full control over which data goes to a cloud provider, whether for active archiving or selective backups.

WAN optimization is built into DataSphere: all data sent to the cloud is automatically deduplicated and compressed, and transfers use a secure link to ensure that data remains safe.
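
Because the cloud side speaks the standard Amazon S3 API, any S3 client can illustrate what adding a bucket endpoint involves. The sketch below is not DataSphere code; it simply shows the generic pattern of compressing data and sending it to an S3-compatible bucket over HTTPS. The endpoint URL, bucket name, object key, and file path are placeholder assumptions, and credentials are assumed to come from the environment.

  # Illustrative only: push a compressed object to any S3-compatible bucket
  # (AWS, or an on-premises object store exposing the S3 API) over HTTPS.
  # Endpoint, bucket, key, and path below are placeholders; credentials are
  # taken from the environment.
  import gzip
  import boto3

  s3 = boto3.client(
      "s3",
      endpoint_url="https://objects.example.internal",  # assumed on-prem S3-compatible endpoint
      use_ssl=True,                                      # keep the transfer on a secure link
  )

  with open("/archive/projectA/results.dat", "rb") as f:   # hypothetical cold file
      payload = gzip.compress(f.read())                    # simple client-side compression

  s3.put_object(Bucket="cold-archive", Key="projectA/results.dat.gz", Body=payload)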

Data mobility between the cloud and on-premises storage is automatic and transparent to the user. Files that require performance when opened are automatically transferred back to ensure application needs are met.

Finally, your data is not stuck in one cloud. Moving data between clouds takes a single click to initiate the transfer, with data reduction applied automatically along the way.


Zero Integration Time

Use any storage. Keep apps running. No software changes required. DataSphere automatically learns your environment, assimilates the metadata, and you’re ready to go.

DataSphere integrates easily into existing customer environments, enabling customers to be up and running within minutes of completing the DataSphere installation, with no software changes required on the client. Integrating storage into the DataSphere global namespace requires only the assimilation of the data’s metadata, rather than copying primary data to a new system. The DataSphere in-place data assimilation process is non-disruptive for clients, as the metadata is collected on demand as well as in a background process. This eliminates the risk to business continuity common to migrations, while cutting “import” times to just minutes.
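
To make “assimilating metadata” concrete, the sketch below walks an existing share and records per-file metadata (path, size, timestamps) without reading or copying any file contents. It is a simplified illustration of the idea rather than DataSphere’s assimilation engine, and the mount point is a placeholder.

  # Illustration of metadata-only assimilation: catalogue files on an existing
  # share by stat()-ing them, never reading or copying the data itself.
  # The mount point below is a placeholder.
  import os

  def assimilate(root):
      """Yield one metadata record per file under root."""
      for dirpath, _dirnames, filenames in os.walk(root):
          for name in filenames:
              path = os.path.join(dirpath, name)
              st = os.stat(path)
              yield {
                  "path": path,
                  "size_bytes": st.st_size,
                  "modified": st.st_mtime,
                  "accessed": st.st_atime,
              }

  catalog = list(assimilate("/mnt/existing_nas_share"))   # hypothetical mount
  print(f"Assimilated metadata for {len(catalog)} files")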

Live Data Mobility

No Downtime. DataSphere moves live data without any application disruption.

Live data mobility makes data migration easy and automatic, whether you are decommissioning an out-of-service array, or adopting a new storage tier from a new vendor.

DataSphere simplifies hot-spot remediation when your data finds itself on an overworked storage array at an inconvenient time. It can automatically detect and resolve hot spots, moving data between heterogeneous storage platforms to alleviate oversubscription, including moving cold data to the cloud or any S3 endpoint using standard, open-source-based infrastructure.
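
As a rough illustration of what automatic hot-spot detection can look like, the sketch below flags volumes whose observed latency misses their objective and nominates their coldest files for relocation. The records, thresholds, and field names are invented for the example and do not describe DataSphere internals.

  # Toy hot-spot check: flag volumes whose measured latency misses the
  # objective, then pick their least-recently-accessed files to move off.
  # All numbers and records are illustrative placeholders.

  volumes = [
      {"name": "nas-01", "latency_ms": 18.0, "objective_ms": 5.0},
      {"name": "flash-02", "latency_ms": 0.9, "objective_ms": 2.0},
  ]

  files_on = {
      "nas-01": [
          {"path": "/proj/old/run1.dat", "last_access_days": 400},
          {"path": "/proj/current/db.bin", "last_access_days": 1},
      ],
  }

  for vol in volumes:
      if vol["latency_ms"] > vol["objective_ms"]:           # oversubscribed volume
          cold = [f for f in files_on.get(vol["name"], [])
                  if f["last_access_days"] > 90]            # cold = untouched for 90+ days
          for f in cold:
              print(f"Plan: move {f['path']} off {vol['name']} to an S3 tier")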


Get More Out of Your Existing Storage

Use what you’ve got.

As a metadata engine, DataSphere can integrate existing storage without moving any data. This makes it easy to start managing large amounts of data with DataSphere by integrating existing storage into the global namespace with minimal disruption.

DataSphere can turn existing resources into heterogeneous scale-out NAS and rebalance hot and cold data across storage tiers to reduce over-provisioning. DataSphere is storage-agnostic, allowing it to leverage the underlying value of all storage in the global namespace. The capabilities of storage manifest as attributes in the storage catalog.


Data Reporting at Scale

Manage Over a Billion Files from a Single UI

  • See your data in motion
  • Instantly identify performance hot spots
  • Global storage reporting helps you manage data at scale


Build the Optimal Storage Infrastructure, Customized for your Data

DataSphere Data Profiler Analyzes your Data and your Cost to Create Perfect Tiers

The Data Profiler can operate as a standalone application or within Primary Data’s DataSphere platform. It analyzes how data is used across the storage infrastructure to evaluate data management strategies and project the savings possible by improving utilization across resources, including NAS, cloud, and on-premises object storage. When deployed with DataSphere, enterprises can quickly put their customized data management strategies into action with automated data migration, tiering storage across their enterprise infrastructure, and adding low-cost cloud storage for cold data management.

The Data Profiler uses statistics from one or more target shares and gives IT the ability to model the existing environment, then build a new tiered architecture by optimizing the use of existing storage and replacing or adding new storage to meet budgetary and planning goals; a simplified version of this kind of tier roll-up is sketched after the list below. In addition, the Data Profiler highlights:

  • Total cost of data ownership within the target storage environment
  • Cost savings vs. previous or reference configurations
  • Benefits of a tiered storage configuration
  • Cost, capacity, and number of files per storage tier
  • The advantage of low-cost object storage for cold, infrequently used data
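
The sketch below is a much-simplified, hypothetical version of that roll-up: given per-file sizes and ages plus per-tier prices, it reports the cost, capacity, and file count per tier. The age thresholds, prices, and sample records are placeholder assumptions, not Data Profiler output.

  # Simplified tier roll-up: assign files to tiers by age, then report cost,
  # capacity, and file count per tier. Thresholds and $/GB/month prices are
  # placeholder assumptions.
  from collections import defaultdict

  TIERS = [                       # (name, max age in days, $/GB/month)
      ("flash", 7, 0.30),
      ("capacity-nas", 180, 0.08),
      ("object/cloud", float("inf"), 0.01),
  ]

  files = [                       # hypothetical profiler records
      {"size_gb": 2.0, "age_days": 1},
      {"size_gb": 50.0, "age_days": 30},
      {"size_gb": 400.0, "age_days": 900},
  ]

  summary = defaultdict(lambda: {"files": 0, "gb": 0.0, "cost": 0.0})
  for f in files:
      for name, max_age, price in TIERS:
          if f["age_days"] <= max_age:
              summary[name]["files"] += 1
              summary[name]["gb"] += f["size_gb"]
              summary[name]["cost"] += f["size_gb"] * price
              break

  for name, s in summary.items():
      print(f"{name}: {s['files']} files, {s['gb']:.0f} GB, ${s['cost']:.2f}/month")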


Enterprise Ready

DataSphere protects against data loss, and automates recovery.

With DataSphere you can be confident your data is safe and your business protected. DataSphere maintains high availability by keeping four replicated copies of metadata at all times. DataSphere software can be updated live without disrupting applications, and DataSphere servers recover gracefully from power interruptions.

Support – Primary Data provides 24/7 global support, connecting you directly with an engineer by phone or email.

Storage Planning Made Simple

With predictive analysis based on actual data activity, DataSphere automatically tracks the objectives describing the care, feeding, and cost of your data. DataSphere rebalances data across storage tiers to eliminate the cost of overprovisioning and the risk of oversubscription. Alerts and notifications let admins know when they need to deploy more storage of a given type, and DataSphere can recommend changes to objectives for better load balancing. When more performance or capacity is needed, IT can deploy new resources from any vendor in minutes instead of days or weeks. Once IT no longer has to overprovision to deliver application performance, significant savings follow.

Intelligent Objectives Describe

CARE            FEEDING                COST
Security        IOPS                   Per object
Durability      Latency                Per byte stored
Availability    Bandwidth              Per byte read
Sovereignty     Read and write         Per byte written
Changeability   Sustained and burst    Per read operation
Recoverability  Sequential and random  Per write operation
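
One way to picture an objective is as a small record of care, feeding, and cost targets that can be checked against the attributes of a storage catalog entry. The sketch below is an illustrative data structure only; the field names, values, and matching rule are assumptions, not DataSphere’s actual schema.

  # Illustrative objective record mirroring the care / feeding / cost columns
  # above; field names and values are assumptions for the example only.
  from dataclasses import dataclass

  @dataclass
  class Objective:
      # Care
      durability_nines: int        # e.g. 11 = 99.999999999% durability target
      availability_nines: int
      # Feeding
      min_iops: int
      max_latency_ms: float
      min_bandwidth_mbps: int
      # Cost
      max_cost_per_gb_month: float

  tier1_db = Objective(durability_nines=11, availability_nines=4,
                       min_iops=50_000, max_latency_ms=1.0,
                       min_bandwidth_mbps=2_000, max_cost_per_gb_month=0.40)

  def meets(objective, storage):
      """Check whether a storage catalog entry satisfies an objective."""
      return (storage["iops"] >= objective.min_iops
              and storage["latency_ms"] <= objective.max_latency_ms
              and storage["cost_per_gb_month"] <= objective.max_cost_per_gb_month)

  # Hypothetical catalog entry for a flash array.
  print(meets(tier1_db, {"iops": 80_000, "latency_ms": 0.5, "cost_per_gb_month": 0.35}))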
