DataSphere for Managed Service Providers


Information and communications technology (ICT) is generally viewed as the convergence of telecommunications with computers, networking, storage, and software. These elements are the building blocks of telecommunications services, and over the years, leading telecommunication companies (Telcos) have taken the opportunity to leverage their network, infrastructure and financial assets to offer enterprises comprehensive solutions beyond basic telecommunications. Many of these Managed Service Providers (MSPs) now see growth ahead in cloud services.

Staying in step with customer demands and IT trends has resulted in large-scale service providers operating delivery models that focus on a portfolio of business consulting, IT management, security control and Infrastructure as a Service. As with other utility services, consumption of communication services is usually metered, but no two customers are alike, so adoption does not follow a single template. To serve this variety, MSPs need the ability to:

  • Dynamically add or subtract resources based on customer demand
  • Tailor storage allocation to meet the unique needs of customer applications
  • Ensure customer SLAs are met automatically
  • Easily integrate new technologies to differentiate and grow with customers
  • Reduce infrastructure costs and grow margins

To balance profitability against maintaining technical leadership for customers, infrastructure-focused MSPs have long walked the line between strategic hardware investment and tactical customer demand. In other words, an MSP will not implement a new service or purchase new equipment unless there is committed customer demand. However, MSPs are compensated for services that meet specific criteria, defined in Service Level Agreements (SLAs). To meet these SLAs, new application deployments have traditionally been architected with more resources than needed so that SLAs hold over the lifetime of the application, paralleling the in-house enterprise IT approach to meeting application requirements. This overprovisioning is expensive, but a safe bet.

Products | Primary Data

The Questions MSPs Can't Afford to Leave Unanswered

As new, software-driven approaches to data management emerge, a number of questions arise that could finally give MSPs the flexibility they need to resolve the conflicting challenges of maintaining technology leadership while offering solutions that align with current customer demands. For example, what if resources could be dynamically added or subtracted based on customer demand or lack thereof? This has been solved in computing thanks to virtual machines that can easily be spun up or down on demand, yet storage still requires more overhead and must be architected, procured, installed, tested and deployed. Could the storage demand for an application’s data be tailored for the type of storage that meets its needs? Can the storage SLAs or objectives for an application’s data be changed over time as the value of the data drops compared to the cost of ownership? DataSphere makes these capabilities possible so MSPs can get the right data to the right place at the right time and create balance between customer needs and service capabilities.

DataSphere: Right Data, Right Place, Right Time

DataSphere is a metadata engine that separates an application’s logical view of its data from where it is physically stored. This breakthrough delivers unprecedented improvements in performance, efficiency and scalability. The DataSphere platform leverages the latest advancements in NFS 4.2 environments natively, or can be deployed with DataSphere Extended Services (DSX) in legacy, SMB (Windows) or Unix environments for easy management and to gain insight into how a customer’s application makes use of its data. With this knowledge, DataSphere can place, move and distribute data to different storage types or tiers without disrupting an application's access, even while the data is in-flight.
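The idea of a metadata engine separating the logical namespace from physical placement can be sketched as follows. This is an illustrative model only, not the DataSphere API; the class and method names are invented:

```python
# Illustrative sketch (not the DataSphere API): a metadata map separates an
# application's logical path from the physical location of its data, so data
# can be moved between storage tiers without changing what clients see.

class MetadataEngine:
    def __init__(self):
        self._placement = {}  # logical path -> (tier, physical location)

    def place(self, logical_path, tier, physical_location):
        self._placement[logical_path] = (tier, physical_location)

    def resolve(self, logical_path):
        """Applications resolve the same logical path before and after a move."""
        return self._placement[logical_path][1]

    def move(self, logical_path, new_tier, new_location):
        # Data migrates between tiers; the logical namespace is unchanged,
        # so the application's access is never disrupted.
        self._placement[logical_path] = (new_tier, new_location)

engine = MetadataEngine()
engine.place("/app/db/orders.dat", "flash", "array-a:/vol1/orders.dat")
engine.move("/app/db/orders.dat", "object", "s3://archive/orders.dat")
assert engine.resolve("/app/db/orders.dat") == "s3://archive/orders.dat"
```

The key property the sketch demonstrates is that the application-facing path never changes, even though the backing location does.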

Leveraging Intelligent Storage Awareness

Applications and storage have long been blind to each other’s capabilities and needs. The majority (if not nearly all) of today’s enterprise applications do not know the attributes of the storage where their data resides. Applications cannot tell if the storage is fast or slow, premium or low cost. This blindness forces Service Providers to overprovision performance or capacity to ensure their SLAs.

Conversely, storage does not know which data is most important to an application; it only knows what was recently accessed. This underscores the shortcomings of using caching tiers to fix performance. Placing recently accessed data in a caching tier does increase performance if that same data happens to be accessed again. However, caching tiers lack the intelligence needed to retain important data for mission-critical applications, which can cause seriously inconsistent performance or require more cache capacity as data grows.

DataSphere gathers metadata intelligence in real-time to understand how applications experience the latency, IOPS and bandwidth provided by different storage. It also collects telemetry on the data that applications access, such as which files are open, closed, the dates and times they are modified, as well as any other metadata. Using this information, Service Providers can tailor infrastructure to suit the needs of an end-customer’s application. This can be done by offering specific services with well-defined SLAs or objectives, and without committing entire storage deployments to a single use case.
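Objective-driven placement of this kind can be illustrated with a small sketch: given a catalog of storage tiers, pick the cheapest tier that still satisfies a file's stated objective. The tier names, latencies and prices below are invented for illustration, not vendor data:

```python
# Hypothetical illustration of objective-driven placement: choose the
# cheapest storage tier that still meets a file's latency objective.
# All names and numbers are assumptions, not DataSphere specifics.

TIERS = [  # (name, delivered latency in ms, cost per GB)
    ("all-flash", 1, 0.50),
    ("hybrid", 10, 0.15),
    ("object", 100, 0.02),
]

def cheapest_tier(required_latency_ms):
    """Cheapest tier whose delivered latency satisfies the objective."""
    candidates = [t for t in TIERS if t[1] <= required_latency_ms]
    return min(candidates, key=lambda t: t[2])[0]

# A latency-sensitive database file lands on flash; cold data that can
# tolerate high latency is consolidated onto cheap object storage.
assert cheapest_tier(1) == "all-flash"
assert cheapest_tier(500) == "object"
```

Because the objective is expressed per file rather than per array, a single storage deployment can serve many customer workloads at different service levels.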

Differentiating MSP Services

Innovative and cost-effective customer solutions are the core of an MSP’s business. Storage has been the least flexible component in the MSP data center for decades, but with DataSphere, storage becomes as simple to deploy and manage as virtual machines.

With DataSphere, a service catalog of storage choices can be maintained and offered to customers to increase choice and profitability. Rather than being limited to statically allocated services, the MSP can freely choose between vendors and technology types to deliver the most cost-effective service. With the dynamic scale-out architecture DataSphere enables, MSPs can deliver storage on demand and avoid costly overprovisioning, while also adopting the latest technologies easily and aggressively to stay competitive.

DataSphere offers file-level granularity for optimal flexibility, enabling MSPs to simplify and optimize their environments while meeting the diverse needs of their customers. With file-level granularity, Data Lifecycle Management can be supported across all workloads rather than being limited to a particular storage system.

With the dynamic DataSphere environment, MSPs can easily grow and shrink their deployments to meet the needs of even the most demanding customers. DataSphere is also capable of tiering out snapshots to another storage tier for quick recovery that avoids using capacity on premium storage tiers.

Reduce Infrastructure Costs and Grow Margins

At the end of the day, service providers are in the business of creating value and generating revenue that turns a profit. However, Telco-based service providers and other managed service providers (MSPs) are challenged by the onslaught of cloud computing and web service companies that took a “build it and they will come” approach and are now capturing their customers.

Using DataSphere, service providers can now view data demands at file-level granularity and use objectives to create tiered services across a spectrum of storage types. More importantly, they can now consolidate data from several applications with similar storage price-performance requirements. Having this option reduces the cost of overprovisioning and guarantees SLAs on a per-file or per-object basis.

For example, take four different applications, each with different amounts of data needing high, medium and low performance. Traditionally, each application’s data would all be placed on the same store, LUN or volume, such as a high-performance all-flash array. That means the small percentage of latency-sensitive data is serviced appropriately, while the remaining data underutilizes the premium storage on which it resides.


Figure 1 - Traditional Application Deployment


Figure 2 - Data and Storage Tiering with DataSphere

This is where a tiered, multi-tenant solution can combine the capabilities of different storage attributes, including cost, performance and reliability, across all the data needs. Storage utilization increases and application throughput is greater, reducing overall infrastructure expense and total cost of ownership. More importantly, the cost of goods decreases for improved MSP margins.
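A back-of-the-envelope comparison makes the economics concrete. The capacities, data-temperature mix and per-GB prices below are illustrative assumptions, not vendor figures:

```python
# Illustrative cost comparison (invented numbers): four applications,
# each with roughly 10% hot, 30% warm and 60% cold data.
COST = {"flash": 0.50, "hybrid": 0.15, "object": 0.02}  # assumed $/GB
apps_gb = [4000, 2000, 1000, 500]  # assumed capacity per application

# Traditional: each application's entire dataset sits on all-flash.
traditional = sum(gb * COST["flash"] for gb in apps_gb)

# Tiered: only the hot fraction stays on flash; warm and cold data are
# consolidated onto cheaper tiers shared across all four applications.
tiered = sum(
    gb * (0.10 * COST["flash"] + 0.30 * COST["hybrid"] + 0.60 * COST["object"])
    for gb in apps_gb
)

assert traditional == 3750.0        # $3,750 on all-flash
assert round(tiered, 1) == 802.5    # about $802 when tiered
```

Under these assumed numbers, tiering cuts the storage bill by roughly 80 percent while the latency-sensitive fraction still sits on flash.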

Invest as Your Customers Grow

DataSphere is a software platform that is vendor- and storage protocol-agnostic. Service Providers using DataSphere achieve instant ROI through more efficient use of existing resources. Compounding these savings, DataSphere makes it easy to realize the benefits of using heterogeneous resources from any storage vendor, and to integrate new technologies without the need to plan for major upgrades and manual migrations. This helps keep the business focused on meeting customers’ existing and future service needs without disruption.


Figure 3 - The DataSphere software platform is vendor- and protocol-agnostic

As applications consume increasing volumes and varieties of data, it seems unrealistic for companies to slow storage spending, yet DataSphere enables just that. By aligning data to the right resource for business needs, DataSphere reduces infrastructure needs, expands storage choice, and increases operational efficiency to dramatically reduce datacenter costs.

Increase Performance and Agility While Cutting Costs

DataSphere gives MSPs the ability to automate the placement and movement of data across all storage, including object storage. This makes it possible to accelerate performance by ensuring cold data is automatically moved off performant storage tiers, even as applications become more powerful and user and workload diversity increases.
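An automated tiering policy of this kind can be sketched in a few lines. This is a hedged illustration with invented names, not the DataSphere interface; the 30-day threshold is an arbitrary example:

```python
# Sketch of an automated cold-data policy (assumed names and threshold):
# files untouched for 30 days are demoted from the performance tier, so
# hot data keeps the fast storage to itself.
import time

COLD_AFTER = 30 * 24 * 3600  # seconds; assumed policy threshold

def demote_cold(files, now=None):
    """files: dict of path -> (tier, last_access_epoch). Returns new mapping."""
    now = time.time() if now is None else now
    result = {}
    for path, (tier, last_access) in files.items():
        if tier == "performance" and now - last_access > COLD_AFTER:
            tier = "archive"  # demoted; the logical path is unchanged
        result[path] = (tier, last_access)
    return result

now = 100 * 24 * 3600
files = {
    "/a/hot.db": ("performance", now - 3600),            # accessed an hour ago
    "/a/old.log": ("performance", now - 60 * 24 * 3600),  # idle for 60 days
}
assert demote_cold(files, now)["/a/old.log"][0] == "archive"
assert demote_cold(files, now)["/a/hot.db"][0] == "performance"
```

Run periodically, a policy like this keeps performant tiers reserved for active data as workload diversity grows.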

DataSphere can delay the need to purchase more capacity by freeing terabytes of capacity on existing storage, while seamlessly integrating traditional storage with object storage as an active archive. DataSphere also automates many core management tasks to enable IT to support other tasks. Whichever use case best meets your architecture and business needs, DataSphere can help you achieve dramatic cost savings through better use of hardware, simplified management, and a smaller footprint.

Calculate Your Savings

Use the Primary Data TCO Calculator on our web site to see how much DataSphere might save you.
