Primary Data Reveals DataSphere Platform, Offering Abstraction Layer For Storage Infrastructure

Source: Primary Data

Primary Data has taken the covers off its DataSphere data virtualization platform, which connects disparate storage systems, offers a way to migrate data between storage tiers with little effort, and manages everything from virtual machines to SQL databases through smart data administration policies.

DataSphere is a data virtualization platform, which means it connects to all of the different storage devices already present in the data center. Whether you're using a SAN or NAS on premises, or have already started putting some of your data in the cloud, DataSphere acts as a translator of sorts between the different storage devices. Wherever the storage lives, in the cloud or local to the data center, it can be presented in whatever form you wish. You can separate your data sets across multiple volumes, and each of those volumes can, behind the scenes, sit on a separate storage device.

DataSphere operates out of band, so it doesn't sit in the data path between clients and their data. It can be thought of almost like a hypervisor for storage: once a device is brought under DataSphere's management, that device is unaware of the extra layer between it and the clients. That layer of abstraction is what makes moving data between storage systems easy.

Because the presentation to the client is no longer tied directly to the storage, a network share, virtual disk, or database can be moved from one storage device to another without changing anything on the client side. As far as the VMware or Hyper-V host is concerned, the virtual disk is still stored on the same device; behind the scenes, the hardware actually holding that data may be swapped out for any number of reasons.
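To picture how this kind of indirection works, consider the minimal sketch below. It is not Primary Data's implementation or API, just a hypothetical metadata map in Python: clients always address the same logical path, while the backing store recorded for that path can change out of band.

```python
# Conceptual sketch only -- not DataSphere's actual code. A metadata layer maps
# a stable, client-facing logical path to whatever device currently holds the
# data, so the backing store can change without the client noticing.

class MetadataLayer:
    def __init__(self):
        # logical path -> name of the backing storage device (hypothetical names)
        self.placement = {}

    def provision(self, logical_path, backing_store):
        self.placement[logical_path] = backing_store

    def resolve(self, logical_path):
        # A VMware or Hyper-V host always asks for the same logical path;
        # only the metadata layer knows where the bytes actually live.
        return self.placement[logical_path]

    def migrate(self, logical_path, new_backing_store):
        # Data would be copied between devices out of band (not shown here),
        # then the map is updated. The client-facing path never changes.
        self.placement[logical_path] = new_backing_store


layer = MetadataLayer()
layer.provision("/datastore/vm01.vmdk", "all-flash-array")
print(layer.resolve("/datastore/vm01.vmdk"))   # all-flash-array

layer.migrate("/datastore/vm01.vmdk", "cloud-object-tier")
print(layer.resolve("/datastore/vm01.vmdk"))   # cloud-object-tier, same path
```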

In fact, one of DataSphere's key features is its policy-based management. Policies determine where your data lives, and they can go well beyond "use this SAN for VMs and the cloud for archived files." Because all of the storage is now managed by DataSphere, policies can be set on everything from I/O throughput and latency to the cost of the storage. With this approach, system administrators can work with application architects to identify the resources an application actually needs. Rather than restricting management to simply adding more space, policies offer a way to design a storage layout for the application in which, for example, the files in one directory are kept on the fastest storage device while all other directories sit on the cheapest.
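The sketch below illustrates the idea of objective-based placement in general terms; the tier names, performance figures, and costs are made up, and this is not DataSphere's actual policy engine. The point is that a policy can be expressed in terms of throughput, latency, and cost rather than by naming a specific device.

```python
# Hypothetical example of objective-based placement. All tiers and numbers are
# illustrative, not real products or prices.

TIERS = [
    # name, sustained IOPS, latency (ms), cost ($/GB/month)
    {"name": "nvme-flash", "iops": 500_000, "latency_ms": 0.2,  "cost_gb": 0.30},
    {"name": "hybrid-san", "iops": 50_000,  "latency_ms": 2.0,  "cost_gb": 0.10},
    {"name": "cloud-cold", "iops": 1_000,   "latency_ms": 50.0, "cost_gb": 0.01},
]

def place(min_iops=0, max_latency_ms=float("inf")):
    """Return the cheapest tier that still meets the stated objectives."""
    candidates = [t for t in TIERS
                  if t["iops"] >= min_iops and t["latency_ms"] <= max_latency_ms]
    return min(candidates, key=lambda t: t["cost_gb"])["name"] if candidates else None

# A hot database directory gets strict objectives; an archive directory just
# needs to live somewhere cheap.
print(place(min_iops=100_000, max_latency_ms=1.0))  # nvme-flash
print(place())                                      # cloud-cold
```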

Primary Data is demonstrating DataSphere at VMworld, so if you're going to be there, you'll definitely want to check it out for yourself.
