Hitachi Data Systems today announced a unique vision of virtualized data running on virtualized infrastructure along with its new integrated virtual storage platform, a hardware/software combination designed to make this vision a reality for very high-end, mission-critical environments.
Making the announcement, HDS CEO Jack Domme outlined the major challenges of the new environment in which IT operates:
- A highly siloed legacy infrastructure in which systems from different vendors do not interoperate.
- Huge growth in unstructured data, which is expanding much faster than IT budgets and needs to be managed dynamically without interrupting mission-critical processes.
- Huge growth in virtualized applications, many created in the cloud outside traditional IT channels, that need to access that data independently of the applications that created it.
- Frozen IT budgets that require customers to make do with what they have.
To meet these challenges, Mr. Domme presented a vision that, he said, HDS has been working toward for several years, along with the latest major step toward realizing it: the HDS Virtual Storage Platform.
The vision is virtualized data on a virtualized environment. The virtualized storage environment unifies the legacy silos and islands of storage and boosts utilization from the 15%-20% common in unvirtualized data centers to 80% or more, allowing customers to delay the new storage purchases driven by the explosion of unstructured data they are facing.
That by itself, however, is not enough, Domme said. HDS envisions virtualized data running on this virtualized storage. Virtualization frees data from the application that created it, making it available to any application in the environment, including the multitude of new thin apps being developed on virtualized servers in the cloud that companies want to use to get more value from that data. It also allows the HDS management software to move data objects from one tier to another based on how heavily they are used, invisibly to the applications consuming that data. Little-used data might move down tiers while remaining searchable and fully available to the applications in the virtualized environment.
It also means that data can remain available to new applications for decades into the future. He used the example of a blood test taken 30 years ago that might be needed for comparison with a new blood test from the same patient taken today. In the new HDS virtualized infrastructure that test may have migrated to the lowest tier in the system, but it is still available to the application in use today, even though the application that originally created the data object may have long since been replaced.
To accomplish this, he said, the new version of the HDS data management toolset has been expanded up, out, and deep to provide the capabilities it needs to manage data in the virtualized environment. This latest step does not realize the full vision, he admitted. Indeed, some of it is still the subject of bleeding-edge research at top universities such as Harvard and Stanford. However, he said, it is a major step forward in providing what HDS customers need.
Action Item: Hitachi's vision is unique and very different from that offered by EMC or Oracle. The question is how much of its promise is realized in its new Virtual Storage Platform, and in particular in its new data management capabilities, and how much remains marketing promise.