Archive for category EMC
I’d like to explore the topic of how system and storage architectures are changing and the impact this will have on application delivery and organizational productivity.
Allow me to put forth the following premise:
Today’s enterprise IT infrastructure limits application value.
What does that mean? To answer this, let's first explore the notion of value. The value IT brings to an organization flows directly from applications to the business and is measured in terms of organizational productivity. Infrastructure in and of itself delivers no direct value; however, the applications that run on that infrastructure directly affect business value. Value comes in many forms, but at the highest level it's about increasing revenue and/or cutting costs, and ultimately delivering bottom-line profits.
Flash competitors are aggressively jockeying for position as the market heats up. It's a tale of two styles. On the one hand, EMC's entrance into the all-flash array market targets traditional IT segments. It will pressure both competitive offerings and EMC's own high-end block storage business. EMC is positioning to cannibalize its own base before others cut too deeply into the EMC muscle, but it must walk a fine line. At the other end of the spectrum, Fusion-io is uniquely positioned to serve the hyperscale market and currently stands alone with a software-led strategy that leverages atomic writes and delivers new value to database workloads.
“The storage needs of business and application owners are simple: Give me storage when I need it. Provide services appropriate for my application in the most cost-effective manner. Charge me for what I use; don't charge me for unnecessary waste.”
Service-oriented storage has the potential to meet business needs by inherently offering the ability to:
- Provision storage capacity and function that meets application requirements based on performance, scalability, availability, cost and security needs of the business.
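To make the idea concrete, here is a minimal sketch of policy-based provisioning, matching an application's stated requirements to a service class instead of manually carving storage. The tier names, thresholds, and request fields are all illustrative inventions for this sketch, not any vendor's API:

```python
from dataclasses import dataclass

# Illustrative only: the request fields, tier names, and thresholds below
# are invented for this sketch, not drawn from any vendor's catalog.
@dataclass
class StorageRequest:
    iops: int          # performance requirement
    capacity_gb: int   # scalability requirement
    replicas: int      # availability requirement

def select_tier(req: StorageRequest) -> str:
    """Map application requirements to a storage service class by policy."""
    if req.iops > 10_000:
        return "flash"
    if req.replicas > 1:
        return "replicated-disk"
    return "capacity-disk"

# A high-performance request lands on the flash tier by policy.
print(select_tier(StorageRequest(iops=20_000, capacity_gb=500, replicas=2)))
```

The point of the sketch is the inversion of responsibility: the application owner states requirements, and the service decides placement.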
Last weekend, the Wall Street Journal published a report citing sources who claim Amazon's AWS business exceeded $2B in 2012 and will generate $3.8B in 2013, an 81% growth rate. The numbers are getting crazy. Some of the same sources, and others, have the AWS market (it's unclear what this means) hitting $38B by 2015 and AWS revenue reaching $20B by the end of the decade. The Journal article cited comments from Amazon CEO Jeff Bezos claiming that AWS can be at least as large as the company's retail business. By comparison, Amazon's retail operation is expected to grow 25% this year to $73.6B.
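The cited figures hang together arithmetically: if "exceeded $2B" means roughly $2.1B (an assumption on my part; the article doesn't give an exact 2012 number), then $3.8B for 2013 implies the 81% growth rate quoted:

```python
# Sanity check of the growth figures cited above. Assumption: 2012 AWS
# revenue of ~$2.1B, since the report only says it "exceeded $2B".
rev_2012 = 2.1e9
rev_2013 = 3.8e9  # projected 2013 revenue from the report
growth = (rev_2013 - rev_2012) / rev_2012
print(f"Implied growth rate: {growth:.0%}")  # → Implied growth rate: 81%
```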
EMC has been touting its “Cloud Meets Big Data” messaging for nearly two years now, and today it took a major step in transforming that message into reality.
EMC announced that it is forming a new “virtual organization” focused on Big Data and application development in the cloud. EMC is calling the new organization the Pivotal Initiative and it will include 800 employees from EMC’s Greenplum and Pivotal Labs divisions, and 600 employees from VMware’s vFabric, Cloud Foundry, GemFire, SpringSource and Cetas organizations. EMC owns over 80% of VMware, where former EMC COO Pat Gelsinger joined as CEO earlier this fall.
He who shall not be named sent me this thumbnail today. At any rate, the cat was already let out of the bag last month by Dave Raffo and several folks on Twitter but it looks like the ink is dry and there’s no turning back on VF Cache as the official name for Project Lightning.
What is Known About VF Cache?
The storage world is getting ready for the launch of EMC's Project Lightning. EMC has invited press, analysts and the world to an announcement on February 6th for the unveiling of the server-based flash product and a strategy to manage data using EMC's automated tiering software.
EMC's strategy with Project Lightning is to extend the storage stack closer to the server. For the past two decades, we've seen storage function steadily move from server/host to storage/SAN. EMC started this trend with its Symmetrix disk array, which initially connected to virtually all types of OSes and host processors. That vision extended to the SAN, and the external RAID/storage network concept became the standard architecture for storing, protecting and sharing mission-critical data.
When IT Consumers Become Technology Providers—A Vertically-Led Paradigm Shift Powered by the Cloud and Big Data
Until the conception of the World Wide Web and the commercialization of the Web browser in the mid-1990s, the IT industry was characterized by global monopolies that dominated the technology business. Despite the amazing growth trajectory of IT over the past sixty years, there have really been only two great monopolies in the history of this business: IBM and the virtual monopoly of Microsoft and Intel.
This week, SAS Institute unveiled a new analytics tool that it will offer in conjunction with data warehouse vendors Teradata and EMC-Greenplum. Called SAS High Performance Analytics, the tool will live inside the data warehouse, a technique known as in-database analytics that is becoming more and more popular in the era of Big Data.
By embedding scoring and modeling capabilities inside the database, in-database analytics allows users to run complex analytics against large data sets without having to transfer the data to a separate analytics or business intelligence application. Loading large volumes of data into an analytics platform can take hours or even days, and in some cases isn't even possible. As a result, users must often be content to analyze just sample sets of data, which can sometimes lead to inaccurate analysis.
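The data-movement argument can be illustrated with a minimal sketch. Here sqlite3 stands in for a real warehouse, and the table, columns, and model weights are all hypothetical; the idea is that a simple scoring model is pushed down as SQL, so only the scores leave the database rather than the full table:

```python
import sqlite3

# sqlite3 is a stand-in for a real data warehouse; the table, columns,
# and model weights below are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, recency REAL, spend REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, 0.2, 0.9), (2, 0.8, 0.1), (3, 0.5, 0.5)])

# Push the model into the database as SQL: a linear score is computed
# where the data lives, so only (id, score) rows cross the wire,
# not the whole customers table.
rows = conn.execute(
    "SELECT id, 0.6 * recency + 0.4 * spend AS score FROM customers"
).fetchall()
for cid, score in rows:
    print(cid, round(score, 2))
```

A real in-database analytics product pushes far richer models into the warehouse than this one-line linear score, but the economics are the same: compute moves to the data instead of data moving to the compute.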
Die-hard technologists often ignore the services side of the IT business, looking at it as a necessary evil, a cost-center threatening to get out of control, or too abstract to fully investigate. What exactly is systems integration again?
But a look at the financials of several leading OEMs shows that services now make up a significant part of their businesses:
| OEM | Percentage Revenue from Services* |
| --- | --- |