Last week I called my friend and colleague Stan Zaffos over at Gartner and asked him to come on theCUBE to talk about his new Magic Quadrant that I thought he’d published already. Stan said the report would actually be coming out that very day (the 21st) but he was headed into a meeting and would call me back to talk about it. I haven’t heard from him yet but this weekend I got a glimpse of the most recent “Gartner Magic Quadrant for General-Purpose Disk Arrays.”
Now I know the blogosphere (and Gartner’s competition) loves to trash the whole idea behind the Magic Quadrant, but I’m not here to do that. I can only imagine how miserable it must be to put one of these things together. Every vendor on the planet trying to beg, borrow, threaten, steal or cajole their way to the upper right-hand corner…having to check facts, read spec sheets, sit through vendor briefings, talk to customers, defend results, etc. Meanwhile your buy-side client is using your document as a blunt instrument to beat up a sell-side client on price because their dot is 5mm lower than the other guy’s. And a vendor’s sales rep is high-fiving you because you just threw holy water on their product. You want to hide; you’re constantly being interrupted by nonsense (that, admittedly, you caused) when you just want to get back to work.
OK – so I understand. This is why everyone rolls their eyes and wants to hammer the Magic Quadrant, but I have no contempt for the poor souls who have to write these things – only deep sympathy, especially in this case. I’ve known Stan Zaffos for more than twenty-five years (and Roger Cox – another co-author – for nearly ten) and I can tell you he takes great pride in his work. He’s passionate, smart, opinionated, knowledgeable, very detail-oriented and honest. Same goes for Roger. So I can only assume they have done their best to represent their informed views on this topic. The challenge in using this research is that Gartner has decided – for good reason, in my view – to consolidate a broad range of market segments into a single report.
Specifically, from what I can tell, this effort spans traditional Tier 1 (e.g. mainframe-class storage), traditional Tier 2 (e.g. midrange and high-end modular products) and both the low and high end of the NAS market space – so at least five market segments merged into one. In the report, the authors indicate their reasoning is that the lines between midrange and high end are blurring, and that a range of solutions (i.e. high-end, midrange, NAS and unified storage) is often allowed to compete for business during tech refreshes. This makes perfect sense to me, as the traditional Tier 1 players (IBM, HDS and EMC) are now being surrounded by 3PAR (HP), NetApp, Compellent (Dell), Isilon (EMC) and others, including Oracle (Exadata) and DDN.
So what you get is the research version of the virtualization I/O blender effect: it mashes all the products and companies into a big juicer, and what comes out has very little differentiation. Gartner, I believe, made an attempt to address this by weighting the markets a vendor plays in more heavily – essentially not punishing it for markets in which it doesn’t compete. This may, however, make the research even more difficult to interpret because, if I understand the methodology correctly, Gartner is penalizing a vendor for participating in a segment with a non-leading product but not penalizing a vendor that doesn’t play in that segment at all. It’s kind of like a college admissions process that doesn’t recognize that a student who gets a B+ in an AP class has accomplished more than a student receiving an A- in a non-AP course. Because of this new methodology, Gartner warns that comparisons with previous Magic Quadrants (which are more granular) are dubious. Of course, I did this anyway.
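To make the interpretive problem above concrete, here’s a hypothetical sketch (the segment names and scores are mine, not Gartner’s) of a scoring scheme that averages only the segments a vendor competes in. A weak product in a segment drags the average down, while skipping that segment entirely costs nothing – exactly the B+ in AP class versus A- in a regular class asymmetry:

```python
def weighted_score(segment_scores):
    """Average a vendor's scores across only the segments it competes in.

    segment_scores: dict mapping segment name -> score (0-100).
    Segments the vendor doesn't play in are simply omitted, not zeroed.
    """
    if not segment_scores:
        return 0.0
    return sum(segment_scores.values()) / len(segment_scores)

# Vendor A competes in three segments; its midrange product is non-leading.
vendor_a = {"high_end": 90, "midrange": 60, "nas": 85}
# Vendor B skips midrange entirely and is never penalized for the absence.
vendor_b = {"high_end": 90, "nas": 85}

print(round(weighted_score(vendor_a), 2))  # 78.33 -- dinged for the weak segment
print(weighted_score(vendor_b))            # 87.5  -- absence costs nothing
```

Under this kind of scheme the broader-portfolio vendor lands lower on the chart even though it arguably accomplished more by competing everywhere – which is the interpretation trap for practitioners.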
I compared the 2011 Magic Quadrant for Midrange and High End Disk Arrays with this new one and Gartner is right – it’s not useful. Back then, for example, HP was the #1 leading visionary and tucked in just behind EMC and NetApp on execution. Ostensibly HP led in vision because of the 3PAR acquisition. But 3PAR’s visionary mojo in the current analysis gets buried inside HP’s huge portfolio, which I’m sure was dinged in other areas that weren’t ‘blended’ in back in 2011. You could make a similar argument for IBM’s Storwize Vxxxx Series products – strong contenders that get diluted in the new methodology. Gartner actually calls out both 3PAR and Storwize as leading examples in its written analysis, but both HP’s and IBM’s positions are likely diluted by the consolidated approach. Are the positions of IBM and HP unfair? No, probably not. The issue, however, is that if a practitioner wants to move on either of these products, he or she can’t use the research as effectively as possible because it’s all munged together. One other note: while I’m giving product examples, product attributes – according to the methodology section of the research – account for well under half of the positioning of the vendors’ dots.
To be clear – I’m not saying Gartner is mistaken in doing this consolidation – I’m just trying to interpret the results and maybe even make some strong recommendations to my friends at Gartner as to how to make the research more relevant going forward. Presumably the thinking is that, over time, this method will become more useful as comparisons can be made to show movements and positional changes based on acquisitions and other activities. My advice to the kids at Gartner, however, is: don’t bother. Spending as much time as you must have expended on this document isn’t worth it; the effort could be applied elsewhere to add more value to your customers, as I’ll explain here in more detail.
What the Current MQ Says
I don’t have a public link to share but will update this blog when one becomes available – which will invariably happen when some unknown company makes the list as the 20th of 22 vendors and wants to pay Gartner for the right to link to the chart and collect leads. For now, here’s what the new Magic Quadrant looks like:
Update: Here’s the link to the latest Gartner Magic Quadrant.
A More Useful MQ
Here’s my take on this new view of the world:
First of all, it’s not new. We’re talking about external storage arrays, mainly comprising solutions based on spinning rust. Nobody cares. This market is mature, and since the acquisitions of 3PAR, Compellent and Isilon it’s pretty stable (read: boring).
Second – the market has become an oligopoly. Six companies control the chessboard. Market shares ebb and flow, but no one has been able to command a lead like Cisco’s in networking (for example). In fairness, and to EMC’s credit, it has held the #1 spot forever, but the whole game is changing and past is not prologue – a point which underscores the rear-view-mirror nature of this market definition.
Third – the whole market is moving toward convergence, software-led infrastructure, hyperscale, open source, Hadoop, NoSQL, server-side flash, flash hybrids, all-flash arrays, object stores, the cloud, analytics, automation and artificial intelligence. Yet with this current instantiation of the Magic Quadrant we’re trying to differentiate between a dozen or so companies’ external disk arrays. Forget it. The only thing that matters about this market is that it’s funding R&D for the next wave of innovation – which will come from both organic investment and acquisition.
I do think this blended model of a Magic Quadrant has utility if applied to a more forward-thinking definition. Start with the premise that the storage business is permanently changing toward a developer-centric, software-led market. Tectonic end-customer industry shifts are combining with new technologies – flash, open source programming models, analytics, much greater processing power and intelligence – to capitalize on an Internet of Things. Assume traditional arrays will not be the source of competitive advantage; rather, some new platform that can gain insights from data will be the lever.
I’d be fascinated to see a “Skating to the Puck” Magic Quadrant that evaluates vendors’ vision and ability to execute on developing solutions that align with the buzzwords I’ve thrown out above. I’m talking about a dramatically simplified, secure, protected, software-led, services-oriented platform approach to storage with a set of open APIs that intelligently brings together server-side and storage innovations, and can talk block or file through an object interface. In this world, all active data will reside in flash, and a single-level store will extend the programming model from memory through persistent storage to eliminate both the “horrible storage stack” and an endless array of stovepiped storage OSes and controllers.
I’ll get it started. The upper right-hand quadrant is empty because no one is delivering on this today. New players like Amazon, Fusion-io, Violin, Nimble, Cloudera, Nutanix and others (sorry to my friends who aren’t doing $25M in revenue – that’s what it takes to get into Gartner’s MQ) are prominently in the mix with the whales, who are all clumped in an indiscernible pack in the middle of the quadrant – poised to acquire, re-invent themselves and execute on a new vision to completely change the economics of infrastructure at scale. Think about what Jeff Hammerbacher brought to Facebook and Cloudera and apply that to the enterprise storage business. If you don’t know what I’m talking about, watch this video (skip 1 minute in).
This view completely redefines the storage business. No longer is storage a very resilient, very expensive container that sucks up all your budget; rather, it’s a flexible, open, cost-effective resource, surrounded by open source software, where the people and processes around that storage entity enable massive value creation. The infrastructure is software-led and delivers unprecedented scale, simplicity, efficiency and, most importantly, productivity – not just to IT but to the entire organization.
Now that would be “Magic.”