Posts Tagged Cloud Computing
This week there are two important enterprise technology conferences taking place. One – SAPPHIRE 2014 – will see an old guard enterprise tech giant attempt to show it is capable of adapting to a technology landscape increasingly dominated by the cloud and Big Data. The other – Hadoop Summit 2014 – will see dozens of start-ups born in this new world out to prove to cautious CIOs that their technologies and platforms are ready for enterprise-level workloads.
It’s an interesting juxtaposition. SAP is determined to join the ranks of the “cool” cloud and Big Data companies (Salesforce.com, Hortonworks, Amazon Web Services), while those cool companies are equally determined to join the “enterprise-grade” club dominated by IBM, Oracle and, yes, SAP.
IBM’s annual revenue last year dropped below $100 billion for the first time since 2010. The company’s fourth quarter results were particularly weak, coming in 5.5% below expectations. This was due in large part to IBM’s struggling hardware business, with revenue dropping a staggering 27%.
I’ve already laid out my predictions for Big Data in 2014, but I also wanted to let the Wikibon community know how my colleagues and I plan to cover Big Data in the year ahead. We’ve organized our research agenda into three major buckets.
Technology. Clearly the technologies and products that collectively make up Big Data – including Hadoop, NoSQL data stores, analytic databases, data visualization tools and more – are maturing at a rapid pace (much faster, for example, than relational databases did in the 1980s). Big Data is also applicable across industries, meaning these technologies are inevitably and increasingly intersecting with adjacent technology movements, namely the cloud, mobile computing and social media. As we have for the last several years, Wikibon will devote significant coverage to these developments with an eye toward putting technology innovations in context for enterprise Big Data practitioners (both technology practitioners and line-of-business practitioners).
Flash competitors are aggressively jockeying for position as the market heats up. It’s a tale of two styles. On one hand, EMC’s entrance into the all-flash array market targets traditional IT segments; it will pressure both competitive offerings and EMC’s own high-end block storage business. EMC is positioning to cannibalize its own base before others cut too deeply into the EMC muscle, but it must walk a fine line. At the other end of the spectrum, Fusion-io is uniquely positioned to serve the hyperscale market and currently stands alone with a software-led strategy that leverages atomic writes and delivers new value to database workloads.
“The storage needs of business and application owners are simple: Give me storage when I need it. Provide services appropriate for my application in the most cost-effective manner. Charge me for what I use; don’t charge me for unnecessary waste.”
Service-oriented storage has the potential to meet business needs by inherently offering the ability to:
- Provision storage capacity and functionality that meet application requirements based on the performance, scalability, availability, cost and security needs of the business.
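The provisioning model described above can be sketched as a small policy-driven routine: the application owner states requirements, and the service maps them to the cheapest tier that satisfies them, charging only for capacity actually requested. This is a minimal illustration, not any vendor's API; the tier names and prices below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StorageRequest:
    capacity_gb: int      # how much storage, when it is needed
    min_iops: int         # performance requirement
    availability: float   # e.g. 0.999 for "three nines"

# Hypothetical service tiers: (name, max IOPS, availability, $/GB-month),
# ordered cheapest first so the loop below picks the lowest-cost match.
TIERS = [
    ("archive",    100,   0.99,   0.01),
    ("standard",   5000,  0.999,  0.05),
    ("premium",    50000, 0.9999, 0.20),
]

def provision(req: StorageRequest) -> dict:
    """Pick the cheapest tier meeting the request; bill only requested capacity."""
    for name, iops, avail, price_per_gb in TIERS:
        if iops >= req.min_iops and avail >= req.availability:
            return {"tier": name,
                    "monthly_cost": round(req.capacity_gb * price_per_gb, 2)}
    raise ValueError("no tier satisfies the requested service level")
```

The point of the sketch is the inversion of responsibility: the business states needs (performance, availability, cost), and tier selection and chargeback fall out of policy rather than manual storage administration.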
EMC has been touting its “Cloud Meets Big Data” messaging for nearly two years now, and today it took a major step in transforming that message into reality.
EMC announced that it is forming a new “virtual organization” focused on Big Data and application development in the cloud. EMC is calling the new organization the Pivotal Initiative and it will include 800 employees from EMC’s Greenplum and Pivotal Labs divisions, and 600 employees from VMware’s vFabric, Cloud Foundry, GemFire, SpringSource and Cetas organizations. EMC owns over 80% of VMware, where former EMC COO Pat Gelsinger joined as CEO earlier this fall.
Last month we highlighted the big list of big data infographics. This month we’re focusing on cloud computing. Our “big list” of cloud computing infographics covers the business impact of the cloud, from small business to enterprise, and what the C-suite is thinking about with respect to public, private and hybrid cloud.
Let us know your feedback and whether we’re missing anything. You can click each image thumbnail to view the full infographic, and we’ve highlighted applicable third-party references as well.
DevOps – Art? Science? Hype? Conversations about DevOps tend to stay conceptual. How a company integrates the DevOps concept into its architecture is quite literally a blend of art, discipline and the science of technology, and where that line is drawn varies with a number of factors. At the intersection of development and operations sits a deeply embedded business agenda: an issue surfaces, a root cause analysis follows, and an answer or solution is produced. The solutions built to meet this need have coalesced into the DevOps movement. Through this automated, focused approach, the balance between process and technology can span the spectrum and ultimately shape the end state, ideally factoring in an organization’s agility, capabilities and ongoing goals. Organizations arrive at that state by necessity or by design, but either way a DevOps approach can improve effectiveness throughout and help ensure the best results.
Update: MapR and Google announced at Google I/O 2012 that MapR’s Hadoop distributions will be available on-demand via the new Google Compute Engine, validating Wikibon’s previous analysis. Pressure remains on Hortonworks, Cloudera and other Big Data vendors to shore up their cloud strategies.
For the company that invented MapReduce, Google didn’t have much of a presence in the commercial Big Data market until just last month (with the public release of BigQuery). While Yahoo! engineers took Google’s concept and spearheaded the open source Hadoop movement, Google was content to quietly develop its own Big Data platform for internal use.
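For readers unfamiliar with the programming model Google invented and Hadoop popularized, the canonical example is word count. The toy sketch below mimics the three MapReduce phases (map, shuffle, reduce) in a single process; real Hadoop or Google MapReduce distributes these phases across a cluster, which is the whole point of the model.

```python
from collections import defaultdict

def map_phase(docs):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in docs:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key (word)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["Cloud meets Big Data", "big data in the cloud"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

Because each map task sees only its own documents and each reduce task only its own keys, both phases parallelize trivially, which is why the pattern scales to the web-sized datasets Google built it for.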
One of the challenges to understanding cloud computing is that it’s not easy to visualize what the solution really looks like. Before heading to HP Discover, I had the opportunity to tour the SwitchNAP facility in Las Vegas. There are dozens of cloud solutions (including HP, EMC, Joyent, Nirvanix and VMware) hosted in the 407,000 square foot co-location facility, and there’s strong (e.g., guys with guns) physical security. Taking the tour is a geek’s paradise – it’s like a James Bond villain’s stronghold: employees dressed in black, metal desks, red and blue LED lighting, and the most technologically advanced data center that I’ve seen. Switch is not only a showcase for the scalable, dense and efficient power and cooling behind cloud solutions; it also offers the extra capabilities of a network buying consortium and the US Cloud inter-cloud exchange to enable a range of interesting cloud deployments.