Archive for category Cloud Computing
Recently, IBM announced a $1 billion initiative intended to improve the overall flash storage market and integrate flash storage into the company’s line of enterprise technology equipment, including servers, storage, and other products. The company believes that flash-based storage is at a tipping point in the marketplace and is poised to become much more widely used, thanks to the incredible performance gains the technology offers. Further, as is the case with any technology, as it approaches critical mass, its overall cost begins to drop, and this is certainly happening with flash storage. Flash-based storage also brings other significant cost benefits, such as reduced power consumption. At scale, such power savings can be real and substantial.
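To get a feel for what "real and substantial" power savings might look like at scale, here is a back-of-envelope sketch. Every figure in it — drive counts, per-drive wattage, and electricity price — is an illustrative assumption, not a number from IBM's announcement.

```python
# Back-of-envelope comparison of annual power cost for spinning disks
# vs. flash in a large array. All figures are illustrative assumptions,
# not vendor-published numbers.

DRIVE_COUNT = 10_000      # drives in a large deployment (assumption)
HDD_WATTS = 8.0           # typical enterprise HDD draw (assumption)
SSD_WATTS = 3.0           # typical enterprise SSD draw (assumption)
KWH_PRICE = 0.10          # dollars per kWh (assumption)
HOURS_PER_YEAR = 24 * 365

def annual_power_cost(watts_per_drive: float) -> float:
    """Annual electricity cost in dollars for DRIVE_COUNT drives."""
    kwh = watts_per_drive * DRIVE_COUNT * HOURS_PER_YEAR / 1000
    return kwh * KWH_PRICE

savings = annual_power_cost(HDD_WATTS) - annual_power_cost(SSD_WATTS)
print(f"Estimated annual power savings: ${savings:,.0f}")
```

Even with these modest assumed numbers, the drive-level delta compounds into tens of thousands of dollars a year — before counting the cooling load that the extra heat from spinning disks adds on top.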
Whether it’s considered a blessing or a curse, CIOs today have a multitude of options at their disposal when it comes to running workloads. In general, there are four options:
- On-premises – physical server.
- On-premises – virtual machine.
- Off-premises – hosted.
- Off-premises – cloud.
Over the past decade, the issue of whether to run on-premises workloads on physical hardware vs. virtual infrastructure has become pretty easy for organizations to assess, with the majority of new workloads being run inside virtual machines. That said, there are still a good number of applications deployed on physical hardware.
For many companies, the cloud remains an amorphous mystery that is only the stuff of speculation and conversation. For some companies, though, the cloud has become the business as they’ve embraced what the cloud can do for them.
One company recently turned its entire business model upside down, eliminating its physical product delivery service in favor of a cloud-based electronic service. This company, TrainSignal, provides computer-based training products for IT pros.
Disclaimer: I’ve done a lot of work for TrainSignal over the years, having created ten full-length video training courses.
“The storage needs of business and application owners are simple: Give me storage when I need it. Provide services appropriate for my application in the most cost-effective manner. Charge me for what I use, don’t charge me for unnecessary waste.”
Service-oriented storage has the potential to meet business needs by inherently offering the ability to:
- Provision storage capacity and function that meets application requirements based on performance, scalability, availability, cost and security needs of the business.
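The "charge me for what I use" model above can be sketched as simple GB-hour metering: sample consumed capacity at intervals and bill only for what was actually held. The tier names and rates below are hypothetical, purely to illustrate the chargeback idea.

```python
# Minimal sketch of usage-based storage chargeback: meter consumed
# capacity in GB-hours and bill only for actual usage. Tier names and
# rates are hypothetical.

RATES_PER_GB_HOUR = {
    "performance": 0.00020,   # e.g. a flash-backed tier (assumed rate)
    "capacity":    0.00005,   # e.g. a bulk SATA tier (assumed rate)
}

def monthly_charge(samples, tier, hours_per_sample=1):
    """Bill for metered usage. `samples` is the GB consumed at each
    metering interval, so freed or never-used capacity costs nothing."""
    gb_hours = sum(samples) * hours_per_sample
    return gb_hours * RATES_PER_GB_HOUR[tier]

# An app that grows from 100 GB to 300 GB over three hourly samples
# pays for 600 GB-hours, not for a pre-provisioned 300 GB all month.
print(monthly_charge([100, 200, 300], "performance"))
```

The contrast with traditional provisioning is the point: under allocation-based billing, the application owner would pay for the full 300 GB (or more, with headroom) from day one, which is exactly the "unnecessary waste" the quote complains about.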
Amazon’s aggressive push into the traditional enterprise space will place pressure on CIOs and enterprise IT suppliers alike. To relieve this pressure, CIOs must treat AWS as another tool in their bag, embrace the public cloud generally, and help their organizations understand the right strategic fit for public cloud services, balancing convenience with compliance. Meanwhile, technology suppliers must differentiate by focusing on best-of-breed services, industry-specific capabilities, and delivering business value within regions around the globe.
Last weekend, the Wall Street Journal published a report citing sources that claim Amazon’s AWS business exceeded $2B in 2012 and will generate $3.8B in 2013, an 81% growth rate. The numbers are getting crazy. Some of these same and other sources have the AWS market (unclear what this means) hitting $38B by 2015 and AWS revenue reaching $20B by the end of the decade. The Journal article cited comments from Amazon CEO Jeff Bezos claiming that AWS can be at least as large as the company’s retail business. By comparison, Amazon’s retail operation is expected to grow 25% this year to $73.6B.
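The cited growth figures hang together arithmetically, which is worth a quick check. The 2012 figure is only reported as "exceeded $2B"; a base of roughly $2.1B is what makes the 81% rate work out.

```python
# Sanity-checking the growth figures cited above. The 2012 AWS figure
# is reported only as "exceeded $2B"; ~$2.1B is consistent with the
# stated 81% growth rate.

aws_2012 = 2.1    # $B, approximate (report says "exceeded $2B")
aws_2013 = 3.8    # $B, projected

growth = (aws_2013 - aws_2012) / aws_2012
print(f"AWS growth 2012->2013: {growth:.0%}")

# Retail: 25% growth to $73.6B implies a prior-year base near $58.9B,
# still roughly 15x the size of the AWS business.
retail_2013 = 73.6
retail_2012 = retail_2013 / 1.25
print(f"Implied 2012 retail revenue: ${retail_2012:.2f}B")
```

That gap is what makes Bezos’s claim so striking: for AWS to rival retail, it would have to sustain growth rates like these for many years.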
EMC has been touting its “Cloud Meets Big Data” messaging for nearly two years now, and today it took a major step in transforming that message into reality.
EMC announced that it is forming a new “virtual organization” focused on Big Data and application development in the cloud. EMC is calling the new organization the Pivotal Initiative and it will include 800 employees from EMC’s Greenplum and Pivotal Labs divisions, and 600 employees from VMware’s vFabric, Cloud Foundry, GemFire, SpringSource and Cetas organizations. EMC owns over 80% of VMware, where former EMC COO Pat Gelsinger joined as CEO earlier this fall.
At its annual PASS Summit today, Microsoft announced that it will include in-memory OLTP capabilities in the next version of SQL Server, due out no earlier than 2014. Code-named Project Hekaton, the additional in-memory transactional-support capabilities will complement Microsoft’s existing in-memory analytics tools, namely Excel PowerPivot and its xVelocity line, as well as its new Hadoop-based platform HDInsight.
The announcement serves to further solidify in-memory data processing as a critical element of next-generation Big Data architectures. Other well-known enterprise technology vendors, including SAP and Oracle, have likewise embraced in-memory processing capabilities, as have lesser-known but emergent players like Aerospike, DataStax and Kognitio.
Last month we highlighted the big list of big data infographics. This month we’re focusing on cloud computing: our “big list” of cloud computing infographics covers the business impact, from small business to enterprise, and what the C-suite is thinking about with respect to public, private, and hybrid cloud.
Let us know your feedback and whether we’re missing anything. You can click each image thumbnail to view the full infographic, and we’ve highlighted applicable third-party references as well.
DevOps – Art? Science? Hype? Talking about these things can be quite conceptual. How we integrate the DevOps concept into the architecture of a company can quite literally be a blend of art, discipline, and the science of technology, and where that line is drawn varies, depending on a number of factors. At the intersection of development and operations sits a deeply embedded business agenda: an issue arises, a root cause analysis follows, and an answer or solution is produced. The solutions that have emerged to meet this need have coalesced into the DevOps movement. Through this automated, focused approach, the balance between process and technology can span the spectrum and eventually lead to the desired end state, ideally factoring in an organization’s agility, capabilities, and ongoing goals. Getting to that state can happen by necessity or by design, but either way, effectiveness can be reinforced throughout to ensure the best results.