Posts Tagged IBM
IBM’s annual revenue last year dropped below $100 billion for the first time since 2010. The company’s fourth quarter results were particularly weak, coming in 5.5% below expectations. This was due in large part to IBM’s struggling hardware business, with revenue dropping a staggering 27%.
Then-CEO Sam Palmisano launched IBM’s Smarter Planet initiative five years ago during a speech at the Council on Foreign Relations. IBM would focus its energies, Palmisano said, on helping governments and companies understand and analyze the voluminous data streaming off connected devices and industrial equipment to improve operational efficiencies and deliver better services to citizens and customers.
Since then, IBM has largely had the Industrial Internet, as the concept has come to be called, to itself. The company’s Smarter Planet division has played a key role in making IBM the biggest Big Data company on the planet and was a lone bright spot in IBM’s otherwise disappointing Q1 2013 results.
HP stated on a recent analyst call that its VirtualSystem best-of-breed integrated system is the “only real alternative to VCE” [Vblock]. While HP may have VCE in its competitive sights, all of the major storage vendors have been ramping up efforts in the converged infrastructure space.
While the number of virtual machines (VMs) that can be deployed on any infrastructure will vary by workload, and many other factors (such as energy efficiency, cost, vendor support, performance, and application compatibility) should be weighed when evaluating stacks, it is clear that not all stacks are geared for all environments.
We are in the middle of a data center boom, with tech companies all over the world competing to build bigger, better, and more efficient storage facilities. These data centers serve a variety of online needs, from storing Facebook pages to powering cloud technology. Take a glimpse into the innovative future of data storage with this list of data centers newly completed and still under construction worldwide.
Over a year ago, I posed the question, “Does 10Gb Ethernet change the competitive landscape?” Cisco has been the dominant player in networking; for over a decade, no competitor has captured even ten percent of the market. While Ethernet continues its march into new markets and new applications, the market landscape has definitely changed. Fresh off of VMworld, there is a buzz in the networking world around new opportunities and architectures.
The Big Trends
With VMworld beginning in Las Vegas this week, we are sure to hear all about new and innovative ways to expand your organization’s approach to cloud computing. Previewed at last year’s VMworld, “Project Horizon” is a cloud-based management service that aims to establish a user’s cloud identity. With Project Horizon and the seemingly thousands of other cloud projects underway, the demand for massive data centers is on the rise.

As that need continues to grow, the immense power data centers consume has become such an issue that metrics were created to measure their efficiency. One of these metrics is Power Usage Effectiveness (PUE), the ratio of the total power used by the facility to the power delivered to computing equipment. An ideal PUE is 1.0, which would mean the computing equipment is using all of the power coming into the facility. However, a PUE of 1.0 is very difficult to achieve because lighting, cooling, and other facility systems that are not computing devices also draw power. An additional way companies are trying to reduce cost and power consumption is by building modular data centers, an approach that adds capacity as it is needed in manageable, cost-effective increments. Below you will see five traditional data centers that use alternative energy as a power source, as well as a brief look at currently available modular data centers.
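The PUE ratio described above is simple arithmetic; a minimal sketch, using made-up wattage figures for a hypothetical facility, might look like:

```python
def pue(total_facility_power_kw: float, it_equipment_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by the power
    delivered to computing equipment. 1.0 is the (practically unreachable)
    ideal; lower is better."""
    return total_facility_power_kw / it_equipment_power_kw

# Hypothetical data center drawing 1,500 kW total, of which
# 1,000 kW reaches the IT equipment; the rest goes to cooling,
# lighting, and power distribution losses.
print(pue(1500.0, 1000.0))  # → 1.5
```

The gap between a facility's PUE and 1.0 is exactly the overhead the modular and alternative-energy designs below are trying to shrink.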
In a recent interview with InformationWeek, Microsoft CEO Steve Ballmer claimed that IBM and Oracle don’t understand Big Data. For Ballmer and Microsoft, Big Data doesn’t depend so much on the size of the data, but on the type of data being processed and analyzed.
Specifically, for a data processing and analytics project to qualify as Big Data, it must encompass not just internal corporate data, but also third-party data that resides outside the firewall, according to Ballmer. He said IBM and Oracle limit their Big Data approaches to internal data, and thus their offerings do not in fact qualify as Big Data by his definition.
While cloud may be the focus of marketing and press campaigns, VMware server virtualization is still one of the primary growth engines for enterprise data center environments today. CIOs have reaped benefits through consolidation and agility of server virtualization, but have had to deal with the ripple effects of how virtualization breaks storage (and networking). Last year, Wikibon took a close look at the integration journey that is required to allow VMware virtualization to continue its growth by creating higher performance storage solutions that can move into mission critical applications. Every storage vendor has a strong push into virtualization in general, and VMware specifically, and while it is a complex story as to who is “the best”, Wikibon did extensive research to peel back the onion on storage integration with VMware. We have posted the full results of the VMware Storage Integration research; this article and others will add some color to the report.
This week, SAS Institute unveiled a new analytics tool that it will offer in conjunction with data warehouse vendors Teradata and EMC-Greenplum. Called SAS High Performance Analytics, the tool will live inside the data warehouse, a technique known as in-database analytics that is becoming more and more popular in the era of Big Data.
By embedding scoring and modeling capabilities inside the database, in-database analytics allows users to run complex analytics against large data sets without having to transfer the data to a separate analytics or business intelligence application. Loading large volumes of data into an analytics platform can take hours or even days, and in some cases isn’t even possible. As a result, users must often be content to analyze just sample sets of data, which can sometimes lead to inaccurate analysis.
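To illustrate the general idea (not SAS’s actual product), here is a minimal sketch of in-database scoring using SQLite: a hypothetical linear scoring formula is pushed into the SQL query, so only the computed scores leave the database rather than the raw rows.

```python
import sqlite3

# Toy customer table; in a real warehouse this would be millions of rows,
# which is exactly why shipping them to an external tool is expensive.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, balance REAL, tenure REAL)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, 1200.0, 3.0), (2, 80.0, 0.5), (3, 560.0, 7.0)],
)

# In-database scoring: the model's (made-up) coefficients are embedded in
# the SQL itself, so the database engine does the work where the data lives.
scores = conn.execute(
    "SELECT id, 0.001 * balance + 0.1 * tenure AS score FROM customers"
).fetchall()
print(scores)  # one (id, score) pair per customer
```

Products like SAS High Performance Analytics take this much further, running full modeling algorithms inside massively parallel warehouses, but the principle is the same: move the computation to the data, not the data to the computation.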