Posts Tagged Google
Update: MapR and Google announced at Google I/O 2012 that MapR’s Hadoop distribution will be available on demand via the new Google Compute Engine, validating Wikibon’s previous analysis. Pressure remains on Hortonworks, Cloudera, and other Big Data vendors to shore up their cloud strategies.
For the company that invented MapReduce, Google didn’t have much of a presence in the commercial Big Data market until just last month, with the public release of BigQuery. While Yahoo! engineers took Google’s concept and spearheaded the open source Hadoop movement, Google was content to quietly develop its own Big Data platform for internal use.
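For readers unfamiliar with the concept, MapReduce splits a job into a map phase that emits key-value pairs and a reduce phase that aggregates them by key. Here is a minimal single-process sketch of that pattern using the classic word-count example; this is purely illustrative and bears no resemblance to Google’s distributed implementation:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["Big Data", "big data platforms"]
print(reduce_phase(map_phase(docs)))
# → {'big': 2, 'data': 2, 'platforms': 1}
```

In a real cluster, the map tasks run in parallel across many machines and the framework shuffles each key to the reducer responsible for it; the programming model, however, is just these two functions.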
We are in the middle of a data center boom, with tech companies all over the world competing to build bigger, better, and more efficient storage facilities. These data centers serve a variety of online needs, from storing Facebook pages to powering cloud services. Take a glimpse into the innovative future of data storage with this list of data centers, newly completed and still under construction, worldwide.
It’s been talked about for years. Even before iTunes revolutionized the music industry with its release in 2001, people were saying the future is in cloud computing.
Cloud computing is a general term for anything that involves delivering hosted services over the Internet. A cloud service has three distinct characteristics that differentiate it from traditional hosting: it is sold on demand, typically by the minute or the hour; it is elastic, so a user can have as much or as little of the service as they want at any given time; and the service is fully managed by the provider.
Here is a graphical look at the cloud computing landscape, from private and enterprise cloud providers (such as IBM, HP/3Par, and VMware) to public cloud services (like Google and Microsoft Azure) to the hybrid cloud (Verizon and the NRE Alliance of newScale, rPath, and Eucalyptus, for example).
The Wikibon community prides itself on its research. Our community’s primary goal has been to help technology professionals solve business problems by sharing IT advisory knowledge. We do this through regular Peer Incites, case studies, and community research.
Data centers touch all our lives. Businesses rely on data centers to house mission-critical information and run operational initiatives across the organization. Today’s largest data centers feature state-of-the-art technology, operations rooms spanning thousands of square meters, and capacity for billions of pieces of customer and business information. As demand for cloud services increases, these centers comprise tens, or sometimes hundreds, of thousands of servers and multi-petabyte storage systems, and they are increasingly situated in locations where cheap energy is plentiful.
In pictures, here is an inside look at ten of the world’s largest data centers.
How Google, Microsoft and Oracle are Driving Competition in the Storage Industry
What You Need to Know
There is a competitive battle brewing in the on-premise storage business, and it’s not between EMC and NetApp or EMC and IBM. It stems from a move by independent software vendors, specifically Microsoft and Oracle, to bundle more storage function into their application stacks, push storage function closer to the host, and commoditize the storage hardware layer. The move to integrate storage function into the application stack is real and in some cases can add substantial value to organizations. But there is a price to pay, and IT executives need to understand the strategies and implications for long-term success. Underpinning these trends is Google’s decade-long march toward simplification and cloud services, which is not only driving software vendors like Microsoft crazy; it’s also causing them to drive down perceived costs wherever possible and grab as much value in their stacks as they can.
Here’s the bottom line. IT execs have three choices:
Microsoft is now pulling out all the stops. It’s telling the clients I spoke with to think about maintenance expiring on Exchange 2003 (I think you can still buy an extended service plan through 2014 if you give up your third child) and that SAN is not a recommended configuration for Exchange 2010. Microsoft is telling customers to worry about complexity and warning that a SAN can be a single point of failure. The logic put forth is that if I lose a DAS device, I lose only part of my storage, whereas if my SAN goes out, all my data is inaccessible. Interesting logic, I thought.
Isaac Asimov, one of the greatest science fiction writers of all time, wrote a short story entitled “The Last Question” in 1956. It begins:
The last question was asked for the first time, half in jest, on May 21, 2061, at a time when humanity first stepped into the light. The question came about as a result of a five dollar bet over highballs, and it happened this way …