I’ve been communicating with a number of Wikibon colleagues this weekend regarding data center energy consumption in the U.S.
As a baseline I used the August 2007 EPA report, which cited U.S. data centers as consuming 61B kWh in 2006, about 1.5% of total U.S. electricity consumption, at an estimated operational cost of $4.4B. The same report projected that by 2011 this figure could reach 100B kWh at a cost of $7.4B.
I wanted to share that the consensus in those discussions is that, given the heat density projections for servers in particular, consumption could approach 112B kWh by 2011, which works out to roughly a 13% compound annual growth rate (CAGR) from the 2006 baseline. In short, the feeling is that the EPA figures are conservative.
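For anyone checking the math behind that figure (a rough sanity check on my part, not a number from the EPA report): compounding the 2006 baseline at 13% per year for five years gives 61B kWh x (1.13)^5 ≈ 61B x 1.84 ≈ 112B kWh in 2011, so the 112B and 13% figures are consistent with each other.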
So my question is this: is the community comfortable with the following statement?
According to Wikibon.org, power consumption in mid-to-large U.S. data centers is growing faster than projected by the EPA in 2007. We believe that power consumption in U.S. data centers will grow approximately 13% per year and will approach 112B kWh by 2011 at a cost of more than $8B U.S.
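As a rough cross-check on the cost side (this assumes the implied average rate from the 2006 EPA figures carries forward, which is an assumption on my part rather than a published projection): $4.4B divided by 61B kWh is about $0.072 per kWh, and 112B kWh x $0.072 ≈ $8.1B, which is where the "more than $8B" in the statement comes from.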