Slashdot reported that the US Environmental Protection Agency (EPA) delivered a study on Server and Data Center Energy Efficiency. Here are the 13-page summary and the full (133-page) report. A few highlights from the summary:

“The energy used by the nation’s servers and data centers is significant. It is estimated that this sector consumed about 61 billion kilowatt-hours (kWh) in 2006 (1.5 percent of total U.S. electricity consumption) for a total electricity cost of about $4.5 billion. This estimated level of electricity consumption is more than the electricity consumed by the nation’s color televisions and similar to the amount of electricity consumed by approximately 5.8 million average U.S. households (or about five percent of the total U.S. housing stock).”

“Under current efficiency trends, national energy consumption by servers and data centers could nearly double again in another five years (i.e., by 2011) to more than 100 billion kWh (Figure ES-1), representing a $7.4 billion annual electricity cost.”
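As a quick sanity check on those numbers (this is just my back-of-envelope arithmetic, not something from the report), the implied average electricity rate works out to roughly the same value for both the 2006 figures and the 2011 projection:

```python
# Back-of-envelope check of the EPA figures (my arithmetic, not the report's).
kwh_2006 = 61e9          # ~61 billion kWh consumed in 2006
cost_2006 = 4.5e9        # ~$4.5 billion electricity cost in 2006

kwh_2011 = 100e9         # >100 billion kWh projected for 2011
cost_2011 = 7.4e9        # ~$7.4 billion projected annual cost

rate_2006 = cost_2006 / kwh_2006   # ~$0.074 per kWh
rate_2011 = cost_2011 / kwh_2011   # ~$0.074 per kWh

print(f"Implied rate, 2006: ${rate_2006:.3f}/kWh")
print(f"Implied rate, 2011: ${rate_2011:.3f}/kWh")
```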

The EPA presents scenarios for energy use by servers & data centers in the US from 2007-2011. Based on increasing degrees of server consolidation, adoption of energy-efficient servers, and power management, the EPA predicts that the *annual* cost savings in 2005 dollars (i.e., excluding inflation) would be between $1.6 billion and $5.1 billion in the US. Yes, I know that $1.6B to $5.1B is a small figure when compared to what the total US IT market spends on energy. I need to think about this more. I know that the “you’re saving the planet” justification isn’t going to fly with everyone (although it would with my sister-in-law :-). I believe that the benefits of cleaner IT will come down to cost savings even with higher rates of IT usage. I just don’t have the data to back up this statement yet.
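To put that savings range in context (again, my own rough math, and it assumes the savings are measured against the projected 2011 electricity bill of roughly $7.4 billion, which the report may define differently):

```python
# Rough sizing of the EPA's projected savings (my assumption: savings are
# relative to the ~$7.4B projected 2011 electricity bill).
projected_2011_cost = 7.4e9                 # $7.4 billion/year (from the summary)
savings_low, savings_high = 1.6e9, 5.1e9    # $1.6B to $5.1B/year (2005 dollars)

low_pct = savings_low / projected_2011_cost * 100    # ~22%
high_pct = savings_high / projected_2011_cost * 100  # ~69%

print(f"Savings range: {low_pct:.0f}% to {high_pct:.0f}% of the projected 2011 bill")
```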

Some of you may have caught the news last week that IBM consolidated over 3,900 servers onto about 30 mainframes running Linux. The move is expected to reduce server footprint by 85% and cut costs by $250M over 5 years. Very cool that Linux & mainframes are being used to save $$$ and reduce the environmental impact of IT.
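A quick back-of-envelope on the IBM numbers (my arithmetic only; the exact workload mix and savings breakdown aren't public):

```python
# Rough consolidation math for the IBM announcement (my arithmetic only).
servers_before = 3900       # ~3,900 distributed servers
mainframes_after = 30       # ~30 mainframes running Linux
total_savings = 250e6       # ~$250M in expected savings
years = 5

ratio = servers_before / mainframes_after   # ~130 servers per mainframe
annual_savings = total_savings / years      # ~$50M per year

print(f"Consolidation ratio: ~{ratio:.0f}:1")
print(f"Annualized savings: ~${annual_savings/1e6:.0f}M/year")
```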

This is another step in IBM’s Project Big Green:

“The project, involving high-density computing systems that use server and storage virtualization, and energy-efficient power and cooling systems, is part of IBM’s goal to double its data-center capacity by 2010 without increasing energy usage or carbon emissions.”

It will be interesting to see how the server consolidation and virtualization trend impacts Linux adoption. On one hand, Linux on higher-end servers should be attractive to customers running Unix applications and seeking energy and cost savings. On the other hand, Linux has enjoyed its largest success on commodity servers that run at very low rates of utilization. Consolidating those commodity servers onto a larger, higher-density, more efficient server running at higher utilization shouldn’t change the number of Linux licenses much, since each consolidated workload typically still runs in its own Linux instance (now as a virtual machine). So maybe the net impact on Linux adoption would be minimal?

Thoughts?