By Dr. Jonathan Koomey, a project scientist at Lawrence Berkeley National Laboratory:
“For some reason, the power used by computers is a source of endless fascination to the public. Most folks think that the power used by computers is a lot more than it actually is, and that it’s growing at incredible rates. Neither one of these beliefs is true, but they reflect a stubborn sense that the economic importance of IT somehow must translate into a large amount of electricity use. That incorrect belief masks an important truth: Information technology has beneficial environmental effects that vastly outweigh the direct environmental impact of the electricity that it consumes.
Back in 1999, a cleverly written article was published in Forbes magazine, claiming that the Internet used 8% of all U.S. electricity, that all computers (including the Internet) used 13% of U.S. electricity, and that this total would grow to half of all electricity use in ten to twenty years. Most major U.S. newspapers and business magazines, many respected institutions, and politicians of both political parties cited these assertions (the first one even came up in Doonesbury at about the same time). Alas, most people took leave of their critical faculties when evaluating them.
Joe Romm, Amory Lovins, and I spent a few person-years of effort between us demonstrating in the scientific literature that these assertions were all false (for a compilation of that work, go here). The Internet, as defined by the Forbes authors, used less than 1% of U.S. electricity in 2000, all computers used about 3%, and there is no way, short of repealing the laws of arithmetic, for the total to grow to half of all electricity use in one to two decades (see the Epilogue in Koomey 2008 for a summary). Joe also showed that the high-level statistics on growth in energy, electricity, and carbon emissions in the Internet era all showed exactly the opposite of what the above claims would imply: the growth rates were significantly lower in the Internet era (1996 to 2000) than in the preceding four-year period, even though GDP growth was higher in the Internet era.
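To see how quickly arithmetic rules the projection out, here is a rough back-of-the-envelope check (my illustration, not from the cited papers; the 3% starting share is the figure quoted above):

```python
# Back-of-the-envelope check (illustrative assumptions, not from the cited papers):
# if all computers used ~3% of U.S. electricity in 2000, how fast would that
# share have to grow each year to reach 50% in 10 or 20 years?

start_share = 0.03   # computers' share of U.S. electricity in 2000 (~3%)
target_share = 0.50  # the Forbes claim: half of all electricity

for years in (10, 20):
    # the share must be multiplied by (target/start)^(1/years) each year
    required_growth = (target_share / start_share) ** (1 / years) - 1
    print(f"{years} years: share must grow {required_growth:.0%} per year")
    # → 10 years: share must grow 32% per year
    # → 20 years: share must grow 15% per year
```

Even the 20-year version requires the computing share of electricity to compound at roughly 15% per year, on top of any growth in total electricity use; nothing remotely like that appeared in the measured data.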
Unfortunately, variations of these myths persist to this day. In early 2009, the normally reliable Sunday Times of London reported that generating the electricity needed for a Google search emitted half as much carbon as did boiling a cup of tea, but this claim proved to be spurious (see Mills and Koomey 2009). As recently as April 12, 2010, Energy Tribune published an op-ed by Robert Bryce repeating the falsehoods in the Forbes article and confusing several important issues on this topic. And the ongoing concern over total electricity used by data centers continues to generate news coverage (for example, see this article in the Guardian), even though these facilities account for only about 1% of world electricity use and their efficiency is improving rapidly.
In my view, the really important story is that while computers use electricity, they are not a huge contributor to total electricity consumption, and although it’s a good idea to make computers energy efficient, it’s even more important to focus on the capabilities information technology (IT) enables for the broader society. Computers use a few percent of all electricity, but they can help us use the other 95+% of electricity (not to mention natural gas and oil) a whole lot more efficiently.
As an example of this latter point, consider downloading music versus buying it on a CD. A study that is now “in press” at the peer-reviewed Journal of Industrial Ecology showed that the worst case for downloads and the best case for physical CDs resulted in 40% lower emissions of greenhouse gases for downloads when you factor in all parts of the product lifecycle (Weber et al. 2009). When comparing the best case for downloads to the best case for physical CDs, the emissions reductions are 80%. Other studies have found similar results (see Turk et al. 2003, Sivaraman et al. 2007, Gard and Keoleian 2002, and Zurkirch and Reichart 2000). In general, moving bits is environmentally preferable to moving atoms, and whether it’s dematerialization (replacing materials with information) or reduced transportation (from not having to move materials or people, because of electronic data transfers or telepresence), IT is a game changer.
Another area where IT can help us is in getting smarter and more capable, so we can use our resources more efficiently. This could take the form of better sensors and controls in buildings and industry, like the wireless sensor networks that can be quickly and cheaply distributed in existing structures without wiring. Or it could involve more widespread use of software to make better energy-related decisions, such as Lawrence Berkeley National Laboratory’s Home Energy Saver or the private sector tool called Wattbot, both of which I’ve worked on over the years. Or it could involve computer controls in automobile engines, which reduce criteria pollutant emissions and improve fuel economy at the same time. Or it might mean smart meters that track electricity use minute by minute. Or it might involve the various companies that scan utility bills for big corporations and “roll up” those bills into analysis software that gives companies visibility into their actual energy costs (see, for example, AdvantageIQ). All of these examples and more are enabled by cheap, abundant, and powerful information technology.
And there is good reason to believe that trends in information technology are going to make these positive developments even more pervasive and important. We’re all familiar with Moore’s law, which describes the rate of change in transistors per chip over time (doubling every year from the mid-1960s to the mid-1970s, and doubling every two years since the mid-1970s), with correspondingly rapid reductions in costs per transistor. However, few people are aware that there’s a similarly regular trend in the electrical efficiency of computers, one that has persisted for two decades longer than Moore’s law and applies to all electronic information technology, not just microprocessors. The electrical efficiency of computation, defined as the number of computations we can do per kilowatt-hour consumed, has doubled roughly every year and a half since the mid-1940s (see Koomey et al. 2009, below).
This trend has important implications for mobile computing technologies because these devices are constrained by battery storage. The power needed to perform a task requiring a fixed number of computations will fall by half every 1.5 years, enabling mobile devices performing such tasks to become smaller and less power-consuming, and making many more mobile computing applications feasible. Alternatively, the performance of mobile devices could continue to double every 1.5 years while maintaining the same battery life (assuming battery capacity doesn’t improve). Some applications (like laptop computers) will likely tend towards the latter scenario, while others (like mobile sensors) will take advantage of increased efficiency to become less power-hungry and more ubiquitous.
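The two scenarios above are the same doubling curve read in opposite directions; a minimal sketch (my illustration, assuming an idealized 1.5-year doubling period):

```python
# Illustrative sketch of the efficiency trend described above:
# computations per kWh double roughly every 1.5 years.

DOUBLING_PERIOD_YEARS = 1.5

def efficiency_multiplier(years: float) -> float:
    """How many times more computations per kWh after `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

years = 6.0  # four doubling periods

# Scenario 1: fixed workload -> energy per task falls as efficiency rises.
energy_fraction = 1 / efficiency_multiplier(years)
print(f"After {years} years, a fixed task needs {energy_fraction:.0%} of the energy")
# → After 6.0 years, a fixed task needs 6% of the energy

# Scenario 2: fixed battery -> performance scales with efficiency.
performance_gain = efficiency_multiplier(years)
print(f"...or, at constant battery life, {performance_gain:.0f}x the performance")
# → ...or, at constant battery life, 16x the performance
```

Six years is four doubling periods, so a device can do the same work on one sixteenth of the energy, or sixteen times the work on the same battery, or anything in between.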
IT is one technology that should give us hope about meeting an aggressive warming limit of 2 degrees C (or less) from preindustrial times. Never before has society had to confront a challenge like this, but never before have we had such a powerful technology moving so rapidly in the right direction. And if we combine ubiquitous mobile computing with rapid advances in solar photovoltaic technologies (the Big Belly trash compactor, for example), the possibilities for truly game-changing societal innovation are breathtaking.
Of course, this story is as much about personal and institutional change as it is about technology, and without a focus on the human and organizational evolution (as well as a stiff price on carbon) we’ll continue on our currently unsustainable path. But one important piece of facing the climate challenge is falling rapidly into place: Information technology allows us to dematerialize, reduce transportation emissions, and get smarter faster. There’s no time to waste in putting it to work.”