Worldly Philosopher: The Digital Revolution's Energy Costs

This week's Worldly Philosopher, Gregor Semieniuk, writes on the trade-off between increased computing power and climate change.


Gregor Semieniuk

Many commentators believe that exponential increases in computing power will lead to tremendous improvements in human welfare - at almost no cost per additional unit, or "marginal" cost.

Erik Brynjolfsson and Andrew McAfee (BM hereafter) express this view in their new book, "The Second Machine Age," and give examples of new technologies made possible by recent and ongoing advances in information technology: self-driving cars, real-time translation software, and smart robots that can be taught new movement routines by guiding their arms rather than by writing new software.

While these innovations are truly breathtaking in their technological sophistication, the authors are wrong to assert that these products and services come at "almost zero marginal cost of reproduction" (BM p. 62). Information technology (IT) - and the "information economy" it fuels - is not energy-neutral. Rather, its energy needs are quite costly, coming from fossil fuels that emit greenhouse gases and continue to supply 80% of the world's energy.

Technologies' use of energy is also costly through its contribution to climate change, which is now widely agreed to have adverse consequences for human welfare (IPCC 2014; see also Tony Bonen's blog post on this site; Duncan Foley (2013) examines IT's growth trajectory from a classical political economy perspective). Economists and policy makers need to re-examine the claim that life-improving digital technologies are cost-free after their initial development costs.

Figure 1

"Moore's law" predicts the growth of IT by postulating that computing power becomes cheaper at a constant rate, thanks to ever more sophisticated chip production techniques. Specifically, it says that a dollar buys twice the chip power every two years. The "law" has held up fairly well over the past several decades (BM, p. 41). Figure 1 shows what a doubling of available chip power every two years implies: in year twenty, one can purchase a thousand times more computing power than in year zero for the same cost (after forty years the factor is already a million). This exponential cost reduction in computing power, unseen in any non-digital machinery, underlies recent technologies that seemed too expensive to realize only a few years ago.
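The arithmetic behind Figure 1 is easy to reproduce. Here is a minimal Python sketch of the doubling rule (the function name is my own, purely illustrative):

```python
# Moore's law as stated above: chip power per dollar doubles every
# two years, so growth over any horizon is a power of two.
def chip_power_multiple(years, doubling_period=2):
    """How many times more chip power a dollar buys after `years` years."""
    return 2 ** (years / doubling_period)

print(chip_power_multiple(20))  # 1024.0 -- about a thousand-fold
print(chip_power_multiple(40))  # 1048576.0 -- about a million-fold
```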

But computer chips require electricity: for their manufacture, for their operation, and for the cooling systems that deal with the heat computers generate. As more and more chips are used, energy consumption rises. The electricity consumed by data centers gives us one way to measure the electricity use of digital technologies. These "server farms" are the pivot of today's internet traffic, cloud storage, and every other digital technology that connects to the web, e.g. the real-time translation app or the self-driving car that accesses data about traffic jams online.

Figure 2

Figure 2 uses Jonathan Koomey's (2011) estimates of worldwide data center electricity consumption, measured in billions of kilowatt hours, based on server sales and servers' energy efficiency. In the figure, I take the number halfway between his "upper" and "lower" bound estimates for 2010 consumption as a data point and extrapolate the trend through 2015 by fitting a line to the logarithm of the data and exponentiating it.
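The fitting procedure just described can be sketched in a few lines of Python. The consumption figures below are placeholders for illustration only, not Koomey's published estimates; the method (fit a line to the logarithm, then exponentiate) is the one used for Figure 2:

```python
import numpy as np

# Placeholder data: worldwide data-center electricity use in billions
# of kWh. These are NOT Koomey's numbers -- substitute his estimates.
years = np.array([2000.0, 2005.0, 2010.0])
bkwh = np.array([70.0, 150.0, 235.0])

# Fit a line to the logarithm of consumption, then exponentiate.
slope, intercept = np.polyfit(years, np.log(bkwh), 1)
growth_rate = np.exp(slope) - 1  # implied average annual growth rate

# Extrapolate the fitted trend to 2015.
forecast_2015 = np.exp(intercept + slope * 2015)
print(f"annual growth: {growth_rate:.1%}, "
      f"2015 extrapolation: {forecast_2015:.0f} billion kWh")
```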

The average annual electricity use growth rate according to the fit is 3.2%. According to Koomey's estimates, data centers used an estimated 1.3% of worldwide electricity in 2010, up from only 0.5% in 2000. Hence, in the first decade of the millennium, which saw rapid advances in computing power, digital technologies increased their share of the also-growing worldwide electricity consumption by an estimated 160%.
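The 160% figure follows directly from the two shares:

```python
# Data centers' share of worldwide electricity use (Koomey 2011).
share_2000, share_2010 = 0.5, 1.3  # percent

# Relative increase in the share over the decade.
increase = (share_2010 - share_2000) / share_2000
print(f"{increase:.0%}")  # prints "160%"
```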

BM explain that we have seen only the faint beginnings of digital technologies' capabilities, because computing power will continue to grow at constant rates, increasing its power to drive technologies by unheard-of absolute quantities every year. But the electricity data tell us this is likely to trigger significant increases in energy use as well, as the extrapolation of the fit suggests.

Naturally, one may hope that increased efficiency of electricity use in data centers will keep overall energy use low, as will more efficient hardware manufacturing processes. However, this is a hope, not a certainty, for any input efficiency gain must grapple with an economic problem known as Jevons' Paradox.

William Stanley Jevons observed that in 19th-century England, more energy-efficient production processes using coal incentivized greater coal consumption (Alcott et al. 2012). What appears to be a paradox reflects a simple mechanism: using less coal per unit of output made production cheaper, the quantity of goods produced therefore increased, and the expanded output more than offset the coal savings per unit of production.

The modern version of the paradox is called the rebound effect (Binswanger 2001). There are disputes over the magnitude of the effect, that is, over how much the use of energy-consuming machines expands as each machine's energy efficiency increases. But it is clear that efficiency gains do not translate easily into overall energy savings, especially in the case of rapidly expanding digital technologies.
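A purely hypothetical numerical illustration of the rebound logic: suppose efficiency doubles (halving energy per unit of output) but cheaper output spurs demand to triple. Total energy use then rises despite the efficiency gain:

```python
# Hypothetical numbers, chosen only to illustrate the rebound effect.
energy_per_unit_before, units_before = 1.0, 100  # baseline
energy_per_unit_after, units_after = 0.5, 300    # 2x efficiency, 3x output

total_before = energy_per_unit_before * units_before  # 100.0
total_after = energy_per_unit_after * units_after     # 150.0
print(total_after > total_before)  # True: energy use rose anyway
```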

So BM's unfettered optimism about digital technologies costlessly improving our material living standards needs to address the problem of energy use. (I have focused here only on server farms, but the development of computing-intensive technologies, like smart robots, will also mean increased use of decentralized computers that need electricity to function.)

We need to pay more attention to digital technologies' energy use, and put greater emphasis on reducing their fossil energy consumption. Human society needs to grapple with climate change. The information technology sector already accounted for 2% of global carbon emissions in 2008 (The Climate Group 2008), and if Moore's law continues to hold, this share could rise significantly in coming years. Economics teaches us that usually nothing is cost-free, and this applies to the growth of digital technology as well.

Literature mentioned:

Alcott, B., M. Giampietro, K. Mayumi and J. Polimeni. 2012. The Jevons Paradox and the Myth of Resource Efficiency Improvements, London: Earthscan.

Binswanger, M. 2001. Technological progress and sustainable development: what about the rebound effect? Ecological Economics 36. 119–132.

Bonen, A. 2014. The Social Cost of Carbon as a Lower Bound for Policy. Blog post, Accessible on this site at: http://www.economicpolicyresearch.org/research/worldly-philosopher-the-social-cost-of-carbon-as-a-lower-bound-for-policy

Brynjolfsson, E. and A. McAfee. 2014. The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. New York: W.W. Norton & Co.

Climate Group. 2008. SMART 2020: Enabling the low carbon economy in the information age. The Climate Group. Accessible at: http://www.smart2020.org/publications/

Foley, D. K. 2013. Rethinking Financial Capitalism and the "Information" Economy. David Gordon Memorial Lecture at the ASSA meetings, San Diego. Accessible on the SCEPA website at: http://www.economicpolicyresearch.org/s/Session_3_-_Foley_Gordon_Lecture.pdf

IPCC. 2014. Mitigation of Climate Change: Contribution of Working Group III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge UK and New York, NY: Cambridge University Press.

Koomey J. 2011. Growth in Data Center Electricity Use 2005 to 2010. Analytics Press, completed at the request of The New York Times. Accessible at http://www.analyticspress.com/datacenters.html
