When you first step into a big data center, such as the one housed in Building 513 at CERN—the research institution that is the home of the Large Hadron Collider, and birthplace of the World-Wide Web—the impact can be overwhelming. It takes approximately 110,000 processor cores, the guts of the operation, spread across some 10,000 servers to handle the vast amount of data that comes in from the billions of collisions of hydrogen ions inside the collider. About one petabyte of data is processed each day, or the equivalent of about 210,000 DVDs, and the collective noise generated by all these machines whirring away in the “Centre de Calcul” to digest their slice of the pie is deafening; one has to shout to be heard. The heat the cores generate is similarly vast: An architect has drawings on the books to recycle this waste heat and use it to warm nearby buildings in the depths of the Swiss winter. (And on a related note, the collider’s computerized data is stored for the long term on old-fashioned magnetic tape, because tape is inexpensive, compact, durable—and most important, uses little electricity. The collider itself shuts down for maintenance for two weeks at the end of each year, in part because that is when electrical demand and electricity rates are highest.) Partly because of such considerations, some researchers in academia are making efforts at “zero-power computing.”
Unfortunately, such environmental concerns seem to be the exception, not the rule, in the world of computing.
Places such as Google, Amazon, Netflix, and Microsoft have facilities much larger than CERN’s, which they use for taking on even bigger volumes of high-throughput computing jobs. These centers form the backbone of the Cloud—the remote memory and other computing services accessed by the Internet. Data centers contain the machines that do the actual work behind such activities as shopping online, booking the world’s hotel and automobile reservations, transmitting movies down fiber optic lines, emailing, storing photos, putting up people’s billions of Facebook posts … and such economically and environmentally dubious activities as “mining” Bitcoins.
The trouble is, all the energy needed to run those centers has to come from somewhere—and it is often generated by burning coal or other fossil fuels that contribute to global warming. What’s more, the waste heat is often simply dumped; instead of capturing the heat its server farms produce, for example, Microsoft’s latest plan is to simply place them undersea, where their heat will dissipate into the surrounding water—as if the oceans didn’t have enough problems already with warming.
The computer industry has been very successful at running under the radar when it comes to energy and environmental oversight, making it a significant contributor to global warming. Some figures suggest that the carbon footprint of Internet activity well exceeds that of air travel, and yet we hear a lot about the latter and almost nothing about the former. It seems that much of our shiny, white, iPad economy—promoted so heavily as environmentally friendly—actually runs on dirty black coal.
And that’s not the only environmental problem in computing. There are several others at the moment, with a massive new one rapidly emerging on the horizon.
The underside of online. There are at least four environmental aspects of the online economy that deserve serious scrutiny from consumers, producers, and regulators alike.
The first and most obvious is energy use. According to a 2011 study by Greenpeace, data centers alone account for almost 2 percent of all global electricity use, and this use is projected to increase by at least 12 percent annually. In the 2014 update to this report, Greenpeace noted that if the Cloud were a country, it would rank sixth overall in national energy consumption, behind the United States, China, Russia, India, and Japan—but well ahead of Germany, Canada, Brazil, and France. And while it is true that some data centers in the United States are moving towards more renewable sources of energy, as global Internet usage increases, the reliance on traditional forms of power—particularly coal—continues. Even in the United States, backup diesel generators are frequently used, because of the emphasis on data centers that can be continuously “up” and running round the clock. In his insightful 2012 series in the New York Times about the social and environmental impacts of rural data centers, James Glanz noted that in 2008, the San Francisco Bay Area Air Quality Management District had described Microsoft’s Santa Clara data center—in the hub of Silicon Valley—as one of the largest stationary diesel polluters in the region.
And these figures on data center energy use do not include the energy required to produce the digital devices that make these data centers necessary in the first place. According to a 2004 study by the United Nations, the energy required to manufacture computing devices is unusually high relative to comparable household technologies, which means that although the cost of using such devices is low, their overall annual energy burden is significant. The energy burden of a desktop computer, for example, is 1.3 times that of a refrigerator. The total amount of fossil fuels used to make one desktop computer weighs over 240 kilograms, or about 530 pounds. The amount of water required (1,500 kilograms, or about 3,300 pounds) is even more significant.
Which brings up a second environmental concern: water consumption. Where energy is used, heat is created, and in a large, dense data center, the availability of water can be even more of a limiting factor than electricity rates. Cooling even a medium-size high-density server farm can require as much as 360,000 gallons of water (about 1.36 million liters) per day. The National Security Agency’s new Intelligence Community Comprehensive National Cybersecurity Initiative Data Center in Bluffdale, Utah, is estimated to consume 1.7 million gallons (about 6.4 million liters) every day. In many municipalities, the demand for fresh water for data centers competes directly with agriculture, industry, and the household needs of citizens, as is the case in rural Washington state. And, of course, in global terms water is an increasingly scarce and contested resource. We might conceivably solve the energy crisis, but the water shortage isn’t going anywhere.
A third concern about the environmental impacts of computing has to do with the mining of the materials needed for the global digital economy. This includes lithium mining in South America (environmentally neutral but politically fraught), tin mining in Indonesia (devastating to humans, plants, and animals), and cobalt mining in the Democratic Republic of Congo (often performed by slave labor in conditions morally, politically, and environmentally repugnant). Many of the rare earth elements crucial to both high-tech mobile devices and renewable energy technologies are controlled almost entirely by China, and are found only as by-products of traditional extractive industries. While the environmental impacts of such supply chains might be invisible to most Western consumers, they are nevertheless a critical and inescapable consequence of the digital economy.
Finally, there is the problem of e-waste, or the discarding of appliances that use electricity. Western nations alone account for more than 25 million tons of e-waste every year, and the disposal of this waste—often in less environmentally regulated regions of the developing world—introduces into the environment such dangerous contaminants as lead, antimony, mercury, cadmium, and nickel, as well as polybrominated diphenyl ethers and polychlorinated biphenyls (PCBs). In places like the “computer graveyards” outside of Agbogbloshie, Ghana, e-waste is burned over open flames to recover valuable trace minerals such as gold and copper, releasing in the process numerous human and environmental toxins.
While e-waste is often portrayed as an exclusively developing-world crisis, its insidious and often invisible influence is pervasive, even in the United States. Many Americans are unaware, for example, that the single largest concentration of Superfund sites (locations designated by the Environmental Protection Agency as particularly polluted and in need of immediate remediation) is located in Silicon Valley. In the roughly 10 by 40 mile strip of land that comprises Santa Clara County, California, there are 23 Superfund sites; most of them are contaminated by the byproducts of semiconductor manufacturing, including such highly toxic chemicals as trichloroethylene, Freon, and PCBs. Among other dangers, these chemicals have been linked to elevated rates of miscarriages, birth defects, and cancer. So far, more than $200 million has been spent on cleaning up soil and ground-water pollution in the area, and the extent of the problem is only just starting to be addressed. Most of the well-educated and well-paid engineers and scientists who live in the area are unaware of the environmental dangers posed by their seemingly “clean” post-industrial information economy.
And to make matters worse, some bold new computing technologies, such as virtual currency, were deliberately set up in such a way as to be as wasteful as possible—witness the “mining” of Bitcoins.
An infinite sink of real resources. For those who do not regularly follow such things, Bitcoins are a computerized form of currency that offer their users the promise of a decentralized network of exchange that is low-cost but reliable, and largely free from government oversight and intervention.
They rely upon an exciting new technology known as a “blockchain”—in technical terms, a distributed virtual ledger that can be used to record transactions. It may sound arcane, but blockchains and Bitcoins were the enthusiastic, optimistic talk of the global elite at the World Economic Forum meeting in Davos, Switzerland, this past January, far exceeding any grim discussions about financial crises or the need for regulatory reforms.
To many observers in the computing community, however, the hype surrounding the blockchain is confusing, if not downright inexplicable. After all, the only current widespread application of blockchain technology for financial purposes is for the Bitcoin virtual currency—and the results of the Bitcoin experiment are, at best, extremely mixed. Although the value of Bitcoin on the speculative markets at one point did reach a shockingly high level, more recently the Bitcoin network has been plagued by fraud, theft, fragmentation, and manipulation. It might be that the promise of the underlying blockchain technology transcends its particular implementation in the Bitcoin currency, but the evidence so far does not seem to justify the uncritical enthusiasm of the Davos participants.
My critique of the blockchain has nothing to do with its suitability as the foundation for a new global financial infrastructure, or its political and economic implications. Rather, my concerns are about its environmental impacts. At the heart of the current blockchain network—the key innovation that allows a decentralized, intangible, and yet entirely reliable and authoritative virtual ledger to exist—is a computational technique known as proof-of-work. And proof-of-work, when implemented on a global scale, is by design extraordinarily energy-intensive.
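The ledger idea itself can be sketched in a few lines. What follows is a deliberately simplified illustration of hash-chaining, not Bitcoin’s actual data structures or wire format: each block commits to the hash of its predecessor, so that altering any past entry breaks every subsequent link in the chain.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 digest of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Add a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

chain = []
append_block(chain, ["alice pays bob 1"])
append_block(chain, ["bob pays carol 2"])

# Each block's prev_hash pins down everything that came before it,
# so rewriting history would mean redoing every later block.
assert chain[1]["prev_hash"] == block_hash(chain[0])
```

This chaining is what makes the ledger authoritative without a central authority; proof-of-work, described next, is what makes it expensive to forge.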
To give a sense of how this functions in practice, consider the case of the Bitcoin currency. Since Bitcoins exist only as entries in a virtual blockchain ledger, it is essential that every transaction be verified and accepted by the collective participants in the network. How to do that in the absence of a centralized authority? To earn the right to enter a transaction into the blockchain, participants must engage in proof-of-work by solving an arbitrary cryptographic puzzle whose solution can only be produced using brute-force computational methods. The winner of the race to a solution is granted the right to enter the transaction, and is rewarded by the creation (known as “mining”) of a new Bitcoin and, possibly, the recording of a small transaction fee in the blockchain register.
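A toy version of proof-of-work makes the brute-force point concrete. Bitcoin’s real puzzle hashes a block header twice with SHA-256 against a 256-bit target; the leading-zeros variant below is a simplified sketch of the same idea:

```python
import hashlib

def proof_of_work(block_data, difficulty):
    """Brute-force a nonce until SHA-256(block_data + nonce) begins
    with `difficulty` zero hex digits. There is no shortcut: every
    failed guess is computation (and electricity) spent for nothing."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Four zero digits means roughly 16**4 (about 65,000) guesses on average,
# yet anyone can verify the winning nonce with a single hash.
nonce = proof_of_work("alice pays bob 1", difficulty=4)
```

The asymmetry is the whole trick: finding the nonce is expensive, checking it is nearly free, so the network can trust whoever demonstrably burned the most computation.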
What all this means is that the process of mining a Bitcoin is highly computationally expensive. And to make matters worse, the difficulty of the Bitcoin puzzle is constantly being adjusted: No matter how much computing power is thrown at it, solving the puzzle and mining a Bitcoin is designed to require about 10 minutes. Consequently, the network is an infinite sink for computing power—and, as a result, for real money and material resources.
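That constant adjustment follows a simple rule, applied in Bitcoin’s case every 2,016 blocks: scale the difficulty by the ratio of the time the blocks should have taken to the time they actually took. The sketch below is schematic (the real protocol, among other details, clamps each adjustment to a factor of four):

```python
BLOCK_INTERVAL_SECONDS = 10 * 60   # the 10-minute design target
RETARGET_WINDOW = 2016             # blocks between adjustments

def adjust_difficulty(difficulty, actual_seconds):
    """Scale difficulty so the next window of blocks averages ten
    minutes apiece, no matter how much hardware joins the race."""
    expected = RETARGET_WINDOW * BLOCK_INTERVAL_SECONDS
    return difficulty * (expected / actual_seconds)

# If miners double their hash rate and find 2,016 blocks in one week
# instead of two, difficulty doubles to restore the 10-minute pace.
harder = adjust_difficulty(1.0, actual_seconds=7 * 24 * 3600)  # 2.0
```

This feedback loop is why efficiency gains in mining hardware never reduce the network’s appetite: any saved electricity is immediately absorbed by a harder puzzle.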
The size of the Bitcoin energy sink is enormous: As of today, Bitcoin’s worldwide computational output is closing in on 200 exaflops—or, to put it another way, 256 times the combined capacity of the world’s top 500 supercomputers, the “big iron” of the field. This capacity dwarfs what CERN’s Centre de Calcul outside Geneva brings to bear, though the comparison is necessarily loose: “flops” measure a computer’s processing speed, while CERN’s petabyte-a-day figure measures data volume. All of this for a single implementation of the blockchain technology for a currency whose relevance to the world economy is, at best, extremely marginal. Imagine what kinds of computational resources would be required if the “blockchain everywhere” advocates got their way, and multiple blockchain networks were introduced at every level of the global financial infrastructure.
To save everything, click here. Why is the current enthusiasm for the blockchain so dangerous? Again, leaving aside the political, economic, and social implications of an unregulated, extragovernmental network of financial exchange, let us keep in mind the direct effects on the environment. It is the dirty little secret of the information economy that computing power directly translates into electrical power, and by extension into the coal, oil, water, uranium, and other natural resources that are required to generate that electrical power.
To give you some sense of the scale of this relationship of power, on a single day in September 2013, near the height of the market value of a Bitcoin, more than 18,000 megawatt-hours of electricity were devoted to the race to solve the arbitrary Bitcoin cryptographic puzzle. To put this number into perspective, it places the total annual energy consumption of the Bitcoin network somewhere between the nation-states of Iceland and Ireland. And on an average day, when only 983 megawatt-hours are used to mine Bitcoins, that is still about half of what it takes to power the Large Hadron Collider for a day. Although there is some dispute about the exact energy requirements of the current Bitcoin network, the collective energy demand of Bitcoin is and will remain significant. Two hundred exaflops of computing power costs a lot of electricity to operate, and although individual Bitcoin mining devices are becoming more energy efficient, the overall energy use and greenhouse gas emissions of the entire network will continue to increase over time—and are in fact designed to. Already the global center of gravity of Bitcoin activity has migrated to China, which is much more dependent on coal-fired electrical generation than the United States.
The lack of awareness among even the Davos elites of the energy demands of the blockchain technology with which they are so enamored is a symptom of a larger lack of consciousness about the potential environmental impacts of the digital economy. In the Internet era, we have become accustomed to thinking about information as supposedly being “free,” and the increasingly pervasive metaphor of the Cloud only serves to enhance that perception. To be digital is to be immaterial, at least according to popular belief, and it seems obvious that by moving more and more of our activities online we are moving towards a less environmentally hazardous economy.
While it may be true that some digital products and services are more environmentally sustainable than their traditional, bricks-and-mortar equivalents, this is not true in many other cases. At the very least, we need to pay more attention to the environmental costs of our online activities. There is no such thing as a free lunch, even in a virtual cafe in cyberspace.
None of this is to say that the digital economy, or blockchain technology, or even the deeply flawed Bitcoin virtual currency are necessarily bad ideas from an environmental perspective. But they are not without their impacts, and as these technologies become more widespread and more globally distributed, those impacts will become increasingly significant. What is so disappointing is not that so many world leaders chose to rest their hopes on unproven technological “solutionism” (as Evgeny Morozov, author of To Save Everything, Click Here, has aptly described it), but that they have failed to even try to count the environmental costs of implementing a widespread blockchain infrastructure.
So far, the computer industry has been very successful at running under the radar when it comes to energy and environmental oversight, and that needs to change given its significant contribution to global warming. The Cloud is surprisingly poisonous, a fact that few people outside of big data centers seem to be aware of. Government and industry must find ways to encourage the computing heavyweights to opt for efficiency and recycling instead of dumping their waste heat into the environment—a modern, high-tech version of what environmental thinkers call “the tragedy of the commons.” And the computing heavyweights must take more responsibility for the related problems of water consumption, mining, e-waste, and deliberately wasteful technologies. The solutions may involve more carbon taxes, greater incentives, white papers, regulations, extended manufacturer responsibility, or some combination of all of the above.
The knowledge and consciousness of these issues have not been a part of our collective conversation about the design and use of information technologies. They need to become so, and quickly.