
Whenever you log on to Facebook or YouTube or your bank or your favorite online game, you’re connecting to computers in some data center somewhere on the planet. All those servers and all those data centers consume a huge amount of electricity.
In fact, a new research paper published in the journal Science titled Recalibrating global data center energy-use estimates (paywall) found that data centers consume about 1% of the world’s electricity.
Well, there’s an interesting article in The New York Times today that looks at this new research.
Cloud Computing Is Not the Energy Hog That Had Been Feared
The study, which examined energy use in data centers across the world, found that,
“… while their computing output jumped sixfold from 2010 to 2018, their energy consumption rose only 6 percent.”
The main reason for this is the huge gain in energy efficiency achieved by the large cloud data centers run by companies like Google, Amazon, and Microsoft.
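To put the quoted figures in perspective, here’s a rough back-of-envelope calculation (my own illustration, using only the approximate sixfold output increase and 6 percent energy increase cited above) of how much the energy needed per unit of computing must have fallen:

```python
# Back-of-envelope calculation based on the figures quoted above:
# computing output roughly 6x from 2010 to 2018, energy use up only ~6%.
output_growth = 6.0    # computing output multiplier, 2010 -> 2018
energy_growth = 1.06   # energy consumption multiplier, 2010 -> 2018

# Energy needed per unit of computing in 2018, relative to 2010
energy_per_unit = energy_growth / output_growth
print(f"Energy per unit of computing in 2018: {energy_per_unit:.0%} of the 2010 level")
print(f"Implied efficiency improvement: roughly {1 / energy_per_unit:.1f}x")
# -> about 18% of the 2010 level, i.e. roughly a 5.7x efficiency gain
```

In other words, if those two numbers are in the right ballpark, each unit of computing in 2018 required less than a fifth of the energy it did in 2010.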
“In 2010, the researchers estimated that 79 percent of data center computing was done in smaller traditional computer centers, largely owned and run by non-tech companies. By 2018, 89 percent of data center computing took place in larger, utility-style cloud data centers.”
By moving to the cloud, non-tech companies have handed over responsibility for data center operations to a handful of large utility-scale providers that have a strong financial incentive to improve energy efficiency and reduce costs.
This is a great example of dematerialization, the idea of producing more goods and services while using less energy and fewer resources. If you’re interested in reading more about dematerialization, I recommend Andrew McAfee’s book More From Less, which I reviewed here.
Ideally, you want to achieve absolute dematerialization, where output increases while energy and resource consumption stays flat or even declines. It seems like the big cloud computing companies are getting very close.