Environment & Energy Report

INSIGHT: Will AI Increase or Decrease Power Grid Efficiency?

May 22, 2020, 8:01 AM

Conventional wisdom holds that greater computing power distributed at the edges of the power network will unlock new opportunities to lower system costs and carbon emissions, and that advances in AI will increase energy efficiency in the U.S.

However, we don’t really know how much energy the added computing and communications networks will consume versus how much they’ll save. The theoretical savings potential seems large, but some analysts argue that advances in AI will instead increase total power demand.

First, let’s look at some facts. Over the last few decades, the internet has proved to be a global net energy saver. Numerous studies from Lawrence Berkeley National Laboratory (LBNL) and the Rocky Mountain Institute (RMI), among others, concluded that the internet used about 1% of all U.S. electricity in 2000.

Meanwhile, the internet enabled larger savings through efficiency gains such as information-enabled efficiency, greater industrial productivity, and telecommuting in place of travel. The fact that total electricity use has remained flat-to-down over the past 10 years, even as internet traffic, data centers, and the entire information and communications technology (ICT) complex have mushroomed, certainly supports this thesis.

Energy-Saving Trend Showing Signs of Change

Nonetheless, there are signs that this trend is changing. It is very likely that our ever-increasing wish to have everything available “just one click away” will lead to a noticeable uptick in power demand as 5G, IP-accessible devices, AI, edge computing, and the coming “data tsunami” home in on us. Data processing requires electricity, and the cloud can create carbon emissions.

For example, areas of Ireland that host data centers have seen new energy infrastructure built and emissions rise (de Vries 2018, p. 801). Despite Ireland’s extensive energy efficiency programs, fossil fuel use and carbon emissions are on the rise. The current grid was built for conditions in which electricity moved in one direction, from large-scale generators to consumers. The introduction of renewables, the emergence of prosumers, and the deployment of smart meters and IoT devices all require the grid to be modernized.

This creates an opportunity for AI to help expand or renovate the infrastructure. AI simulates intelligent human behavior in software, and with machine learning it can teach itself how to improve.

AI will also be involved in the predictive maintenance of the grid. This process involves collecting metering data from current grid operation, grid infrastructure master data, and geoinformation or weather data, and then issuing maintenance recommendations. One of the goals is to predict infrastructure weaknesses before they lead to degraded performance or failures.
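To make the idea concrete, here is a minimal sketch of that pipeline: metering, asset-master, and weather inputs are combined into a risk score, and assets above a threshold are flagged for maintenance. The field names, weights, and thresholds are all invented for illustration; a real utility would use a trained model on far richer data.

```python
from dataclasses import dataclass

# Hypothetical, simplified inputs: a real system combines far richer
# metering, asset-master, and weather datasets.
@dataclass
class AssetReading:
    asset_id: str
    load_pct: float   # average loading, % of rated capacity (metering data)
    temp_c: float     # ambient temperature (weather feed)
    age_years: float  # from infrastructure master data

def failure_risk(r: AssetReading) -> float:
    """Toy risk score: heavier loading, hotter weather, and older
    equipment all push the score up. Weights are illustrative only."""
    score = (0.5 * (r.load_pct / 100)
             + 0.3 * max(r.temp_c - 25, 0) / 15
             + 0.2 * (r.age_years / 40))
    return min(score, 1.0)

def maintenance_recommendations(readings, threshold=0.6):
    """Return asset IDs whose risk score meets or exceeds the threshold."""
    return [r.asset_id for r in readings if failure_risk(r) >= threshold]

readings = [
    AssetReading("XFMR-001", load_pct=95, temp_c=38, age_years=35),
    AssetReading("XFMR-002", load_pct=40, temp_c=20, age_years=5),
]
print(maintenance_recommendations(readings))  # flags the stressed transformer
```

The point of the sketch is the shape of the workflow, not the scoring rule: disparate data streams are fused per asset, and recommendations are emitted before a failure occurs.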

With larger databases, infrastructure data can be combined with external data, allowing AI to better predict energy production. This is important for renewable energy sources, whose output fluctuates with the weather.
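In its simplest form, such a forecast is a statistical fit between a weather variable and plant output. The sketch below fits a straight line relating forecast solar irradiance to a plant's production and uses it to predict the next day; the data points and the linear model are invented for illustration, and production systems use far more sophisticated techniques.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Invented historical pairs: (irradiance, kWh/m^2/day) -> (output, MWh)
irradiance = [3.0, 4.5, 5.0, 6.2, 7.1]
output_mwh = [30.0, 45.0, 50.0, 62.0, 71.0]

a, b = fit_line(irradiance, output_mwh)

# Predict tomorrow's production from a weather forecast.
forecast_irradiance = 5.5
predicted = a + b * forecast_irradiance
print(round(predicted, 1))  # MWh expected tomorrow
```

Swapping the single irradiance variable for a richer feature set (cloud cover, temperature, season) and the line for a learned model is what turns this toy into the kind of AI forecasting the article describes.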

As this technology matures, AI will increasingly play a role in managing the demand for electricity. This will be combined with the wide deployment of smart metering, which is projected to reach 107 million U.S. households by the end of 2020. When corresponding market signals are received, AI can help manage loads to respond to demand or price conditions.
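The core of such demand response is a dispatch rule keyed to the price signal. A minimal sketch, with hypothetical device names, prices, and a deliberately simple rule: flexible loads defer when the real-time price crosses a threshold, while inflexible loads always run.

```python
PRICE_THRESHOLD = 0.20  # $/kWh; illustrative cutoff for deferring loads

loads = [
    {"name": "ev_charger",   "kw": 7.2, "flexible": True},
    {"name": "water_heater", "kw": 4.5, "flexible": True},
    {"name": "refrigerator", "kw": 0.2, "flexible": False},
]

def dispatch(loads, price):
    """Return the names of loads allowed to run at the given price."""
    if price <= PRICE_THRESHOLD:
        return [l["name"] for l in loads]          # off-peak: run everything
    return [l["name"] for l in loads if not l["flexible"]]  # peak: defer flexibles

print(dispatch(loads, price=0.12))  # off-peak: all loads run
print(dispatch(loads, price=0.35))  # peak: only the refrigerator runs
```

An AI-driven system would replace the fixed threshold with forecasts of prices and occupant behavior, but the interface, a market signal in and a load schedule out, is the same.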

AI will also contribute to more stable electricity grids by automatically feeding into the grid energy produced by thousands of households or stored in millions of electric vehicle batteries.

The Role of Self-Driving Cars

Another ICT-driven innovation that has been the subject of significant energy research, self-driving cars, also raises the specter of increased energy use. Autonomous vehicles (AVs) will transform our transportation infrastructure and urban landscapes more dramatically than anything since the automobile itself, but the nature and timing of the effects on electricity demand are highly uncertain.

Many features of AVs reduce demand, but only once they are almost ubiquitous and connected to each other through an effective communications and optimization infrastructure. At that point, vehicle crashes and congestion will (we think) disappear, letting vehicles run within a vastly more efficient system.

However, in the critical next few decades when we must decarbonize, AVs will also encourage much more driving and energy use. When vehicles are fully autonomous, we can send them to fetch children at soccer practice, sleep in them overnight while they drive us to another city, and summon them for trips we’d otherwise take by foot, bike, or transit.

Nearly all AV energy experts agree that the net effect of these changes is highly uncertain, and that in many scenarios energy and electricity use could increase. The conventional wisdom is that AI and edge computing will make the entire enterprise of producing and using electricity more efficient overall, and this should still be considered the “null hypothesis.”

However, there are trends such as the “data tsunami” and autonomous vehicle activity that could lead to more electricity use than we expect, perhaps even a net increase. We should not dismiss the significant possibility that AI and edge computing increase the scale of our electricity use as we move to decarbonize power and mitigate climate change.

This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.

Author Information

Dr. Peter Fox-Penner is founder and director of the Boston University Institute for Sustainable Energy (ISE) and Professor of Practice at the Questrom School of Business. He is a former senior official at the Department of Energy and the White House Office of Science and Technology Policy, as well as the author of the forthcoming book from Harvard University Press, Power after Carbon.

Olena Pechak is a senior fellow at ISE. Her current research focuses on developing approaches and pathways to expand clean electricity systems and how to overcome implementation barriers.

Matthew Lillie is a research assistant at ISE and MBA Candidate at the Questrom School of Business with a background in distributed wind and for-profit online learning software.
