Last week, we brought you a story unpacking whether utilities are ready for the promise of artificial intelligence. (That piece is linked here in case you missed it!) This week, we bring you a Q&A with one of the people I spoke with when working on that piece. David Groarke is managing director of Indigo Advisory Group. Our conversation has been edited for brevity and clarity. – Lisa Martine Jenkins


Lisa Martine Jenkins: The world is suddenly paying attention to the widespread potential of AI. Where are we in the adoption of AI by utilities specifically? Do you see 2023 as an inflection point? 

David Groarke: It is an inflection point to a certain degree. Utilities have had experience over the years with traditional AI and machine learning applications, but over the last 10 to 15 years a lot of infrastructure, including intelligent endpoint devices, has been deployed onto the grid. There is much more granular visibility and data, and a lot more sophistication in how utilities manage that data.

The inflection point now is that we have more sensors on the grid; we have changing business models with distributed energy resources; and we have increasingly sophisticated solutions on the supply side. That supply side is rapidly evolving, which means we are on the cusp of real change.

LMJ: You speak a lot to utilities. From what you can tell, is AI something that there’s a lot of enthusiasm for right now? Is there any technology that’s particularly attractive to utilities? 

DG: There has been a lot of activity on the operations side of the house for many years now: predictive maintenance, gathering images, automating processes. We're seeing more maturity at the edge of the grid with DERMS (distributed energy resource management systems) and virtual power plant applications. And on the customer side, there has been quite a bit of sophistication around how utilities interact with customers, which has traditionally been a more transactional relationship.

In addition, there are a lot of opportunities in energy forecasting, where AI can help integrate intermittent renewable assets and better match supply with demand. Applications are a little more nascent on the automation and control side, however, where AI has to grapple with grid physics and hard constraints.

LMJ: Broadly, how can the use of AI encourage utilities to decarbonize?

DG: First, there is the question of physically managing a new renewable asset. There's a whole bunch of AI use cases that can help: finding the right location, managing that asset over time, predicting failures, and so on. Certainly, it's going to make onboarding a lot easier.

But in terms of controlling these assets and forecasting dispatch, AI could help utilities make better use of different types of weather data and grid management data to dispatch resources at the right times. That effectively means a better match between energy supply and demand, and easier management of those assets over time.

AI is really simplifying some of the more complex decisions that need to be made on the grid, or at least providing a whole lot more data for those decisions to be made effectively.
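
To make the forecasting-and-dispatch idea Groarke describes a little more concrete, here is a minimal, hypothetical Python sketch. It assumes an upstream model has already produced an hourly net-load forecast (demand minus expected renewable output) and simply fills that gap with the cheapest available resources; the resource names, capacities, and costs are invented for illustration.

```python
# Illustrative sketch of forecast-driven dispatch (not any utility's actual
# system): given an hourly net-load forecast, cover it cheapest-first.

from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    capacity_mw: float
    cost_per_mwh: float  # simplified marginal cost

# Hypothetical resource stack, for illustration only.
RESOURCES = [
    Resource("battery_storage", 50, 12.0),
    Resource("hydro", 120, 18.0),
    Resource("gas_peaker", 200, 65.0),
]

def dispatch(net_load_mw: float, resources: list[Resource]) -> dict[str, float]:
    """Greedy merit-order dispatch: fill the forecast gap cheapest-first."""
    schedule: dict[str, float] = {}
    remaining = net_load_mw
    for r in sorted(resources, key=lambda res: res.cost_per_mwh):
        if remaining <= 0:
            break
        mw = min(r.capacity_mw, remaining)
        schedule[r.name] = mw
        remaining -= mw
    if remaining > 0:
        schedule["unserved_mw"] = remaining  # flag a shortfall for operators
    return schedule

# Hourly net-load forecast (MW) produced upstream by a forecasting model.
net_load_forecast = [80.0, 150.0, 310.0, 400.0]

for hour, net_load in enumerate(net_load_forecast):
    print(hour, dispatch(net_load, RESOURCES))
```

Real dispatch has to respect the grid physics and hard constraints mentioned earlier (network limits, reserve requirements, market rules), which is exactly what this toy version leaves out.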

LMJ: What is the blue-sky scenario for a grid that integrates AI to the greatest possible effect? 

DG: We have a whole bunch of systems talking to each other, dispatching automatically, and working with third-party data such as weather and customer usage data. We have this automated grid. Customers can set it and forget it. We have renewables dispersed globally, data centers (which account for about 1% of global demand) automating their usage, and operators just sitting there eating doughnuts.

I think we have a long way to go to get to that scenario. We’re many years, if not decades, away from the integration of renewables at a large scale, with assets and markets communicating autonomously.  

What we are seeing each year, however, are incremental improvements in renewables deployment, grid management, and topology identification. Meanwhile, the customer side is being more fully transformed — both behind the meter and in terms of customer experience — as customers begin to manage their own energy and bring on their own distributed energy resources. As these come together over time, and as regulation matches innovation and technology deployments within the industry, we’ll get closer to that blue-sky scenario.

LMJ: Are there any applications of AI in the utility space that you think are underdiscussed, and any that are overhyped?

DG: In terms of practical use cases, the most obvious and highly visible one is energy forecasting. Traditionally, utilities have used large statistical models to forecast supply and demand. Now, utilities are looking at different types of weather data and more granular demand data on the customer side, and they're able to forecast dispatch and operate the grid much more efficiently. That's not overhyped.
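
As a toy illustration of the pattern Groarke describes (weather data plus granular demand history feeding a learned forecast), the Python sketch below trains a gradient-boosting model on synthetic data. The features, numbers, and model choice are assumptions made for the example, not a description of any utility's actual forecasting stack.

```python
# Illustrative forecasting sketch: combine weather features with historical,
# granular demand data to predict load. The data is synthetic; a real model
# would use many more features (humidity, day type, customer segments) and
# rigorous validation.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic history: hour of day, temperature (°C), and observed load (MW).
hours = rng.integers(0, 24, size=2000)
temps = 15 + 10 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 2, size=2000)
load_mw = (500 + 8 * np.abs(temps - 18)
           + 30 * ((hours >= 17) & (hours <= 21))   # evening peak
           + rng.normal(0, 10, size=2000))

X = np.column_stack([hours, temps])
model = GradientBoostingRegressor().fit(X, load_mw)

# Forecast tomorrow's evening load given a weather forecast of 32 °C at 6 p.m.
print(model.predict([[18, 32.0]]))  # predicted load in MW
```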

Meanwhile, I think some of the use cases for generative AI and large language models (LLMs) might be a little overstated in the short term. What we're seeing is that LLMs can be used to provide utilities with synthetic data to train other models. That is quite an exciting application: you handle the data pre-processing layer and the data management layer, and then create the data for the inference layer. But I think we're going to see an explosion of generative AI applications that are perhaps a little misguided about the capability that sits behind them.
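
To show what that synthetic-data pipeline might look like, here is a deliberately hypothetical Python sketch. The call_llm stub, prompt, and schema are invented for illustration; the point is only the shape of the pipeline: prompt an LLM, validate what comes back, and hand the result to a downstream model as training data.

```python
# Hypothetical sketch of the "LLM generates synthetic data for other models"
# pattern. call_llm stands in for whatever LLM API a utility might use; the
# prompt, schema, and parsing are illustrative assumptions, not a real pipeline.

import json

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM client call; returns canned JSON so the
    sketch runs without any external service."""
    return json.dumps([
        {"hour": 18, "temp_c": 33.1, "load_mw": 712.4},
        {"hour": 19, "temp_c": 32.5, "load_mw": 698.0},
    ])

PROMPT = (
    "Generate realistic hourly records of residential electricity demand for a "
    "hot summer evening as a JSON list of objects with keys hour, temp_c, load_mw."
)

def synthetic_training_rows(n_batches: int = 1) -> list[dict]:
    """Pre-processing / data-management layer: collect and validate LLM output
    before it is handed to a downstream model for training."""
    rows: list[dict] = []
    for _ in range(n_batches):
        for rec in json.loads(call_llm(PROMPT)):
            if {"hour", "temp_c", "load_mw"} <= rec.keys():  # basic schema check
                rows.append(rec)
    return rows

print(synthetic_training_rows())
```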

I think it’s important, when you’re looking at AI and utilities, to contextualize. What’s the type of AI? And then what’s the actual use case? Each of those things requires an individual conversation. Otherwise, you can lose a reference point.

LMJ: What is the role of regulators here? How can they encourage the useful adoption of AI? Are there carrots and/or sticks at their disposal? 

DG: Historically, what has worked very well for new technologies in the utility industry is enabling utilities to set up regulatory sandboxes, with conditions that might sit outside the usual jurisdictional or regulatory constructs, where they can test new ideas, pilot them, and maybe use anonymized customer data before scaling a solution.

Utilities really want to be second and right, not first and wrong. It's an interesting industry in that regard. What I will say is that if something works, the industry can move pretty quickly with the right regulation. And what regulators are looking for is that solutions work: they improve customer satisfaction and hit jurisdictional goals. But I think what's important is to encourage utilities to work with both startups and traditional vendors, to be innovative, and to learn from each other.

This is the second of several conversations we’ll be publishing on the use of AI in the energy transition in the lead-up to our Transition AI conference on June 15 in Boston. Spots are filling up fast, so register today! 

About Post Script Media

Post Script Media, founded by Stephen Lacey and Scott Clavenna, is a media and market intelligence firm devoted to enabling a faster transition to a net-zero economy. Its latest undertaking, Transition-AI, is focused on serving the climate tech community with a deep understanding of how digital solutions and AI will enable and accelerate the clean energy transition. Post Script’s audio business has produced some of the most important podcasts in the energy and climate space, including The Carbon Copy, Catalyst with Shayle Kann, The Big Switch, Columbia Energy Exchange, Climavores, Hot Buttons, and Watt It Takes. Post Script Media is headquartered in Boston.