
For Microsoft, AI is an 'unlock' for decarbonization

A conversation with Microsoft’s Hanna Grene on the many ways AI is impacting utility operations.

Listen to the episode on:
The Carbon Copy podcast by Latitude Media

Microsoft was an early mover in integrating OpenAI’s large language models into its Azure cloud services. Now every part of Microsoft’s technology stack — from cloud infrastructure to data analytics to consumer apps — will be “reimagined” for the AI era, according to CEO Satya Nadella. As a result, every industry will inevitably be impacted by AI.

Utilities will also find themselves at the center of this shift, even if most aren’t yet actively investing in AI for grid management. Generative AI will increasingly start to influence back-office operations and customer support inside utilities for “focus and efficiency,” explained Microsoft’s Hanna Grene during an on-stage discussion at Latitude Media’s Transition-AI: New York conference.

This week, we feature a conversation from our Transition-AI conference with Hanna. We talk with her about how large language models and other forms of artificial intelligence are making their way inside utilities — and why AI isn’t as intimidating as it seems.

Sign up for Latitude Media’s Frontier Forum on January 31, featuring Crux CEO Alfred Johnson, who will break down the budding market for clean energy tax credits. We’ll dissect current transactions and pricing, compare buyer and seller expectations, and look at where the market is headed in 2024.

Transcript

Stephen Lacey: Hanna Grene's been working in the power sector for 16 years, and a few years ago she was the chief operating officer of a company called PXiSE Energy Solutions. PXiSE integrates renewables and batteries into a utility's operations through a software and controls platform known as a DERMS, or distributed energy resource management system. DERMS are an essential part of a clean grid, but Hanna had a front-row seat to how hard they are to sell and integrate.

Hanna Grene: And I started thinking about if every utility goes through their ADMS integration, and this was a few years ago, but we were seeing these integrations take five, seven years and then if every utility goes at the pace we were going at and then says, "Well, we're going to build our DERMS over the next five years," I just started adding up the numbers and I was like, we're going to blow past the timelines that we need to hit to get to our climate goals.

Stephen Lacey: This is Hanna speaking with me on stage at Transition-AI: New York last month, and she talked about a challenge that plagues virtually all vendors trying to work with utilities on a new technology: extremely slow sales and integration cycles. And she realized that redesigning the way utility teams operate is just as important as the tech itself.

Hanna Grene: We need to change the way we work, we need to change the way we're approaching these problems, and we really have to bring technology to the forefront of what we're doing in the industry to get there faster.

Stephen Lacey: Then in 2021, Hanna went to Microsoft and today she's the leader of global operations and go-to-market strategy for the energy industry there. She works with big energy companies, including a lot of utilities to get them to think much more holistically about how to accelerate tech adoption in service of decarbonization.

Hanna Grene: I think of myself as a grid nerd at a tech company, and our role is to show up as a technology partner with what we do best: hyperscale cloud computing, global cybersecurity, the enterprise platforms that are already running most of the back ends of these businesses today, and AI.

Stephen Lacey: Microsoft has emerged as a leader in artificial intelligence thanks to its partnership with OpenAI, and now every part of Microsoft's technology stack, from cloud infrastructure to data analytics to consumer apps, is going to be reimagined for the AI era. That's what CEO Satya Nadella said in a recent investor letter, and that's putting AI at the center of Hanna's job working with utilities.

Hanna Grene: For me, AI is an unlock. It gives us a whole new set of tools that we can use to approach and accelerate around these issues. And honestly, for folks who might not have seen some of the power of digital technology, maybe their kids are showing them on their phones what AI is capable of, and it's an aha moment, I think, for other people. And that's exciting.

Stephen Lacey: This is The Carbon Copy. I'm Stephen Lacey. This week: a conversation from our Transition-AI conference with Microsoft's Hanna Grene. I talked with her about how large language models and other forms of AI are making their way inside utilities, and why AI isn't as intimidating as it seems. Let's get into some specific use cases and partnerships. But very broadly, Microsoft has identified hundreds of potential applications, and some of those we'll talk about coming up. What rises to the top, generally speaking?

Hanna Grene: Yeah, I think about it in terms of the capabilities of this technology, and so it is always important for me to set the foundation that Microsoft has been doing AI work for a long time. We have decades of history, going back to the early days of the company, of working on AI and machine learning. So, a lot of what we're going to be talking about today is actually generative AI and this next click in innovation around generative AI. There are four areas, broadly, where Azure OpenAI is really finely tuned, and they are summarization, semantic search, code generation, and content generation, those really phenomenal images and things like that. That's that content generation machine. And then, in addition to the Azure OpenAI work and co-innovation that we're doing with our customers, we're also pulling those generative AI capabilities into a number of our products, and we're calling those copilots, and that's really their function.

And what those copilots do is show up as that trusted resource to you in your work, to support how you're doing that work more effectively; it's a tool at your disposal. The areas where I'm seeing customers broadly apply those are predictive analytics, forecasting (a lot of really exciting forecasting opportunities), health and safety, and supply chain, which needs much more advanced capabilities. That's an area where we're seeing folks leap ahead, and then all the way out into customer service, customer programs, and many, many more. I guess there are sort of three areas where I'm seeing it really shift the way that we're working. The first is that in this industry in particular, we have so much intelligence scattered throughout our organizations, and there's a lot of time spent hunting for the right information to do your job and finding the right people. You say, we're going to do this project, and then you gather up your V-team and you go out and have to find all of the other people who have the information that you need.

And now, with some of these data sets, you can use that semantic search capability, and your teams can search, they can summarize, they can find the data faster, and they can go back to the job that they set out to do in the first place. That, for me, drives focus and efficiency. And the other piece is all of this data. Utilities have more data than all of our peer industries. We have more data in our segment than healthcare, than finance. They would be jealous of our data sets, and yet we make some of the least use of our data of any industry. And if we can flip that on its head, from data being this problem of how do we organize it and how do we turn it on and how do we make it useful? Well, now, with some of these AI tools, we can get curious.

And so, that to me is the second unlock: we can get curious about our systems, about our networks, about our business, and we can ask questions of our data without being a Python coder, without a Python coder sitting next to your EE or your regulatory attorney. In natural language, you can ask questions of your data. And the third, and just personally in my work, I'm getting a lot of value out of this already: it's the creativity component. In our copilots, there's a function now in PowerPoint where, with the right direction, it can draft a PowerPoint deck for you.

And I don't know about you, but the first 20 minutes that I'm staring at my PowerPoint slides is not my most productive 20 minutes. It might be my least productive 20 minutes. My most productive time when I'm thinking about a message that I want to deliver to my team or to our partners, it's in editing that first draft, right? That's where I sort of see the message like, oh, okay, well, the co-pilot can do that draft for you. It's very good at tasks, which lets you go back to being very good at the creative part of what you were setting out to do.

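For readers who want to see what the summarization capability Grene describes looks like in code, here is a minimal sketch of a summarization call against an Azure OpenAI deployment using the standard Python SDK. The endpoint, deployment name, and filing excerpt are placeholders, assumptions for illustration rather than anything described in the interview.

```python
# A minimal sketch of document summarization with Azure OpenAI.
# Endpoint, API key, deployment name, and the sample text are placeholders.
import os
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

filing_excerpt = "..."  # text pulled from an internal document store

response = client.chat.completions.create(
    model="gpt-4o",  # your Azure deployment name (an assumption here)
    messages=[
        {"role": "system", "content": "Summarize utility regulatory filings in plain language, and list any docket numbers mentioned."},
        {"role": "user", "content": filing_excerpt},
    ],
)

print(response.choices[0].message.content)
```
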
Stephen Lacey: Yeah. Well, the co-piloting thing is interesting because as we've talked to a lot of utilities and other energy companies in evaluating this space, I mean those are the very obvious near-term applications for really any corporate entity, but utilities in particular where there's not a lot of friction in getting them to deploy some of those tools. And I know that you're working on also customer support tools and document management tools that can be helpful. And many broad macroeconomic analyses show how much efficiency you can bring to the corporate world in implementing these tools. Can you talk about some specific projects in say, document management making jobs easier, and then why utilities might be embracing those applications first before some of the other more complicated ones?

Hanna Grene: Yeah, absolutely, absolutely. There are so many exciting use cases, and there's another element of this, which is that I'm seeing companies that we're working with approach this really thoughtfully. And one of those companies is Southern California Edison, and they've shown tremendous leadership, not just in the business benefits and the grid benefits they're going to be getting out of their projects, but in their approach. So, they were very thoughtful early on in working with our teams as a Microsoft partner, and they said, what are some of the areas? We've got a long list of problems that we're excited about tackling with AI, but what are the ones that we can raise to the top, where we're going to get to know these tools and get our teams to interact with them and learn how we want to apply them, and where we're going to get a really great benefit to our business and to our stakeholders right out of the gate?

And so, they lifted those to the top so they could really get their hands on it, see the benefits, see the outcomes, and then start to get into some of the deeper problem sets. And we're seeing that really work, because there are skeptics who say, "I don't know how this is going to impact my work or my team or my workflow." I've seen that approach be really productive in helping people learn about the technology, get comfortable with it, and see how it's going to benefit them and their teams and their business. And then everybody's eyes kind of light up and they get excited and they get creative. It goes back to that: they get curious and they get creative. And on that journey, one of the high-impact opportunities that Southern California Edison approached first is their Edison GPT capability that they've launched. It's built on Azure OpenAI, and they wanted to tackle regulatory data hunting and regulatory summarization.

And so, it's the Edison GPT for GRC, or general rate case. I used to run a policy team, so I can speak from firsthand knowledge: these are teams of really talented experts. Many have legal backgrounds or specific policy and subject-matter backgrounds. I don't think they got into this space to spend hundreds of hours going through years of testimony to write summaries of regulatory documents. That is not what we want these super talented people doing. And yet that is how a lot of time gets used, because we have decades of regulatory process and summaries that have to be gone through to provide information out to the organization and stakeholders. So, what Edison's been able to do is load over 22,000 regulatory documents, decades of filings, into Edison GPT. Now, when internal stakeholders need information on a specific filing, a rate case, past information, instead of approaching the regulatory team they can use the tool and find the information, and it creates a summary and a source list.

So, if you do need to go back to the original filings, you can find them, but it saved you hundreds of hours of staff time. It's also externally available. There are tons of stakeholders who come to the utilities asking for information about their filings, about their policy information, and now instead of those stakeholders waiting days or however long it takes, there's often a backlog because there's so much activity, especially in California and the regulatory space, they can get that information faster. And again, this drives that focus and efficiency layer. They can get back to the work that they were setting out to do when they came and asked for all of this information to be provided to them.

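Edison GPT's internals aren't described beyond what Grene says above, but the retrieval pattern she outlines (search the filings, summarize, and return a source list) can be sketched generically. The toy example below uses TF-IDF similarity as a stand-in for semantic embeddings, with invented filing snippets, purely to illustrate how a question gets matched to source documents.

```python
# A toy illustration of the "find it, summarize it, cite it" retrieval pattern
# described above. Real deployments would use semantic embeddings and an LLM
# for the summary; TF-IDF stands in here so the sketch runs with no services.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder filings -- not actual SCE documents.
filings = {
    "2021 GRC testimony, grid hardening": "Requested capital for covered conductor and undergrounding ...",
    "2018 rate case decision": "The commission adopted a revenue requirement of ...",
    "Wildfire mitigation plan update": "Describes PSPS protocols and vegetation management spend ...",
}

question = "What did the utility request for covered conductor?"

titles = list(filings.keys())
vectorizer = TfidfVectorizer().fit(list(filings.values()) + [question])
doc_vecs = vectorizer.transform(list(filings.values()))
q_vec = vectorizer.transform([question])

# Rank filings by similarity to the question and return a source list.
scores = cosine_similarity(q_vec, doc_vecs)[0]
ranked = sorted(zip(titles, scores), key=lambda pair: pair[1], reverse=True)

print("Top sources:")
for title, score in ranked[:2]:
    print(f"  {title}  (similarity {score:.2f})")
```
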
Stephen Lacey: How do our journalists get their hands on that tool?

Hanna Grene: It's open to external stakeholders, so if you're a member of a regulatory proceeding, it's available to you in what they're rolling out. And then, just thinking about what they do next with it, they're looking at opportunities for it to provide some of that first-round drafting of things like a GRC, which would otherwise take weeks or months, and boil that down. Again, this is not the creativity and the expert part of the job. It's the task part of the job, so that your experts can do the expert and creative part of the job, the highest-value part of their work. We have so much information locked up in different parts of our organizations, and this is just such a powerful way to enable your teams and your colleagues to tap into that data and that information in a much more collective, transparent, and level way.

Stephen Lacey: So, going deeper, you're also working on methane detection using sensor data and satellite imagery. Talk about what tools you're using and what the applications for methane detection are.

Hanna Grene: Yeah, and this hits me, getting back to that climate mission that I carry. Methane is 85 times more potent than carbon dioxide as an emission, and it represents about 20% of global emissions today, although many reports show that that number could be much higher because of some of the leakage in the distribution and storage value chain. So, when we work on methane, we're thinking not just about how we detect and remediate, although that's really essential, and it's often a step that doesn't get performed to full accuracy today, because it can be hard to identify the exact source of leaks and ensure that the remediation was successful. We're also thinking about how we apply AI to get to a prevent-the-leak-before-it-happens scenario. That's the goal that we're setting out from. So, we've partnered really closely with Accenture on this, and they've developed a methane emissions platform and tool set.

They're working on that platform for oil and gas and for the power and utility space. In oil and gas, we've been using AI to help determine where you even place the sensors. You can't put sensors everywhere; it's just prohibitive and not practical. But it helps you determine where to place those sensors so that you find the leaks as quickly as possible. You can measure leakage amounts, you can ensure remediation, and you can do a quality-control check that the remediation was successful. In the power and utility space, we've seen companies working with Accenture, like Duke, that are using satellite imagery to see, system-wide, where there are leaks in the system. And in any of these cases, we're looking for, and currently trialing, applications for AI to learn from the patterns of where we're seeing leaks in these systems. Then how do we get from what the platform can do today, detect, remediate, measure, resolve, which is essential, to this next phase, which is detect before the leak? That's really the project.

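Neither Microsoft's nor Accenture's sensor-placement method is detailed here, but the question Grene raises (where to put a limited number of sensors) is commonly framed as a coverage problem. The sketch below is a toy greedy heuristic over invented candidate sites and pipeline segments, just to make that framing concrete.

```python
# Toy greedy coverage heuristic for the sensor-placement question raised above.
# Candidate sites and the segments they can "see" are invented for illustration.
candidate_sites = {
    "site_A": {"seg1", "seg2", "seg3"},
    "site_B": {"seg3", "seg4"},
    "site_C": {"seg4", "seg5", "seg6"},
    "site_D": {"seg1", "seg6"},
}
budget = 2  # number of sensors we can afford

uncovered = set().union(*candidate_sites.values())
chosen = []
for _ in range(budget):
    # Pick the unused site that covers the most still-uncovered segments.
    best = max(
        (s for s in candidate_sites if s not in chosen),
        key=lambda s: len(candidate_sites[s] & uncovered),
    )
    chosen.append(best)
    uncovered -= candidate_sites[best]

print("Place sensors at:", chosen, "| uncovered segments:", uncovered or "none")
```
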
Stephen Lacey: And now, in methane detection, there are a lot of people who physically go out into the field. I'm just curious, with these tools, are you seeing any benefit in terms of operational efficiency, or not having to send as many people out into the field, and then in the accuracy of results?

Hanna Grene: Thank you. That's such an important point. In many parts of the U.S., and even beyond the U.S. globally, the most common way that we check the quality of our distribution, transmission, and storage systems is still paper, clipboard, and a walk. So there's a lot we can do that uses sensor technology; there are a number of vendors today that use drones and fixed-wing aircraft to fly over the systems. But to really have real-time information system-wide, to get to that detect, remediate, and prevent space, you need as much of a holistic, system-wide point of view as possible. And so, it's really going to be about layering those data inputs and then using the AI and training the models to get out and do that preventative maintenance before the leak.

Stephen Lacey: Back in July, Schneider Electric and Pacific Gas and Electric unveiled a new DERMS platform, architected on Microsoft's cloud to manage tens of thousands of batteries, solar systems, and EV chargers. There are two pieces of that DERMS rollout that Hanna says are noteworthy. One is using AI to bring new capabilities for querying data inside the DERMS. And the other is how the three teams at Schneider, PG&E, and Microsoft actually workshopped the system.

Hanna Grene: We had cybersecurity experts in the same room as the IT experts, in the same room as customer programs, as ops, as the control system designers. I have never seen teams move that hard and that fast at breakthrough innovation in this space, for a system of that scale and ambition. And so, what we've launched is phase one, but again, a DERMS fully in the cloud. And for PG&E, the goal is for this to provide full DER and EV visibility across their networks. For a subset of DERs that are controlled by the utility, it will provide that control, and then broader visibility into DERs from retailers and OEMs and other providers on their network. It will provide resilience: there are customers in PG&E territory who are on the other side of a long feeder, as you might've heard.

Sometimes in California we have public safety power shutoffs, or PSPS events, in extreme weather conditions. So, if PG&E needs to de-energize a line for the safety of the system, the customers on the other side of it will have solar and battery storage, and maybe even an integrated EV, to help provide that power and that resilience and backup. And then there's the other use case, which is really important, and I think it's the one that doesn't always get the airtime because it's still emerging in a lot of parts of the world: if we're going to integrate electrified transportation and EVs at the scale that we're talking about, and the scale that we're starting to see actually in California, we need to be able to control much more nimbly at the edge and to balance power supply and demand, so that we can provide adequate power to support charging of EVs. And so, this DERMS will do that as well, and help balance across those loads to support that growing electrification capability.

So, there are also some really exciting AI use cases in DERMS. One for me is just that visibility and data component. Your DERMS actually becomes a giant data set, and you could start to think about querying it using that semantic search. You could ask your DERMS, what's the average load on that feeder in March? And then you could say, "Okay, how much of that is from EVs?" That is exciting, that is really exciting. And then another area where we're going to see breakthrough opportunities with AI is forecasting. Forecasting is a huge opportunity for AI in general. If we think about the distribution system, you do forecasting today on the supply and the demand. Okay, well, now assume 40% of your customer base has EVs. That adds level 2 charging. Assume 20% of your generation comes from DERs. That is a level and complexity of forecasting that we do not have today on the distribution grid, and AI is going to be a really powerful tool to unlock that.

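The feeder question Grene poses boils down to a simple aggregation once DERMS telemetry lands in a queryable store; the natural-language layer's job is to generate something like the query below. The schema (timestamp, feeder, total load, EV load) is an invented illustration, not PG&E's or Schneider's actual data model.

```python
# What "average load on that feeder in March, and how much is from EVs?"
# might translate to once DERMS telemetry is queryable. Schema is invented.
import pandas as pd

telemetry = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-03-01 18:00", "2024-03-02 18:00", "2024-04-01 18:00"]),
    "feeder": ["F-1203", "F-1203", "F-1203"],
    "load_kw": [4200.0, 4550.0, 3900.0],
    "ev_load_kw": [610.0, 705.0, 480.0],
})

# Filter to the feeder and month in question, then aggregate.
march = telemetry[(telemetry["feeder"] == "F-1203") & (telemetry["timestamp"].dt.month == 3)]
avg_load = march["load_kw"].mean()
ev_share = march["ev_load_kw"].mean() / avg_load

print(f"Average March load on F-1203: {avg_load:.0f} kW ({ev_share:.0%} from EV charging)")
```
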
Stephen Lacey: So, you have as good a window as anybody into how utilities are thinking about this stuff. And as you look at who's best prepared to take advantage of new emerging technologies, how would you characterize those utilities? I assume that they are utilities who've already been working on this for a while.

Hanna Grene: Yeah, and what they've been working on, Stephen, is the data. AI is only as good as your data. I mean, that's sort of just the foundational truth of all of this work. And so, it's the utilities that have been on a journey with us, and especially the ones where I see it not just in IT, because IT tends to get it. They're experts in this space and they get the value of the data. But when you see it from a CEO and a CFO and a CHRO and your COO, that they've had this aha moment that we're going to be a digital company, and that the future of our business in power, wherever you sit, relies on your data. And once you've had that aha moment, you can start to imbue that not just in how you enable and empower your IT team, although go do that, they're really important, but in how you enable and empower all of your business units, and in the leadership that you provide out to them: that we're going to become a digital company and that this is happening.

That's really the biggest difference that I see in these companies. So, the companies that have been on that journey for a while have been doing the data work, and so they have these big pools of data. They have well-organized data estates. They're bringing in data from their business processes and from their grid and from their generation. Those are the companies that are already harvesting those really rich insights into the future of their business, into planning the future of their system and operating their system today. And we can absolutely deliver outcomes in AI with utilities that don't have that depth of data. I'm seeing that too, because AI has been that motivator. Folks are saying, "I want to get into that. I want to take advantage of this opportunity to change our business and change how we're doing work." So, there are ways we can get started in AI without that rich data estate, but it's a call to action that we need to go develop it.

And the more that we fill that pool of the right data, and actually leverage AI to get the right data in there and to make these clean data sets, that's where the real outcomes come from. I've been speaking with a lot of utility leadership teams lately, and the equation I always come back to is: outcomes and business insights are a function of compute in close proximity to data. Compute in close proximity to data is where you're going to get the results that you're really looking for.

Stephen Lacey: Could you please provide a list of all those utilities? The vendors in the room will be ready to take notes.

Hanna Grene: Come find me. We work with partners, we're a partner ecosystem.

Stephen Lacey: Okay. So, a couple more questions, two more questions to round out this conversation. We can't have this conversation without talking about the energy intensity of AI, and particularly large language models. Microsoft, Google, and others have seen this massive spike in computing demand inside data centers, which ultimately, of course, has consequences for energy, consequences for decarbonization goals, and for how you're using renewables to meet the electricity demand of those servers. So, what can you say about how you're managing that spike in demand and what it does to make your decarbonization goals more complicated?

Hanna Grene: Yeah, this is an important question and it's an essential part of this conversation. I should say, my team is really focused on delivering innovation and technology in this space, and we partner really closely with the side of our business that's focused on our carbon goals and our energy demand, and on meeting that energy demand for our data centers. We work very closely together. The key thing here is that compute uses energy, uses a lot of energy, and as we go forward into this next stage of AI, we are not backing off of our climate goals. So, we will be facing this challenge, and we're really looking to this industry to work in partnership with us to help think through how we're going to get there. And how are we going to get there? Let me talk for a minute about our climate goals. By 2025, we are committed to meeting our full load with a hundred percent carbon-free electricity.

And that is a robust goal already. But then, when we look out to 2030, we are committed to being carbon negative as a company, and to removing diesel gensets as a backup power source for our data centers, which, by the way, means we need a really, really, really reliable grid globally, everywhere. So, we've got to work with you on that. And then we're also committed to reaching what we call 100/100/0, which is a hundred percent clean energy, a hundred percent of the time, zero carbon. So, that's a time-sensitive carbon goal, because we all know in this room that you don't sit back and measure the energy load, the supply and demand of your grid, on an annual basis. We measure it at 60 hertz, right? And so we won't be going down to 60 hertz, but it will be time-sensitive: how are we meeting that carbon goal in a time-sensitive way? If we look out to 2050, we're committed to removing all of the carbon we've emitted since our founding in 1975, which means we are actively investing in and thinking about how we remove carbon.

So, we're not going to get there alone. I mean, that's the first thing I want to say. We're a large load, and we need a lot of infrastructure growth to help support our growing needs in this space as well. It's our point of view that, in addition to having these carbon goals, we want to be good grid citizens. So it's really imperative to how we approach this work that we do it in partnership with this sector. Those 2030 and 2050 goals, nobody's done that before. It is off-roading. It is new territory, and we absolutely don't plan to figure it out by ourselves. For us, it's really a call to action, and I would invite all of you: we're going to figure this out in partnership, and we are committed to being at that table and to helping think through how we continue to lead in this carbon space, how we continue to be thoughtful grid citizens, and how we grow in this sector.

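The gap between annual accounting and the time-matched 100/100/0 goal Grene describes is easy to see with a toy calculation: on an annual basis the clean supply below more than covers the load, yet hour by hour only part of the load is actually matched. The numbers are invented.

```python
# Toy illustration of annual vs. hourly ("100/100/0"-style) clean-energy matching.
# Hourly load and clean-supply values are invented.
load_mwh  = [100, 100, 100, 100]   # four example hours
clean_mwh = [160, 150,  50,  60]   # e.g. solar-heavy afternoon, thin evening

annual_match = min(sum(clean_mwh) / sum(load_mwh), 1.0)
hourly_match = sum(min(c, l) for c, l in zip(clean_mwh, load_mwh)) / sum(load_mwh)

print(f"Annual matching: {annual_match:.0%}")   # 100% on paper
print(f"Hourly matching: {hourly_match:.0%}")   # only ~78% of load met hour by hour
```
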
Stephen Lacey: So, Microsoft, Google, and others are really positioning themselves as AI-first companies, and this is going to be really crucial to all the collaboration that you're doing with utilities and other energy companies. So, through the end of the decade, where do you think the industry will be in integrating this? Will we see any transformative applications? Will we still be in the experimentation phase? What's your vision for where things are going through the end of the decade?

Hanna Grene: Yeah, we don't have time to still be in the experimentation phase. And so, for me, the excitement around this moment is an opportunity for those of us invested in accelerating the energy transition. Let's harness this excitement. Let's start changing the way that our teams work, the way that we collaborate. Let's partner, whether that's with your controls partners, your metering providers, your technology partners, folks like us. Let's bring the best minds together to start approaching these problems differently, to focus on scale and repeatability in how we're delivering these essential systems around forecasting, DERMS, AMI 2.0. Scale, repeatability, faster: I think those are the words I'm going to keep repeating. And how do we use these tools? Coming back to those four big unlocks that I see AI delivering: focus, efficiency, curiosity, and creativity. Those are superpowers that really light up people and teams and talent.

At the end of the day, that's what's going to make this happen: people and teams and talent and partnerships. And so, for me, we need to harness this moment to drive that action and to drive towards that future. I'm already seeing breakthrough applications in our work and in this sector, so I don't worry about the tech being good enough. I worry about how we change the way we're working and approaching these problems, and how we partner more effectively, getting back to that example of where we're seeing it really work. And then there are so many things that didn't exist when I got started in this space 16 years ago. I'll give you one example. Microsoft has a capability called the Microsoft Planetary Computer, and it's a 24-petabyte data catalog.

Stephen Lacey: It looks similar to the AI image we had up there.

Hanna Grene: Absolutely, but it's in the cloud, it's in Azure, and it's being used for weather prediction. It's being used for wildfire prediction, and it's able to look at large climate models into the future. And so, when I see tool sets like that, with the lens that I have in this space, I look at that and go: if you're working on an IRP, you should be looking at a climate model, so that you're not building next-generation transmission through an area that might be green and wooded today but that in 15 years we expect to be a high-threat wildfire zone. It's just, how do we use these tools to fundamentally change the way that we approach old processes and old ways of doing business, so that we're building the future that we want to live in?

Stephen Lacey: And that's all for the show. Again, that was me talking with Microsoft's Hanna Grene at our Transition-AI: New York event. We're going to have one more really interesting featured conversation next week, on AI and consumer applications. If you want to read a version of this interview, or if you want to check out more analysis on trends in AI and energy, we've got a ton of it at LatitudeMedia.com. Go over there, subscribe to our newsletter, and check out our other coverage on emerging climate tech categories: renewables, storage, carbon removal, AI, etc. And of course, our partners at Canary Media are covering these trends as well; you can go to CanaryMedia.com and subscribe to their newsletter. The Carbon Copy is a co-production of Latitude Media and Canary Media. Dalvin Aboagye is our producer, Sean Marquand is our engineer.

Latitude Media is supported by Prelude Ventures. Prelude backs visionaries accelerating climate innovation that is going to reshape the global economy for the betterment of people and planet, and they're investing across a wide range of categories. You can learn more about their portfolio and investment strategy at preludeventures.com. Give us a rating and review on Apple or Spotify. I know, I know, we ask it every single week, but there's a reason: it's super helpful. So, if you share your thoughts on social media, share a link to this show with a friend or colleague, or give us a rating and review, we appreciate it, and it will help us find new listeners. I'm Stephen Lacey. This is The Carbon Copy. We'll catch you next time.

energy
energy transition
demand response
smart grid
virtual power plants
energy markets
big data
Microsoft
machine learning