Alphabet’s R&D hub is marshaling its computing muscle to model fast-changing electrical grids

X (formerly known as Google X) wants to virtualize power grids to speed clean energy adoption. Will utilities and grid operators get on board?
By Jeff St. John

The control center of Chile's grid operator Coordinador Eléctrico Nacional, X's second grid virtualization partner. (X)

Power grids are incredibly complex systems that operate at the speed of light, making them hard enough for computers to model when they’re built around big central power plants. Replacing those fossil-fueled central power plants with wind and solar adds a whole new range of variables for computer models to calculate. And when you plug in rooftop solar, electric vehicles, backup batteries and other distributed energy resources at the edge of the grid, it upends the traditional models altogether.
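The modeling challenge described here can be pictured with a toy sketch (all numbers and names below are invented for illustration; nothing comes from X or any utility): once weather-driven rooftop solar is subtracted from demand, the "net load" an operator must plan around becomes a stochastic quantity rather than a smooth, predictable curve.

```python
import random

random.seed(0)  # fixed seed so the illustration is repeatable

# Toy model: hourly demand at one distribution feeder (MW).
# Values are illustrative, not real utility data.
base_demand = [5.0 + 2.0 * (1 if 8 <= h <= 20 else 0) for h in range(24)]

# Rooftop solar output depends on cloud cover, which the operator
# can't control or perfectly predict -- a new stochastic variable
# that traditional central-plant models never had to carry.
def solar_output(hour):
    if 7 <= hour <= 18:
        clear_sky = 3.0 * (1 - abs(hour - 12.5) / 6)  # rough midday peak
        cloud_factor = random.uniform(0.3, 1.0)       # weather uncertainty
        return max(clear_sky, 0.0) * cloud_factor
    return 0.0

# Net load = demand minus behind-the-meter solar; this is what the
# grid operator actually has to serve.
net_load = [base_demand[h] - solar_output(h) for h in range(24)]

print(f"Peak net load: {max(net_load):.2f} MW, trough: {min(net_load):.2f} MW")
```

Even this crude sketch shows the shape of the problem: every distributed resource adds another uncertain term to subtract, and a real model must do this for millions of devices, at far finer time resolution, across an entire network.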

These are the kinds of complex computational challenges that X, the “moonshot factory” formerly known as Google X, was created to solve, says Audrey Zibelman, vice president of the new X program focusing on creating digital models of power grids around the world. (X is essentially a research and development arm of Google parent company Alphabet.)

Since its April unveiling, X’s grid program has announced that it is working with one U.S. utility company, AES Corp., on this grid-mapping challenge. Specifically, X is partnering with the AES distribution utilities in Ohio and Indiana.

Last week, X added Coordinador Eléctrico Nacional (CEN), Chile’s transmission grid operator, to its roster of partners. Over the next few years, the duo will work on crunching the data to inform that country’s push to close its coal plants by 2040 and reach net-zero carbon emissions across its economy by 2050.

For Zibelman, whose energy industry career includes serving as New York state’s chief utility regulator and as CEO of Australia’s energy market operator, these partnerships are opportunities to enable the renewable and distributed energy needed to decarbonize power grids over the coming decades.

“The planning tools we have now are at a different level of speed and granularity than the ones we’ll need in the future,” she said. “We’re coming from a grid where the distribution system model could be separate from the transmission system model, and the models could be slower because the generators were a bit slower.” New systems will require “clarity down to the sub-nanosecond, which most models don’t see.”

The world’s hopes of mitigating climate change hinge on decarbonizing those grids as quickly as possible — which means utilities and transmission grid operators need technology that can model how to transform them as quickly as possible.

“We know we’re going to have to spend trillions on infrastructure to decarbonize the power system. The question is how [to] do it most efficiently,” Zibelman said. “If you don’t have the situational awareness of what’s happening on the system, you’re going to sub-optimize the decision-making — at a significant cost to customers.”

Analyzing reams of data at lightning speed

Describing the planning conundrum Chile’s CEN is facing, Zibelman said: “They need enough understanding of how the grid will work when the coal plants are shut down…[and how to] redesign the system so they can have full access to their abundant solar resources in the north” of the country. “Digital tools that can model and test many variations of grid structures, generation and load inputs will help CEN determine which coal plants to retire in what order and when those decisions need to be made.”
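As a purely hypothetical sketch of what such scenario-testing tools do (none of these figures come from CEN or X, and real tools model far more than capacity totals), one can enumerate candidate retirement orders and check each step against a simple firm-capacity constraint:

```python
from itertools import permutations

# Hypothetical sketch: choose an order to retire three coal plants while
# keeping firm capacity above peak demand after every retirement step.
# All numbers are invented for illustration.
coal = {"A": 400, "B": 300, "C": 250}  # MW of firm coal capacity
solar_added_per_step = 350             # MW of new solar built per step
solar_firm_fraction = 0.8              # share of solar counted as firm
peak_demand = 1120                     # MW system peak to cover
other_firm = 300                       # MW of non-coal firm capacity

def feasible(order):
    """Check that firm capacity covers peak demand at every step."""
    remaining = sum(coal.values())
    solar = 0
    for plant in order:
        remaining -= coal[plant]
        solar += solar_added_per_step
        firm = other_firm + remaining + solar * solar_firm_fraction
        if firm < peak_demand:
            return False
    return True

feasible_orders = [o for o in permutations(coal) if feasible(o)]
print(f"{len(feasible_orders)} of 6 retirement orders are feasible")
```

Under these made-up numbers, retiring the two largest plants first dips below the reliability threshold, while orders that interleave the small plant stay feasible — a miniature version of the “which plants, in what order, by when” question the tools are meant to answer at full scale.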

Chile has been a pioneer of several techniques to better integrate solar into the grid, from its groundbreaking grid-scale battery deployments to tapping solar farms to balance instances of grid instability. But with climate change creating even greater disruptions to power grids, the operators need to “make sure that when they’re approving and allowing these things, they have as full a view as [possible] of what will happen,” Zibelman said.

Distribution grids are even more complex than transmission grids, and their operators often lack accurate data on how they’ve been altered over years of operations, upgrades, breakdowns and repairs.

“Sometimes the distribution utilities don’t have a picture of all their investments in the field,” Zibelman said. At the same time, distributed energy resources like rooftop solar, electric vehicles and backup batteries are beginning to supply a significant share of their power. “How is adding rooftop solar…[or] batteries changing things?”

Major grid vendors such as Siemens, General Electric and Hitachi ABB Power Grids are developing “digital twins,” a type of digital grid model designed to capture all of these real-world conditions. But those systems are largely adapted from versions meant to mimic and predict the operation of individual pieces of equipment or power plants. They are likely to face challenges scaling to distribution networks that span thousands of miles, where complexity and uncertainty are compounded by constant alteration from weather, customer behavior and other variables.

“You have to have confidence in the model” to be able to rely on it for investment decisions, Zibelman said.

Freeing the grid data 

These are well-recognized problems for utilities and grid operators. The search for modeling tools that can process more complex grid problems at speeds that make them useful beyond long-range planning has been a major focus of ARPA-E, the Energy Department’s energy R&D organization. DOE laboratories, university researchers, grid giants and startups have all tackled these issues and barriers from different angles.

But grid data is a particularly thorny arena for non-utility actors to get involved in. Utilities tend to jealously guard their data, citing the need to protect customer privacy and the security of their networks against both physical incursions and cyberattacks. Critics have slammed utilities for making it difficult to obtain data to use in ways that can help customers, accusing power companies of slowing progress to prevent third parties from exploring data-using opportunities that they might like to monetize themselves.

Regulators often have to order utilities to make data available, and then order them to comply, whether that’s to share smart-meter data with customers and their chosen third-party service providers, or to open up grid-hosting capacity maps to companies seeking to interconnect solar, batteries and other distributed energy resources.

Google’s mixed track record on energy innovation 

Whether X’s new foray into electric-sector innovation will overcome these challenges is as yet unclear. Google has certainly made waves in the corporate clean energy procurement space, funding gigawatts of new solar and wind power. Its aggressive target of reaching 24/7 clean power across all its data centers and corporate campuses by 2030 has set a new bar for other corporate clean energy buyers and prompted Google’s investments into advanced geothermal energy and load-shifting for data centers.

At the same time, Google’s own efforts to develop technologies for the power grid have not yielded stellar successes. The PowerMeter home electricity tracking system backed by Google.org was canceled in 2011; a $1 million challenge to build smaller and more efficient inverters hasn’t yielded any publicly announced progress; and its RE<C initiative exploring more cost-effective renewable energy technologies was wound down in favor of investing directly in wind, solar and energy storage.

X has its own slate of projects, ranging from Waymo’s driverless cars and Wing’s delivery drones to energy-centric ventures such as home geothermal energy provider Dandelion Energy and molten-salt energy storage developer Malta. But in contrast to those initiatives, the grid-modeling project will require tight coordination with utilities and grid operators — entities not historically known for nimble adoption of new technologies.

Zibelman acknowledged the scale and scope of the challenges involved. “This is very much a moonshot,” she said. “But like most moonshots, it requires collaboration across various fields to start building the tools we’ll need in order to decarbonize and then to make that information available.”

AES Corp., the utility holding company that owns the Ohio and Indiana utilities working with X, has a long-standing relationship with Google, including its deal to provide 90 percent carbon-free power around the clock to Google’s Virginia data centers.

The X grid-modeling program will only use grid data sourced from partner utilities to provide services to those utilities — a notable limitation for an effort linked to a company whose business model revolves around tapping consumer data. In the energy space, Google uses consumer data to power its Nest smart thermostat and home-automation business. It also competes against Amazon in the market for voice-activated devices, which can control home energy use and communicate with utilities.

Zibelman believes the grid-modeling effort might lead to moneymaking opportunities, but she didn’t offer any specifics about how that might happen. “We take some of the world’s hardest problems, where we think there could be a 10x technology solution” — a solution that offers significant value at much lower cost than alternatives — “and we create great companies out of it.”

“It’s also about how visibility can help drive the regulatory system [and] the market system so that everyone can benefit,” she said. “How do we create a valuable business around making the power system cleaner and more efficient and more resilient — and more equitable, making sure everyone has more access to clean energy?”

Jeff St. John is director of news and special projects at Canary Media. He covers innovative grid technologies, rooftop solar and batteries, clean hydrogen, EV charging and more.