Navigant Research Blog

Framing the Smart Grid of the Future

Neil Strother — April 29, 2015

Armed with years of data, utility industry officials are highlighting some of the results from the most ambitious smart grid demonstration project in the United States. One of the key lessons they learned is how difficult it can be to get the latest smart grid hardware to consistently produce high-quality data.

That was the conclusion noted recently by Ron Melton, the director of the Pacific Northwest Smart Grid Demonstration Project and a senior leader at Pacific Northwest National Laboratory (which is operated by Battelle). Launched in 2010, the demo was federally funded under the American Recovery and Reinvestment Act (ARRA) at a cost of $178 million, making it the largest single project of its kind. It spanned five states—Oregon, Washington, Idaho, Montana, and Wyoming—and comprised some 60,000 metered customers, 11 utilities, two universities, and assets in excess of 112 MW. The goal was to test a broad range of ideas and strategies to see if a regional smart grid could lower energy consumption and increase reliability.

Lacking Tools

One of the broad lessons for utilities is that the tools and skills needed to manage the huge volume of data from smart meters and sophisticated grid sensors are largely nonexistent, according to Melton. But the challenge goes beyond merely managing data: utilities must get consistently good data, so that they know sensors across the grid are working properly and can make key operating decisions based on reliable, high-quality information.

Transactive Control

One of the core technologies used in the project is called transactive control, which in essence is two-way communication between electricity generation and end-use devices, such as electric water heaters, furnaces, and clothes dryers. The control signals communicate the price of delivering power to a device at a specific time, and the device can decide when to use electricity—with the owner’s consent, of course. This is the underlying technology for demand response (a topic discussed in detail in Navigant Research’s report, Demand Response Enabling Technologies). Project managers were able to show that transactive control works and could theoretically reduce peak power costs in the Pacific Northwest by about 4%. But, as Melton says, this would require about 30% of demand on the system to be able to respond in this way. Getting there will take a concerted effort to clearly show the value streams to all parties and then figure out the financial incentives.
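To make the idea concrete, here is a minimal Python sketch of the price-responsive logic described above. This is not the project’s actual implementation; the class names, the price signal format, and the temperature and price thresholds are all hypothetical, chosen only to illustrate how a device might defer consumption when delivery is expensive while still honoring owner comfort limits:

```python
"""Minimal sketch of a transactive-control decision, assuming a
hypothetical water heater that receives a price signal and decides
whether to draw power now or wait."""

from dataclasses import dataclass


@dataclass
class PriceSignal:
    """Hypothetical signal: the cost of delivering power to this
    device right now, in cents per kWh."""
    cents_per_kwh: float


@dataclass
class WaterHeater:
    temp_f: float                    # current tank temperature (F)
    min_temp_f: float = 110.0        # comfort floor; must heat below this
    max_temp_f: float = 130.0        # setpoint ceiling; never heat above
    price_threshold: float = 12.0    # owner-consented price cap (cents/kWh)

    def should_heat(self, signal: PriceSignal) -> bool:
        # Always heat when below the comfort floor, regardless of price.
        if self.temp_f < self.min_temp_f:
            return True
        # Between the floor and the ceiling, defer consumption
        # whenever power is more expensive than the owner's cap.
        if self.temp_f < self.max_temp_f:
            return signal.cents_per_kwh <= self.price_threshold
        # At or above the ceiling, never heat.
        return False


# Example: at 118 F the heater waits out a 15-cent peak price...
heater = WaterHeater(temp_f=118.0)
print(heater.should_heat(PriceSignal(cents_per_kwh=15.0)))  # False
# ...but runs once the price falls back to 8 cents.
print(heater.should_heat(PriceSignal(cents_per_kwh=8.0)))   # True
```

Aggregated across thousands of such devices, this kind of deferral is what shifts load off the peak; the “two-way” part of the scheme is that the devices’ responses feed back into the price signals the grid sends next.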

Clearly, utilities are still in the early phase of the smart grid, and handling big (and small) data in new ways is often uncharted territory. Nonetheless, this demo highlights the framework on which the future grid—what we at Navigant Research see as the energy cloud—will be built, and the steps necessary as the grid of tomorrow emerges.
