Navigant Research Blog

Framing the Smart Grid of the Future

— April 29, 2015

Armed with years of data, utility industry officials are highlighting some of the results from the most ambitious smart grid demonstration project in the United States. One of the key lessons they learned is how difficult it can be to use the latest smart grid hardware to consistently produce high-quality data.

That was the conclusion noted recently by Ron Melton, the director of the Pacific Northwest Smart Grid Demonstration Project and a senior leader at Pacific Northwest National Laboratory (which is operated by Battelle). Launched in 2010, the demo was federally funded under the American Recovery and Reinvestment Act (ARRA) at a cost of $178 million, making it the largest single project of its kind. It spanned five states—Oregon, Washington, Idaho, Montana, and Wyoming—and comprised some 60,000 metered customers, 11 utilities, two universities, and assets in excess of 112 MW. The goal was to test a broad range of ideas and strategies to see if a regional smart grid could lower energy consumption and increase reliability.

Lacking Tools

One of the broad lessons for utilities is that the tools and skills needed to manage the huge volume of data from smart meters and sophisticated grid sensors are largely nonexistent, according to Melton. But the challenge goes beyond merely managing data: the real difficulty is getting consistently good data, so that sensors across the grid can be verified as working properly and key operating decisions can be made based on reliable, high-quality information.
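To make that data-quality challenge concrete, here is a minimal sketch (in Python, with hypothetical field names and plausibility limits, not the project's actual tooling) of the kind of validation a utility might apply to incoming sensor readings before trusting them for operating decisions:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical plausibility limits for a distribution-level voltage sensor (volts).
VOLTAGE_MIN, VOLTAGE_MAX = 110.0, 130.0
MAX_STALENESS = timedelta(minutes=15)

def validate_reading(reading, now=None):
    """Flag a single sensor reading as usable or not.

    `reading` is assumed to look like:
        {"sensor_id": "feeder-12-v1", "timestamp": <datetime>, "voltage": 121.3}
    Returns (ok, reason).
    """
    now = now or datetime.now(timezone.utc)

    # Missing or stale data: the sensor may be offline or its clock may have drifted.
    if reading.get("voltage") is None:
        return False, "missing value"
    if now - reading["timestamp"] > MAX_STALENESS:
        return False, "stale reading"

    # Out-of-range values usually indicate a faulty sensor or a scaling error,
    # not a real grid condition.
    if not (VOLTAGE_MIN <= reading["voltage"] <= VOLTAGE_MAX):
        return False, "out of plausible range"

    return True, "ok"

if __name__ == "__main__":
    sample = {
        "sensor_id": "feeder-12-v1",
        "timestamp": datetime.now(timezone.utc),
        "voltage": 121.3,
    }
    print(validate_reading(sample))  # (True, 'ok')
```

Simple checks like these are only the first layer; the harder problem the project surfaced is doing this consistently across thousands of devices from different vendors.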

Transactive Control

One of the core technologies used in the project is called transactive control, which is, in essence, two-way communication between electricity generation and end-use devices such as electric water heaters, furnaces, and clothes dryers. The control signals communicate the price of delivering power to a device at a specific time, and the device can decide when to use electricity—with the owner's consent, of course. This is the underlying technology for demand response (a topic discussed in detail in Navigant Research's report, Demand Response Enabling Technologies). Project managers were able to show that transactive control works and could theoretically reduce peak power costs in the Pacific Northwest by about 4%. But, as Melton says, this would require about 30% of demand on the system to be able to respond in this way. Getting there will take a concerted effort to clearly show the value streams to all parties and then figure out the financial incentives.
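As a rough illustration of the transactive idea (not the project's actual control algorithm), the sketch below shows how a smart water heater might weigh an incoming price signal against how urgently it needs to heat; the thresholds, prices, and parameter names are hypothetical:

```python
def should_heat(price_signal, tank_temp_f, setpoint_f=125.0, min_temp_f=110.0,
                base_willingness=0.10, urgency_per_degree=0.02):
    """Decide whether the water heater consumes power in the next interval.

    price_signal -- delivered-energy price for this interval ($/kWh), sent by the grid
    tank_temp_f  -- current tank temperature (deg F)

    The device's willingness to pay rises as the tank cools toward the comfort
    floor, so it defers heating when prices spike but never lets the tank drop
    below min_temp_f.
    """
    if tank_temp_f <= min_temp_f:
        return True   # comfort floor reached: heat regardless of price
    if tank_temp_f >= setpoint_f:
        return False  # already at setpoint: no need to consume

    # Willingness to pay grows linearly as the tank cools below the setpoint.
    willingness = base_willingness + urgency_per_degree * (setpoint_f - tank_temp_f)
    return price_signal <= willingness

# Example: at $0.30/kWh a lukewarm tank waits, but a cooler one heats anyway.
print(should_heat(price_signal=0.30, tank_temp_f=122.0))  # False -> defer
print(should_heat(price_signal=0.30, tank_temp_f=112.0))  # True  -> heat
```

The point of the price signal is exactly this kind of local decision: each device trades off cost against its own constraints, and the aggregate effect is what shaves the peak.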

Clearly, utilities are still in the early phase of the smart grid and handling big (and small) data in new ways is often uncharted territory. Nonetheless, this demo highlights the framework on which the future grid—what we at Navigant Research see as the energy cloud—will be built, and the steps necessary as the grid of tomorrow emerges.

 

The Impacts of the Evolving Energy Cloud

— April 9, 2015

In my July 2014 blog, I discussed how utilities should play both offense and defense as the energy cloud evolves and transforms the energy sector. Navigant Research's new white paper, authored by Mackinnon Lawrence and Eric Woods, provides an update on the evolution of the energy cloud. To summarize, we foresee the strategic, business model, and operational impacts on incumbent utilities increasing, especially as new entrants take on important roles in states like Hawaii, California, Arizona, Colorado, New York, New Jersey, and the Carolinas.

Distributed energy resources (as detailed in Navigant Research's report, Global Distributed Generation Deployment Forecast) and renewables will continue to grow exponentially worldwide over the next 5–10 years, driven by expanding customer choices and a rapidly changing technology landscape. This will dramatically affect utilities' customer relationships and increase the complexity of their operations as distributed, intermittent, renewable energy resources spread and the grid becomes increasingly digitized. Below is an overview of the themes we see evolving most rapidly.

Customer Relationships: The further evolution of distributed generation, energy efficiency, demand-side management, demand response, smart metering, behind-the-meter energy management systems, and social media will drastically change the way utilities interact with their customers—many of whom will generate their own power, sell power back into the grid, and plug in their electric vehicles at night. These increasingly sophisticated energy customers expect increased self-service and new products and services, which in turn will require innovative front- and back-office customer operations. This is likely to lead, in many cases, to a strategic pivot in how utilities proactively engage with customers.

Operations: Increasing the return on capital investments and reducing operating expenditures has historically been a priority for utilities. As the energy cloud revolution spreads, the importance of managing assets and capital will only increase. Utilities must give special consideration to managing assets, particularly procurement and the decommissioning of stranded assets. Additionally, utilities will look to build or acquire distributed energy resources and other disruptive technologies that transform day-to-day grid operations while maintaining security and reliability through climate change and other major shifts.

Regulation: All of this will also have a profound impact on regulatory policy, raising the question: will current deregulated market structures be forced to change? The utility industry is vital for the global economy, and is regulated as such. As the energy cloud matures, the regulatory environment can and must change. For a more detailed examination of likely regulatory shifts, please see this blog by Mackinnon Lawrence.

Ultimately, the objective is to provide safe, reliable, and affordable service to customers. But a fragmented landscape of players (developers, producers and operators, wholesale and retail) will drive the need for organizational, infrastructural, process, and data integration and coordination across the power value chain, and that coordination could carry significant cost in a highly distributed energy infrastructure. It will be very interesting to see how markets evolve as the energy cloud transformation takes hold. More to come…

Mackinnon Lawrence contributed to this blog.

 

Regulating the Energy Cloud

— April 8, 2015

As discussed previously on the Navigant Research blog (Offense and Defense and Open or Closed?), the electrical grid is evolving toward an energy cloud model in which two-way energy flows support distributed energy resource (DER) integration, transactive energy, and other complex market structures and transactions. Representing a platform on which stakeholders will engage to facilitate greater coordination and sophistication in selling and consuming energy, this network of networks has the potential to be far more flexible, dynamic, and resilient than the traditional grid. These changes are detailed in Navigant Research’s recent white paper, The Energy Cloud.

But ensuring reliable, safe, and cost-effective service—the core focus of utilities—is not guaranteed merely by the energy cloud’s emergence. An enforceable regulatory model will need to emerge that balances innovation and the economic benefits of open market competition with the need to maintain interoperability and coordination across a network made up of many layers of disparate elements.

Avant Garde

Today, we’re witnessing a period of rapid experimentation with respect to regulating utilities of the future in the energy cloud. The dramatic rise of DER on the grid has pressed regulators and utilities alike to respond.

Although many factors will determine how the energy cloud will evolve across different markets, states like New York, Hawaii, and California remain at the vanguard of regulatory experimentation in the United States.  In New York, for example, regulators have proposed initiatives to transform utilities into so-called distributed systems platform providers that would act as the interface between consumers and the bulk power system. Other states (e.g., Massachusetts, Minnesota, and the Carolinas) are beginning to explore alternative models as well, but have yet to challenge the utility’s role as the owner and operator of the distribution system.

The issue of who ultimately owns the distribution grid is at the heart of the energy cloud’s evolution. On one hand, the government and the public want to increase competition to achieve lower cost and additional service for customers. On the other hand, the increased complexity and cost to manage distributed intermittent resources across the grid could drive reregulation and consolidation.

Independent Operator

In the latter case, few challenge the need for a centralized authority to manage those distributed, intermittent resources. Proponents of this approach, such as former Federal Energy Regulatory Commission (FERC) chairman Jon Wellinghoff, support the creation of an independent distribution system operator to manage the distribution grid even if utilities ultimately own the system.

This trend stands in direct opposition to a historic transition toward deregulation in the United States that is already underway. Deregulated markets are expected to allow for more experimentation with respect to business models, thus creating a competitive market for power generation and allowing retail customers to decide who supplies their electricity. However, a lack of standardization and coordination within these markets could make it much more difficult to ensure reliable, safe, and cost-effective operation due to a high level of market fragmentation. The growing footprint of DER, for example, requires much tighter integration and a stronger coordination of demand and supply across the energy value chain.

While it might seem that over-regulation or reregulation would stifle innovation, with respect to the energy cloud, the opposite may in fact prove true. Regulated markets, it turns out, may provide more stable platforms for a coordinated rollout of energy cloud infrastructure and capabilities.

Jan Vrins contributed to this blog.

 

Virtual Power Plants Harness the Power of the Energy Cloud

— May 29, 2014

Among the elements of the emerging energy cloud – i.e., the assembly of dynamic networks that enhance the efficient allocation of distributed energy resource (DER) benefits across a broad customer base – virtual power plants (VPPs) are some of the most powerful and flexible.  Enabling power providers to take advantage of economies of scale through aggregation and optimization, VPPs maximize the value of electrons flowing across the system.  Schneider Electric, one of a long list of companies exploring the VPP opportunity, likes to use the analogy of Amazon when discussing VPPs: while the store may be virtual, the assets delivered, whether books or CDs or electricity, are real.

The primary goal of a VPP is to achieve the greatest possible profit for asset owners while at the same time maintaining the proper balance of the electricity grid.
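A minimal sketch of that aggregation-and-optimization idea, using hypothetical assets rather than any vendor's actual dispatch engine: the VPP fills a balancing request from the cheapest flexible capacity first, which is where aggregation's economies of scale come from.

```python
def dispatch(assets, request_kw):
    """Allocate a balancing request across aggregated assets, cheapest first.

    assets     -- list of dicts: {"name": str, "capacity_kw": float, "cost_per_kwh": float}
    request_kw -- power the grid operator asks the VPP to deliver (kW)
    Returns the dispatch plan as (name, kW) pairs and the marginal cost per hour.
    """
    plan, remaining, total_cost = [], request_kw, 0.0
    for asset in sorted(assets, key=lambda a: a["cost_per_kwh"]):
        if remaining <= 0:
            break
        take = min(asset["capacity_kw"], remaining)
        plan.append((asset["name"], take))
        total_cost += take * asset["cost_per_kwh"]
        remaining -= take
    if remaining > 0:
        raise ValueError(f"Aggregated fleet is {remaining:.0f} kW short of the request")
    return plan, total_cost

# Hypothetical fleet: a battery, a demand response block, and a backup generator.
fleet = [
    {"name": "battery",          "capacity_kw": 400, "cost_per_kwh": 0.05},
    {"name": "dr_water_heaters", "capacity_kw": 250, "cost_per_kwh": 0.08},
    {"name": "backup_genset",    "capacity_kw": 500, "cost_per_kwh": 0.22},
]
plan, cost = dispatch(fleet, request_kw=700)
print(plan)  # [('battery', 400), ('dr_water_heaters', 250), ('backup_genset', 50)]
print(f"${cost:.2f}/h at the margin")
```

Real VPP platforms layer forecasting, market bidding, and network constraints on top of this, but the core economics are the same: aggregate many small, flexible assets so they can be optimized as one portfolio.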

Navigant Research's new analysis, which tracks spending on software, networking products, and services for VPPs, forecasts that the market will grow from just over $1 billion in annual revenue in 2014 to more than $5.3 billion by 2023.

Total VPP Vendor Annual Revenues, Base Scenario, World Markets: 2014-2023

 

(Source: Navigant Research)

Unifying the Cloud

Vendors such as Ventyx, a subsidiary of ABB, now offer asset performance software for managing assets and operations, as well as smart grid analytics, as cloud-based software-as-a-service (SaaS) – the ultimate virtualization of our energy services.  Today, virtually every major regional power grid in the United States relies on Ventyx's software analytics to manage complexity at the transmission level.  Yet the company is moving away from customized software solutions toward a more standardized, unified smart grid architecture that reaches down to the retail customer level.

In May, Ventyx announced that it will roll out some of its product offerings via Microsoft's Azure cloud platform.  Asset Health, the predictive analytics component of Ventyx Asset Performance Management, is already available as SaaS on the Ventyx website.  It is offered under a single quarterly subscription fee, delivered via Azure, and accessed from the customer's premises over the Internet.  The company's cloud-based demand response management system service, developed in collaboration with Deutsche Telekom, has also been commercialized at the T-City project in Friedrichshafen, Germany.  Additional Ventyx Asset Performance Management applications will be available in the cloud over the coming months.

This move is significant for the growth of VPPs because it will enable electric utilities and power generation companies to invest in smart grid functionality without costly investments in IT infrastructure, workforce, and ongoing maintenance.  According to Ventyx, the cloud model is also highly configurable, highly secure, and highly scalable.

Navigant Research's webinar, "The Energy Cloud," will explore VPPs and other elements of this emerging distributed architecture on June 3 at 2 p.m. ET.

Taylor Embury contributed to this blog.

 
