Navigant Research Blog

Exploiting Continuous Improvement to Achieve Transformation and Efficiency Goals: Part 2

— June 21, 2018

In my last blog, I discussed the forces at play that are fundamentally transforming the utility industry. At the center of this transformation is the shift in the way electricity is generated and distributed, and the evolution of the traditional relationship among stakeholders across the electrical grid, particularly between utilities and their customers.

In this environment, many utilities are adopting programs focused on innovation, understanding that new and bold thinking is required to successfully address these forces of transformation. In a recent survey, the ability to “market new energy products and services” and to “radically improve ability to innovate” were among the top-ranked capabilities that utilities should develop to meet future challenges.

How Can Utilities Implement Innovation?

However, while many utilities have a familiarity with and muscle memory for Continuous Improvement and the pursuit of incremental quality, fewer are comfortable with the process of rapid innovation. Historically, utilities have not been paid for innovation; the legacy utility business model and regulatory framework have emphasized stability and risk aversion, exemplified by the rate of return financial construct. While the need to deliver safe, reliable, and cost-effective services will always remain at the core of every utility’s responsibility, how those objectives are achieved is undergoing a fundamental evolution that will require innovation in multiple dimensions.

There is much that utilities can learn from companies in the automotive, consumer electronics, publishing, and other sectors when considering how best to successfully adopt innovation practices. The speed of transformation in these and other sectors confirms that innovation efforts must be designed, deployed, and yield real benefits within a new business model. Because adopting an innovation practice is a question of culture change, it is important for utilities to consider the internal resources they have available when seeking to implement an innovation process. And here is the linkage between Continuous Improvement and innovation: Continuous Improvement practitioners can be a driving force for successful adoption of new innovation practices. Here’s how:

(Source: Navigant)

The core tool kit of Continuous Improvement practitioners can be essential to the design, development, and integration of innovation practices into utility operations—and can help those programs yield results.

In my next blog, I will consider how Continuous Improvement in utilities will need to evolve to meet the demands of a rapidly changing sector. Change management, agile, scrum, “outside in,” and other techniques and ways of thinking will be required to ensure success. These and other topics will be considered at the Change Management for Utilities (West) and Process Excellence for Utilities (West) Conferences.

 

New Solar Records Bring the Real Energy Transformation into the Light

— June 19, 2018

Not long ago, utilities fought the introduction of renewables into their markets. At the time, they didn’t really understand that the real energy transformation we are seeing today is consumer choice, including, in some markets, the potential for full individual energy independence driven by technology innovation.

Now utilities are exploiting the falling costs of solar to fend off the upcoming competition.

PPA Prices Are Crashing, Despite New Tariffs

Despite all the talk about the new import tariffs and their effect on the installed cost of new PV installations, it appears that developers have squeezed procurement costs to continue breaking records for the lowest cost projects in the country. On June 11, Central Arizona Project (CAP), a local utility, signed a 20-year power purchase agreement (PPA) with Origis Energy’s subsidiary AZ Solar 1 for a record low $0.0249/kWh.

A new record came only a day later. On June 12, NV Energy managed to break the $0.024/kWh floor, with a 25-year PPA signed for $0.0237/kWh. The Eagle Shadow Mountain solar project will have a total capacity of 300 MW and is being developed by 8minutenergy. This project is part of a 1 GW solar PV procurement process run by NV Energy as part of its strategy to become the first 100% renewable utility in the US. The procurement process also called for 100 MW/400 MWh of battery storage.
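To put these record prices in rough perspective, the announced capacity and PPA price can be turned into a back-of-envelope revenue estimate. The 30% capacity factor below is an illustrative assumption for utility-scale solar in Nevada, not a figure from the announcement:

```python
# Rough scale of the Eagle Shadow Mountain PPA (illustrative only).
# Capacity and price come from the announcement; the capacity factor
# is an assumed, plausible value for single-axis-tracking solar in Nevada.
capacity_mw = 300            # announced project capacity
ppa_price_per_kwh = 0.0237   # announced 25-year PPA price
capacity_factor = 0.30       # assumption, not a reported figure

hours_per_year = 8760
annual_mwh = capacity_mw * hours_per_year * capacity_factor
annual_revenue = annual_mwh * 1000 * ppa_price_per_kwh  # MWh -> kWh

print(f"Estimated annual generation: {annual_mwh:,.0f} MWh")
print(f"Estimated annual PPA revenue: ${annual_revenue / 1e6:.1f} million")
```

Under these assumptions the project would earn on the order of $19 million per year, which illustrates how thin the margins are at these record-low prices.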

Saving the Monopoly

NV Energy’s push for 100% renewables does come with a caveat. The final green light to invest in these projects depends on the result of an energy choice initiative that will be voted on in Nevada this year. If it is accepted, the project could be cancelled. NV Energy was also behind another controversial law, which eliminated net metering in the state in 2015.

This is part of a wider trend in which US utilities are increasingly deliberating not between new conventional generation and renewables, but between central and distributed resources. Behind this issue, utilities are also weighing control of the relationship with end consumers.

Maintaining control of the end user is key as the industry transitions from one in which the value was in the generation and transmission of electricity to one in which the value will lie in the services that can be provided in addition to electricity.

Central Renewables versus Distributed Renewables

NV Energy understands that—at least in a resource-rich state like Nevada—the future is solar. Consequently, it wants to position itself as a clean and potentially cheap provider of electricity in the state in the eyes of the ballot voters this year. If it succeeds, it might dodge the most dangerous bullet for a utility—the opening of the retail market to competition and the potential race to the bottom that it could spark. But even if NV Energy succeeds in maintaining its monopoly status, consumers will retain some choice—installing their own solar and producing their own electricity.

 

What Exactly Is AI? Does It Matter?

— June 19, 2018

The more I read about artificial intelligence (AI), the less clear it becomes what people mean when they talk about AI. And while semantic arguments can be a waste of everyone’s time, a loose definition of AI presents an opportunity for vendors to package software into something that it isn’t. Blockchain and AI vie for position as the most overhyped term in technology today. But blockchain at least benefits from a common understanding of the underlying technology, even if its potential uses are severely overstated. Any buyer of AI-powered software should beware of strangers bearing gifts: always make sure that AI products are capable of what you want them to do, not what the vendor claims.

AI Beats Expectations to Be Simultaneously Everything and Nothing

Everyone has a definition of AI. Your perspective, knowledge of analytics, and desperation to sell some technology have a great bearing on what constitutes AI. I have seen definitions that range from “any code that includes an IF statement” to “nothing currently in existence.”

Technology marketers can never be accused of reticence when it comes to latching on to the latest industry buzzword. AI’s loose definition means it can be applied to virtually any piece of technology. An IF statement in computer code means that a computer will make a decision based on some form of data input. At a very basic level, this is an intelligent decision.

Many disagree and claim the boundary for what constitutes AI lies in more sophisticated analytics processes. Essentially, people look for examples of a computer mimicking human thought. The big question is: How close to human thought does a computer have to get before it is considered capable of “thought?” Indeed, each time a new development happens in the field of advanced analytics, such as AlphaGo beating a human at the notoriously complex game of Go, someone will always say that it’s not true AI. This trend is summed up in Larry Tesler’s theorem that AI is “whatever hasn’t been done yet.”

The Paradoxical Definition of AI

I have no idea what AI is supposed to be, and I believe that this uncertainty stems from the blurred boundaries of its definition. The sorites paradox explains this well: Starting with one grain of wheat, how many more grains must be added before one has a heap? Similarly, there is no clear answer to the questions around what constitutes intelligence, when AI takes over from basic tech-based processing, or how close to human thought tech processing must get before it can be deemed even artificially intelligent. Some may propose specific demarcation; others immediately disagree.

What Matters Most Is That None of This Matters

We often start reports on technology with a definition to help the reader better understand the boundaries of what we write about. I have spent far too long trying to come up with a definition of AI with which I am comfortable. I can’t come up with a sensible boundary for what does and doesn’t constitute AI, and even if I did, more people will disagree than agree. But who cares? AI really means the latest and shiniest analytics product to hit the market.

What buyers should bear in mind is whether this technology does the job for which it was intended at a competitive price. Just make sure that anything painted in AI colors is going to make a decent ROI for your business.

 

It’s Time Critical Infrastructure Had a Standardized, Licensed Spectrum Band

— June 12, 2018

US power utilities are grappling with challenges that include growing distributed energy resource (DER) penetration (e.g., solar and EVs), business model disruption, and low to no load growth. Navigant Research has covered the utility business model extensively in its Energy Cloud thought leadership series, and has found that it will shift markedly over the next 10-20 years. A more decentralized and services-centric model will emerge where value is created not by the delivery of electrons but rather by the provision of services. This shift will bring a multitude of challenges to electric utilities.

To meet these challenges, utilities will need extensive networking capabilities to support applications ranging from visibility into behind-the-meter energy generation to enhanced customer engagement and more. While in the past utilities have tended to build ad hoc, application-specific, silo-based networks using unlicensed or narrowband licensed frequencies, going forward this approach will not suffice.

Critical Networks Stand to Benefit from a Nationwide Licensed Broadband Spectrum

During the next decade, Navigant Research expects the number of connected devices within the average utility to grow by an order of magnitude—at least—and the volume of data coming from each connected device will also climb. At the same time, the number of non-utility connected devices using unlicensed spectrum bands will increase by 400% or more.

The exponentially growing use of unlicensed bands could affect utility network performance and grid reliability. Add in the numerous operating and business model pressures that utilities today face, and it becomes clear that there is a pressing need for a nationwide, standardized licensed spectrum band around which power utilities and other critical infrastructure providers can build their critical networks.

A nationwide licensed broadband spectrum allocation combined with buildouts based on commercial technology standards will ensure that utilities are able to implement the solutions necessary to manage not only reliable and affordable power delivery, but also financial stability in the rapidly changing energy economy.

Utility Networking Technology Selection Map

(Sources: Navigant Research, Electric Power Research Institute)

If utilities and regulatory bodies can work together to rally around a single spectrum band, this would solve an urgent and growing need of the industry and facilitate roaming and cooperation by personnel from disparate utilities. Vendors could then standardize on the band, bringing substantial economies of scale—particularly if the de facto commercial wireless technology standard, LTE, is used.

For more on this topic, read Navigant Research’s new white paper, sponsored by pdvWireless: The Urgent Need for a Licensed Broadband Spectrum Allocation for Critical Infrastructure. The paper describes the critical drivers behind utilities’ need for licensed, standardized broadband spectrum in the US, including DER integration, growing concern over grid resiliency and the increased risk of cyberattack, and the changing requirements that competition places on customer service organizations.

 
