Navigant Research Blog

Transmission Superhighway Takes Shape

— October 20, 2014

In a previous blog, I focused on the expansion of high-voltage transmission systems driven by utility-scale wind generation in the multistate arc that stretches across the central United States, from the Texas Panhandle to North Dakota.  Many of us have underestimated the impact and potential of this resource as a contributor to many states’ renewable portfolio standard (RPS) targets.  Headlines about new utility-scale solar projects obscure the fact that installed utility-scale wind capacity is at least 5 times that of solar.

Recently, I looked into the long-term electric transmission plans for every region in the United States and found interesting developments in the Southwest Power Pool (SPP) region.  SPP covers much of the Great Plains and the Southwest, including all or part of an eight-state area: Arkansas, Kansas, Louisiana, Mississippi, Missouri, New Mexico, Oklahoma, and Texas.  SPP’s geographical footprint overlaps slightly with those of other independent system operators (ISOs) and regional transmission organizations (RTOs), such as the Midcontinent Independent System Operator (MISO).  SPP’s footprint can be seen in the map below.

SPP Regional Footprint

(Source: Southwest Power Pool)

In 2008, SPP announced plans to build the electric equivalent of the U.S. Interstate Highway System – an interstate transmission superhighway that would serve as the backbone of a higher-capacity, more resilient grid, providing increased access to low-cost generation, improving electric reliability, and meeting future regional electricity needs.

The SPP transmission plans I reviewed show that this concept is beginning to come to fruition: new 345 kV transmission systems are being built, and older systems are being upgraded.  Many of these projects have been completed by the transmission-owning entities in the region to relieve congestion in corridors such as the Omaha/Kansas City-to-Texas Panhandle route.  The figure below shows recent transmission system builds and upgrades.

SPP Regional Transmission System

(Source: Southwest Power Pool)

On the Horizon  

A recent announcement by ABB of new products with 1,100 kV high-voltage direct current (HVDC) capabilities raises the bar again.  Until this announcement, 765 kV lines were the highest-capacity lines available, and most transmission lines today fall in the 230 kV to 345 kV range.  ABB and other vendors (such as Alstom Grid, General Electric, and Siemens) are focusing on the Asia Pacific markets of China and India, as well as Northern Europe, where major utility-scale wind projects now under construction will need to be connected to urban areas.  ABB’s announcement is exciting because it raises high-voltage capability to a new level, well above what we currently see in the United States.  I can only imagine that ABB will be talking to SPP about how to take the transmission superhighway to the next level.


Epic Electric Transmission Crosses the Rockies

— October 14, 2014

One of the most ambitious high-voltage transmission and utility-scale energy storage projects in history is taking shape in the American West.  The massive plan, recently announced, was designed by Duke American Transmission in partnership with Pathfinder Renewable Wind Energy, Magnum Energy, and Dresser-Rand.  As I discussed in a previous blog, the utility-scale wind generation projects in progress across the High Plains and the Midwest are epic, to say the least.  Transporting this energy to major population centers such as Los Angeles presents major challenges and requires huge transmission system investments.  The intermittency of the wind resource needs to be managed as well.  That is why this proposal represents some very creative thinking and engineering.

Driving cross-country from San Francisco to northern Wisconsin on I-80, I began to better understand the massive geographical challenges that transmission planners and operators face.  Moving twice the power that Hoover Dam produces from Chugwater, outside Cheyenne, Wyoming, to Southern California means building high-voltage direct current (HVDC) transmission lines across mountain passes up to 11,000 feet in Wyoming, and slightly lower passes in Nevada and California.  These lines will take years to fund and build, creating significant opportunities for major suppliers like ABB, which recently announced new 1,100 kV HVDC transmission system capabilities.

Salt Storage

The other really striking part of this announcement is the grid-scale storage project, which proposes to excavate salt caverns in central Utah and use them to store wind energy as huge volumes of compressed air – in effect, a massive battery, larger than any storage system ever built.  Compressed air would be pumped into the caverns at night, when wind generation peaks, and discharged during daytime periods of higher demand.
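The charge/discharge arithmetic behind a scheme like this is simple enough to sketch.  A minimal Python estimate follows; the power level, charging hours, and efficiency are illustrative assumptions, since the announcement discussed here gives no figures beyond the Hoover Dam comparison:

```python
# Back-of-envelope sizing for a compressed air energy storage (CAES)
# cavern.  All numbers are illustrative, not project data.

def caes_daily_output_mwh(charge_power_mw, charge_hours, round_trip_efficiency):
    """Energy recoverable each day from one overnight charging cycle."""
    energy_in = charge_power_mw * charge_hours  # MWh absorbed at night
    return energy_in * round_trip_efficiency    # MWh delivered at daytime peak

# Assume 1,200 MW of surplus night wind for 8 hours, at a ~50%
# round-trip efficiency typical of conventional CAES plants.
print(caes_daily_output_mwh(1200, 8, 0.5))  # 4800.0 MWh
```

The round-trip efficiency term matters: unlike a dam, a compressed air store gives back only a fraction of the energy pumped in, so the caverns must be charged with considerably more wind energy than they deliver at peak.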

The proposal is currently working its way through what may be endless approval processes at the state and federal levels, but a decision could come as soon as 2015.  In many ways, this novel proposal reminds me of the Pacific Gas and Electric (PG&E) Helms pumped storage plant, which has been operating since 1984, storing Diablo Canyon’s nuclear output at night by pumping water up into a lake and then discharging it through turbines for peak generation.  The Duke project could be an epic feat of American power engineering to rival Hoover Dam itself.


On the Grid, Time Is of the Essence

— October 2, 2014

The precise measurement of time is the bedrock of nearly all modern technology, including mobile telecommunications.

In technology, we’re talking really granular time – time measured in milliseconds or less.  Early telecommunications were based upon time-division multiplexing (TDM), and telecom networks today still depend upon TDM’s successors.  Newer smartphones have onboard accelerometers and gyroscopes that measure motion in three dimensions – all time-based.

Electric grids are no exception.  An instructive example is the time-synchronized phasor measurement unit (PMU).  A synchrophasor network is a set of PMUs that measure the phase angle of the alternating current (AC) at various points along a high-voltage network.  Power flows from higher angles to lower angles, so some difference in phase angles (the phase shift) is expected – but not too much.  As wide-area situational awareness tools, synchrophasor networks can supply an early indication that something is amiss in a high-voltage transmission network.  After-the-fact analysis shows that during the Great Northeast Blackout of 2003, phase angles that were normally shifted by 25 degrees had increased to a 135-degree shift.  Had synchrophasors been widely available and deployed at the time, it is likely that much of the outage could have been foreseen and prevented.
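The reason a growing phase shift is so alarming comes from the power-angle relationship for a transmission line.  A rough sketch follows; the voltages and line reactance are made-up round numbers for illustration, not data from the 2003 event:

```python
import math

# Real power across an (idealized, lossless) transmission line depends
# on the phase shift between its ends: P = (V1 * V2 / X) * sin(theta).
# With voltages in kV and reactance in ohms, P comes out in MW.

def line_power_mw(v_send_kv, v_recv_kv, reactance_ohms, shift_deg):
    return (v_send_kv * v_recv_kv / reactance_ohms) * math.sin(math.radians(shift_deg))

normal = line_power_mw(345, 345, 100, 25)    # routine 25-degree shift
extreme = line_power_mw(345, 345, 100, 135)  # shift well past 90 degrees
```

Note that sin(theta) peaks at 90 degrees: past that point, pushing the angle further actually transfers less power and the line loses its ability to hold the two ends in synchronism, which is why shifts approaching 135 degrees signaled a grid near collapse.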

Obey the Time

But time is critical to synchrophasors’ performance.  To ensure coordination, all the PMUs in a network take their time stamps from a common GPS time reference.  Each time stamp is added to the reading and sent to a phasor data concentrator (PDC), typically 30 times per second.  Comparisons at the PDC or other central sites indicate whether phase shifts at each PMU are within expected tolerances.  Out-of-tolerance measurements indicate that immediate action is required.
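The comparison step can be sketched in a few lines of Python.  The site names, the 25-degree tolerance, and the skew limit below are illustrative assumptions, not values taken from any PMU standard:

```python
from dataclasses import dataclass

# Minimal sketch of the check a phasor data concentrator (PDC) performs
# on pairs of time-stamped PMU readings arriving ~30 times per second.

@dataclass
class PmuReading:
    site: str
    gps_timestamp: float   # seconds, on a common GPS-derived time base
    phase_angle_deg: float

def out_of_tolerance(a: PmuReading, b: PmuReading,
                     max_shift_deg: float = 25.0,
                     max_skew_s: float = 1 / 30) -> bool:
    """Flag a pair of readings whose phase shift exceeds tolerance.
    Readings whose time stamps do not line up cannot be compared at all."""
    if abs(a.gps_timestamp - b.gps_timestamp) > max_skew_s:
        raise ValueError("time stamps too far apart to compare")
    return abs(a.phase_angle_deg - b.phase_angle_deg) > max_shift_deg

cleveland = PmuReading("Cleveland", 1000.000, -10.0)
detroit = PmuReading("Detroit", 1000.010, 30.0)
print(out_of_tolerance(cleveland, detroit))  # True: a 40-degree shift
```

The `ValueError` branch is the crux of what follows: without trustworthy time stamps, the PDC cannot even form a valid comparison, let alone decide whether the grid is in trouble.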

Here’s the problem: if the time stamp is unreliable, then a valid comparison of phase angles is impossible.  Synchrophasors are but one example, and there are certainly worse things that could result from the loss of reliable time service – the loss of geospatial information systems, for example.  But the point remains: time is key.

All of this leads to time as an attack surface for smart grids.  PMUs are one of many devices in a grid that rely upon synchronized time to give utilities control of their networks.  Newer clip-on line sensors promise to make distribution management more granular as well, by taking thousands of readings per second.  Again, those readings must be accurately time-stamped to be of any use.

Point of Vulnerability

Disrupt time and you disrupt the grid.  How many ways are there to disrupt the time signal across a synchrophasor network?  Taking out a satellite is an extreme possibility, but there are simpler earthbound approaches.  My paranoid security mind just won’t let me list them in this blog, however.

We depend upon time for much of what we do.  We need time readings to be there for us reliably, down to the millisecond or less.  And yet, time is not defined as a U.S. critical infrastructure sector.  Where is the defense for this irreplaceable asset?

I must credit Frank Prautzsch of Velocity Technology Partners for raising time as an issue at a recent cyber security conference.  Frank’s point, which I hope I have amplified here: while we consider complicated attack scenarios against smart grids, there are some really basic things that must also be defended.  Time is among the most basic of them all.


Power Sector Buzzes with Jargon

— October 2, 2014

As a utilities analyst, I encounter a number of buzzwords – terms that seek to broadly and catchily define the many technologies and approaches developed to modernize the electric grid.  The most common are “smart grid,” “grid 2.0,” and “utility 2.0.”  In this post, I’d like to help myself, and any interested reader, better understand these terms and how they differ.

Supposedly, the term smart grid was coined in 2003 by Andres Carvallo, then the CIO of Austin Energy, to explain the Electric Power Research Institute’s (EPRI’s) Intelligrid – an electric grid that was monitored and managed remotely and incorporated data analytics into processes.  The term didn’t really stick until 2009, when the U.S. Department of Energy (DOE) awarded 99 American utilities a total of $3.4 billion as part of the American Recovery and Reinvestment Act of 2009 (ARRA)-funded Smart Grid Investment Grant program.

So what does smart grid mean?  According to the DOE, it means “computer based remote control and automation … made possible by 2-way communication technology and computer processing.”  Let’s just call it the foundational definition for all of the technological innovation that exists to modernize the electric grid.

One for the Shredder

As for the second, newer term, grid 2.0: it turns out this buzzword never really caught on, and as far as I could ascertain, it’s used synonymously with smart grid.  So we can just throw that one out right now and stop confusing people.

Utility 2.0, on the other hand, is an important conceptual extension of smart grid.  I’m pretty certain I first saw this term last year in reference to microgrids, in a Public Utilities Fortnightly article that explained how different technologies can enable grid resiliency and lessen the impacts of outages.  The term has also been used to describe the concept of utilities revising their decades-old business plans to take advantage of increased renewables generation, distributed energy penetration, advanced demand-side management, and customer engagement.  Last spring, the state of New York introduced its Utility 2.0 plan, which seeks to introduce regulatory incentives for utilities to fundamentally upgrade their business models, operations, and infrastructure.

Ignoring Complexity

The problem with the term utility 2.0 is that, in most cases, it’s used only in reference to how utilities do business, not to the technological and infrastructure considerations that enable this business.  In that sense, it implies that utilities, regulators, and customers are all going to work together, take some financial hits, and pay for and install a smart grid, and it’s all going to be great.  That simple definition ignores the most difficult parts of the process.

We’ve moved past the simple understanding of the smart grid.  We need to better understand the complexity of enabling different systems within the electric grid to function as a cohesive architecture.  This will be a different process for each utility because each system is uniquely configured to adapt to different constraints, and because there are so many different types of offerings out there that are targeted at similar issues.

So, to me, the term utility 2.0 is not just about reshaping business practices and integrating new technologies, such as distributed generation and demand response; it’s the systematic integration of diverse systems that allows each utility to realize its own transformative goals.  This concept, also called interoperability, might be the single most enabling aspect of updating our electric infrastructure.

