Navigant Research Blog

Show Me the Savings: The Need for Post-Retrofit Data on Commercial Building Retrofits

— August 31, 2010

The energy efficiency retrofit industry for public buildings is relatively well developed worldwide compared to the private building retrofit industry. It is a large market in the United States, with annual revenues for energy service companies in the vicinity of U.S. $4 billion. However, long-term energy efficiency and carbon mitigation targets worldwide will rely heavily on improving the efficiency of the entire building stock. Public buildings represent only about a quarter of the total commercial building stock in the United States, and retrofits have barely begun to touch the private building stock.

One reason for the sluggish uptake of efficiency retrofits in the private building stock is the lack of post-retrofit data on building performance. Although there are many successful examples of retrofits in the private sector, the industry as a whole needs a robust set of data on post-retrofit performance and payback before building owners and investors will be convinced that the opportunity to reduce operating costs is real, the risks are low, and the ROI is high enough to justify investments in efficiency.

Today, efficiency retrofits are typically justified with a predictive model: building engineers estimate energy use from the building’s equipment, envelope, climate, and usage patterns. However, the actual performance of the building often diverges significantly from the model, and this discrepancy makes private building owners reluctant to invest in energy efficiency. It also makes financial institutions unwilling to provide the financing needed to support efficiency projects, because the perceived risk (that buildings will not meet their predicted efficiency levels) is too high.
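
Part of what post-retrofit data enables is straightforward measurement and verification: fit a weather-normalized baseline to pre-retrofit utility bills, then compare metered post-retrofit use against that baseline. The sketch below illustrates the calculation only; the degree-day model, the function names, and all of the billing figures are hypothetical.

```python
# Minimal sketch of a weather-normalized savings check (illustrative figures only).
# Assumption: monthly energy use is roughly linear in heating degree days (HDD),
# so a pre-retrofit baseline can be fit and compared against metered post-retrofit use.

def fit_baseline(hdd, kwh):
    """Least-squares fit of kwh ~= base_load + slope * hdd (one-variable regression)."""
    n = len(hdd)
    mean_h, mean_k = sum(hdd) / n, sum(kwh) / n
    slope = sum((h - mean_h) * (k - mean_k) for h, k in zip(hdd, kwh)) / \
            sum((h - mean_h) ** 2 for h in hdd)
    base_load = mean_k - slope * mean_h
    return base_load, slope

# Hypothetical pre-retrofit billing data: 12 months of HDD and metered kWh.
pre_hdd = [620, 540, 430, 250, 120, 30, 10, 15, 90, 280, 470, 600]
pre_kwh = [41000, 38500, 35000, 29000, 25000, 22500, 22000, 22200, 24500, 30000, 36000, 40500]

base_load, slope = fit_baseline(pre_hdd, pre_kwh)

# Post-retrofit: compare metered use against the baseline adjusted for that month's weather.
post_hdd, post_kwh = 610, 33500
expected_kwh = base_load + slope * post_hdd
print(f"Weather-adjusted expected use: {expected_kwh:.0f} kWh, metered: {post_kwh} kWh, "
      f"measured savings: {expected_kwh - post_kwh:.0f} kWh")
```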

There is also the issue of payback period. Whereas the public sector is willing to accept payback periods of 10 to 15 years, the private sector rarely accepts paybacks of more than 4 years. Additional post-retrofit performance data would ease concerns about the paybacks of certain measures and reduce the perceived risks. Today there is a tendency toward “cream skimming,” or selecting only the measures with the fastest paybacks (such as lighting retrofits, energy management and control systems, and retrocommissioning). Reliable data on the ROI of all efficiency measures would give investors the confidence they need to invest in efficiency.
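
To see why “cream skimming” happens, compare simple paybacks across a handful of measures. The costs and savings below are hypothetical and serve only to illustrate the arithmetic behind a 4-year private-sector hurdle.

```python
# Simple-payback comparison across efficiency measures (hypothetical figures).
measures = {
    "lighting retrofit":    {"cost": 80_000,  "annual_savings": 32_000},
    "retrocommissioning":   {"cost": 50_000,  "annual_savings": 20_000},
    "chiller replacement":  {"cost": 400_000, "annual_savings": 45_000},
    "envelope upgrade":     {"cost": 600_000, "annual_savings": 50_000},
}

for name, m in measures.items():
    payback_years = m["cost"] / m["annual_savings"]
    verdict = "clears a 4-year hurdle" if payback_years <= 4 else "needs 10-15 year patience"
    print(f"{name:20s} {payback_years:5.1f} yr  ({verdict})")
```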

The industry is just now starting to take the first few steps toward addressing this major barrier. For example, the Deutsche Bank Americas Foundation, a philanthropic arm of Deutsche Bank, is starting to compile a set of data on several hundred buildings in New York City. Gary Hattem, the president of the foundation, argues that “if underwriters can determine a predictable savings from retrofits, then they can create a financial instrument backed by these savings to sell on the open market.” In other words, data on post-retrofit building performance would reduce the perceived risks and free up capital for efficiency.

If the Deutsche Bank Americas Foundation and other groups can compile post-retrofit performance data on commercial buildings, it will go a long way toward moving the needle and lowering the barriers for private building owners to begin investing in energy efficiency.




We hacked the smart meter! Tell the world!

— August 31, 2010

Why are we all so excited about Smart Grids anyway? Potential. Smart Grids hold the potential to save consumers money on their energy bills. To save utilities capital expenditure on new generation. To guarantee energy independence for our nation – for your nation, wherever you are. To do more with less energy. Some believe that Smart Grids will save the earth. There is just so much potential!

Bear Bryant once said, “Potential means you ain’t done it yet.” Others heard him say, “Potential is what gets you fired.”

Where is that more true than with Smart Grids? We certainly ain’t done it yet, and several jobs – and political careers – are being risked in their name. So it may be worth a diversion from our day-to-day worries to consider what could prevent us from realizing Smart Grids’ potential.

Here’s one possibility: Fear. And who’s afraid? Ratepayers. Ratepayers that elect politicians.

Anything that sows fear into the hearts of ratepayers – consumers – can make its way back to elected representatives. Those politicians, wishing to remain in office, will oblige their constituents and work to retard Smart Grid deployment. We know that this can happen. Already we have seen several Smart Grid deployments stalled by the outcry over increased energy bills.

Now consider another fear: terrorism. The press reports regularly that smart meters have been hacked. Perhaps the first major story was when a well-known security research firm showed us in 2009 that a smart meter could be successfully hacked.

But they didn’t really show us that a smart meter could be hacked. We already knew that. That any IT device can be hacked is a given – it’s part of our daily lives. Their research showed us how. Full credit for that – it’s valuable research that can only lead to better security. Better security because the threats are better understood, and better security because some meter manufacturers are blasted out of their complacency. That’s okay. This is extremely important work that needs to be done.

What’s not okay is going on national television with a sensational story that smart meters have been hacked. Of course they have. If you have sold 100,000 units of hardware or software and it has not been hacked, there are two plausible explanations:

• It cannot be hacked because it doesn’t do anything

• It actually has been hacked but you haven’t discovered the hack

So what is it to go on TV and tell an uninformed public that doesn’t understand IT that their electric meters can be hacked? How about sensationalism? Or sowing fear? Is all this negative publicity for smart metering bad? Yes.

The public already fears smart meters, after stories in several states of electricity bills suddenly doubling when smart meters were installed. The explanations often turn out to be pedestrian: “Your electricity bill doubled because your old meter was rusted out and not accurately recording your consumption.” But no matter the explanation, mainstream media have been quick to pick up on the thread: Computer geeks botch another technology roll-out; don’t understand the real world, etc.

For sure there have been errors in smart metering roll-outs. Regardless, consumers fear smart meters, and the media seem content to stoke those fears. So what is the need to layer fear of terrorism on top of that? The typical consumer has no idea what a cyber-attack is, let alone whether a worm is more likely to propagate in an RF mesh, an RF star, a PLC network, or none of the above.

What if instead we accepted that we are still in the deployment phase – admitted that there will be problems, found them through sometimes brilliant research, and then fixed them? That’s not to say that we should cover up any bad news. Where there are problems, they must be aggressively communicated to the right people and quickly remedied. But telling the general public that as an industry we haven’t got our act together only sows more fear in consumers’ hearts – consumers who will then write their elected representatives and repeat the cycle we’ve already seen once with billing issues. So why not go easy on the sensational stories for a while?

Sad to say about the Cyber Security world: we’ve chosen a profession where the true heroes are anonymous.




And the Smart Grid Communications Winner is….

— August 30, 2010

In the weeks since our smart grid networking and communications report was released, we’ve had some interesting industry reactions. Some press folks and, well, utilities just ask “which technologies are the winners and which are the losers?” Perhaps anticipating this, vendors have been quick to reinforce how their particular networking flavor is better than their competitors’.

Technically, there are many potential points of comparison, including bandwidth, latency, range (particularly for wireless), reliability, security, and of course, cost. Since the suitability of any given technology depends on application requirements, we outlined the key smart grid applications and their requirements: HANs (Home Area Networks), AMI NANs (Neighborhood Area Networks), AMI backhaul, Distribution Automation WAN, and Substation Automation WAN. We defined the requirements rather broadly, as they vary considerably on a case-by-case basis. We then surveyed more than 16 different communications technologies and outlined their attributes against these application requirements.

The complications arise when trying to offer summary comparisons between the technologies, as attempted in the nearby table. For example, bandwidth might seem like a straightforward metric to characterize; however, the bits-per-second of a link may be a poor predictor of actual application throughput. Node-to-node performance in a mesh network is highly dependent on the number of hops and the link contention within those hops. Depending on customer deployment decisions, a network with 19.2 kbps links could outperform a network with >100 kbps links. Extending this logic, a star-based topology, such as a 3G public network, might then seem better than a mesh. And yet the latency across an IEEE 802.11 broadband mesh (aka “metro Wi-Fi”) may still be an order of magnitude less than that of a public wireless network once all the access protocols and the various backhaul networks hidden within a public network are factored in. And some technologies may offer a wide range of bandwidth options depending on range (e.g., WiMAX) or cost (e.g., satellite). Which data point do you choose for a comparison? Even cost has many variables: the cost of fiber cable and equipment continues to drop, but that hardly matters if installing it means digging up an interstate highway or crossing a mountain range.
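
A back-of-the-envelope sketch makes the hop-and-contention point concrete. The link rates, hop counts, and contention factor below are invented for illustration; they are not figures from the report.

```python
# Crude estimate of end-to-end throughput in a shared-channel mesh (illustrative only).
# Each hop re-transmits the payload on a shared channel, so effective throughput
# falls roughly as link_rate / (hops * contention).

def effective_throughput_kbps(link_rate_kbps: float, hops: int, contention: float = 1.0) -> float:
    return link_rate_kbps / (hops * contention)

# A short path over modest 19.2 kbps links...
low_rate_short_path = effective_throughput_kbps(19.2, hops=2)

# ...versus a >100 kbps mesh whose nodes sit many contested hops from the take-out point.
high_rate_long_path = effective_throughput_kbps(100.0, hops=8, contention=2.0)

print(f"19.2 kbps links, 2 hops:          {low_rate_short_path:.1f} kbps end-to-end")
print(f"100 kbps links, 8 contested hops: {high_rate_long_path:.1f} kbps end-to-end")
```

Under these assumed numbers the slower links come out ahead (9.6 kbps versus roughly 6.3 kbps end-to-end), which is exactly why a single bits-per-second figure can mislead.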

The bottom line, as we are careful to point out in our report, is that any summary comparison needs to be understood only as a starting point. Ultimately, for any given project, the various choices – technology and individual vendor – need to be evaluated against carefully constructed use cases. As for winners and losers, there are some general principles that do seem to universally apply:

1. Standards are important. If the evolution of data and telecom networks has demonstrated anything, it is that proprietary technologies invariably yield to industry standards. Distressingly, the largest smart grid application in terms of number of nodes (smart metering and AMI) is the least standardized. This will change.

2. Security must be baked in. A secure network means much more than having a bit of link encryption or vague support of the “IP security suite”. It must include a comprehensive end-to-end security regime, including strong key management, and it is as much a business process issue as a technical concern (a brief sketch follows this list).

3. Evolution flexibility trumps application-specific bells and whistles. The smart grid is a big, long-term endeavor, and none of us really knows quite where it will lead. A flexible, layered network architecture is key to accommodating as-yet-unforeseen changes.
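
On point 2, here is a minimal sketch of what “more than a bit of link encryption” looks like at the message level, using authenticated encryption of a single meter reading via the Python cryptography package. The payload format and identifiers are made up for illustration; the genuinely hard part, provisioning and rotating the per-device key across millions of meters, is exactly the business-process issue noted above.

```python
# Minimal sketch: authenticated encryption of one meter reading (illustrative only).
# Requires the third-party 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In a real deployment this key would be provisioned, stored, and rotated per device
# by a key-management system; that process, not the cipher call, is the hard part.
device_key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(device_key)

reading = b'{"meter_id": "123", "interval_kwh": 0.42}'   # hypothetical payload format
nonce = os.urandom(12)                                    # must never repeat for the same key
header = b"meter-id:123"                                  # authenticated but not encrypted

ciphertext = aesgcm.encrypt(nonce, reading, header)       # confidentiality plus integrity
assert aesgcm.decrypt(nonce, ciphertext, header) == reading
```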

So, while our report may imply some winners and losers within our per-application forecasts for each of the 16+ smart grid networking technologies, ultimately, it is up to you to pick the ‘winner’ for your application. I hope we can help, and good luck!




Ford EVs Connect with Portland, GM Swings and Misses

— August 27, 2010

In advance of the launch of the Ford Transit Connect Electric van and other EVs, Ford started a 14-city promotional tour in Portland, Oregon on Monday. Ford will sell limited numbers of the Transit Connect Electric later this year and the electric Ford Focus in 2011. Ford is working with utilities, local government, and Portland State University to ensure that the city is ready with a charging infrastructure when the EVs arrive.

Portland is a natural fit for emissions-free driving. The mentality of its citizens is as green as the ubiquitous coniferous trees, and the adoption of hybrids is among the highest in the nation. Ford and Nissan recognize the opportunity to sell plug-in vehicles in the Northwest and have spent time and money marketing their vehicles in Oregon and Washington. Earlier this year, truck maker Navistar chose Portland to unveil its eStar all-electric truck.

In choosing to invest time in Portland and make vehicles available here first, Ford will leverage a growing EV infrastructure. Portland is a participant city in the DOE’s EV Project, and the area will have 1,000 EV charge spots in place by July 2011, according to Portland General Electric’s Charlie Allcock. Allcock said “some” of the 800,000 smart meters being deployed by PGE will be able to communicate with charging equipment. About one-third of PGE customers do not have a location at their primary residence for convenient home charging, a situation that Allcock said the utility is studying in search of a solution.

Ford is also working with Microsoft so that the charging information that is collected wirelessly via the Sync platform used in its vehicles will be shared with the Redmond, Washington company’s Hohm servers and made available online to Ford EV owners. Ford Manager of Electrification and Infrastructure Mike Tinskey said that the telematics system and Sync platform (which was launched in 2008) will be a differentiator for the vehicle, a reference to Nissan, which is creating a telematics and communications system for the Leaf.

The Northwest appears to be in General Motors’ blind spot as the company is skipping over Portland and Seattle for its first shipment of the plug-in Chevrolet Volt, which won’t be available in the region until sometime in 2012. GM representatives told me that the decision to sell first in states including California, New York and Connecticut (really?) was more about getting the maximum attention than satisfying the most rabid demand for EVs.

Representatives at Plug-in 2010 last month said that the company is “waging a media war” and felt that the other states had higher visibility in the press. GM is confident that, because the Volt will be produced in such small numbers (only 40,000 through the end of 2012), it will sell out even if the Northwest isn’t among the early areas to receive the cars.

While marketing to Times Square instead of Pioneer Square may serve GM in the short run, passing over one of the most adamant EV audiences, one that will also have robust infrastructure in place for plugging in vehicles, is short-sighted. By the time the Volt rolls into the Northwest, consumers will have alternatives from Mitsubishi, Toyota, Coda Automotive, and Fisker Automotive to choose from. EV adoption is likely to cluster around largely coastal metropolitan hubs, and being the first Green on the block with an EV might mean buying whatever is available at the time rather than waiting in a long queue and hoping that a vehicle arrives.



