Navigant Research Blog

Making the Case for an Intelligent Edge in an IoT World

— June 12, 2018

Cloud computing gets plenty of attention in IT circles and among grid managers—it is hard to ignore when technology giants like Amazon, Microsoft, IBM, and others keep promoting their cloud solutions. But as the Internet of Things (IoT) concept gains momentum, new attention is being focused on intelligent edge and distributed computing.

This theme was prevalent at the recent Internet of Things World conference in Silicon Valley, where participants pointed out advantages that edge has over the cloud. Jesse DeMesa, a strategy partner at the venture capital firm Momenta Partners, said that a cloud-first or data center-first approach to IoT analytics will not work, and companies will ultimately move toward more autonomous systems. Many current IoT adopters, he said, focus on connecting, collecting, and storing data, while the “real value of data has a shelf life often measured in seconds.”

Getting the Edge on Costs

His point is well-taken. Looking back and analyzing large datasets for actionable insights via a cloud scenario does have value. But gleaning insights within seconds, or fractions of a second, at the edge and making immediate adjustments can be equally valuable, if not more so, when critical operations are at stake or the safety of nearby personnel is in play.

The need to rethink a cloud-first approach was emphasized for cost reasons by HarperDB CEO Stephen Goldberg. During a panel session, Goldberg said that the bandwidth needed to push data to the cloud, along with the associated storage infrastructure, is expensive and ends up consuming a significant share of an IoT deployment's cost. He argued that a more distributed computing infrastructure, where edge devices already in place do as much computing as possible, is a more rational approach. This is true.
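Goldberg's argument can be illustrated with a minimal sketch (all names here are hypothetical, not from any vendor's API): an edge device summarizes each window of sensor readings locally and forwards only the summary and any anomalous raw values, rather than streaming every reading to the cloud.

```python
def summarize_window(readings, threshold):
    """Summarize one window of sensor readings at the edge.

    Only the compact summary and any out-of-range raw values are
    forwarded to the cloud; the bulk of the raw data never leaves
    the device, cutting bandwidth and cloud storage costs.
    """
    anomalies = [r for r in readings if abs(r) > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomalies": anomalies,  # the only raw values sent upstream
    }
```

For a window of a thousand readings, the payload shrinks to a handful of numbers unless something is actually wrong, which is exactly the trade Goldberg describes.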

IoT vendors recognize this need for advanced intelligence at the edge. Recent examples of companies with new edge offerings include: SWIM EDX, which processes edge streaming data in real time; C3 IoT and Intel partnering on a new artificial intelligence (AI) appliance for optimizing applications that do not require cloud computing; and Edgeworx, a startup that builds software for edge gateways and micro data centers.

Edge Networks Get Sharper

The big tech players have noticed the growing need for edge computing and the advantages of putting more, and more strategically useful, computing horsepower there as well. Dell and Microsoft, for instance, have teamed up to create an integrated IoT edge platform. Amazon Web Services has updated its edge computing platform, called Greengrass, to incorporate machine learning capabilities. Similarly, Hewlett Packard Enterprise’s Aruba subsidiary launched a new network edge solution called NetInsight in March that uses AI to autonomously monitor corporate networks and optimize performance.

What this all means is that the edge of networks is getting smarter. Companies deploying IoT solutions need a strategy that integrates edge, on-premise, and cloud computing architectures, taking advantage of each for an enterprise’s or a grid operator’s own needs and applications. In some scenarios a cloud architecture makes sense; in others, an on-premise solution does. More likely, though, the complexity of these deployments will require a sophisticated blend of technologies. As my colleagues Richelle Elberg and Mackinnon Lawrence note in their Navigant Research white paper From Smart Grid to Neural Grid, the future mature Energy Cloud will be based on technologies that integrate ubiquitous connectivity, cloud-based AI, and edge computing. That same type of integration of computing power will be needed by enterprises beyond the energy grid that seek to harness IoT as well.


Utility Cloud Use Soars in 2017

— December 5, 2017

I’ve been analyzing technology markets long enough to have observed the entire cloud computing hype cycle; it’s now a well-understood, mature technology. Which also means that there’s little to write about for an analyst more used to covering emerging technologies. However, from 2008 to 2011, I never got the chance to write a detailed report about cloud computing in the utility industry. While the cloud hype volume was cranked up to maximum, the utilities industry was doggedly refusing to move any IT infrastructure into the cloud. Which was a personal disappointment: for a fan of puns, it was difficult to resist the temptation to write a report titled Utility Computing in the Utility Industry.

I even advised a couple of cloud vendors that wanted me to tell them how to break through the conservatism of the utilities industry. My only advice? “Lobby the regulators, because utilities just aren’t going to budge on this one.” And why? Four primary reasons:

  • Conservatism: No utility ever liked going first with something new. In any monopoly market, moving first was only ever a disadvantage. Rather, utilities would wait for someone else to stump up investment capital and let vendors learn from the mistakes of others before bringing a more reliable product to market.
  • Security: I have been told more than once of security officers halting vendors’ cloud pitches midway because of security concerns.
  • Regulatory: Some regulators would not let data be transported outside of certain geographic areas, killing off any idea for clouds based in other jurisdictions.
  • Finance: The key selling point of cloud is its OPEX-based pricing scheme. While music to the ears of CFOs in other industries, it killed cloud’s chances in utilities rewarded for making capital investments.

Until a year ago, cloud adoption by utilities was slow and steady. However, 2017 has marked a dramatic change in the industry’s attitude toward the technology. There is no better way to describe the rapid acceleration of its adoption than with cold facts. In a recent call with SAP, I was astounded at the company’s growth in cloud-based revenue in the first three quarters of 2017: a 90% year-over-year increase over 2016. SAP’s utilities business unit recorded the second-highest growth in cloud revenue across the business, just behind retail.

Market Requirements Have Eroded Resistance to Cloud Adoption

This growth isn’t completely unexpected. European utilities are under significant pressure to reduce costs, and the cloud is helping them achieve this; they are no longer deterred by the OPEX versus CAPEX argument. Cloud vendors have also made great strides in improving security, so much so that their investment in security typically exceeds what a utility can manage for its own onsite data centers. And with the growth in demand for cloud, vendors can build infrastructure in more locations, negating the need to move data across international borders.

What’s next? The industry is becoming more comfortable with cloud, and more IT infrastructure will be moved to it. However, this will be done in a controlled manner. Despite some (frankly laughable) claims to the contrary, private clouds will account for the vast majority of utilities’ use. Core IT infrastructure likely will never be moved to public clouds, due to the inherent increased risk.

Finally, a word of caution. I predict that some utilities’ adoption of cloud services will be piecemeal, unstructured, and uncoordinated, lacking any coherent strategy. Different departments will procure cloud services for their own departmental ends, in effect reversing the recent trend of consolidating data in a data lake or data warehouse. Instead, new cloud-based operational data silos will be created, where access to data is restricted. To counter this threat, individual departments must be reined in just enough to ensure enterprisewide data management without choking innovation.


Exelon’s GE Predix Deal Points to a Cloud-Based Future

— January 16, 2017

In November last year, General Electric (GE) announced that US utility Exelon had signed a major deal to use the Predix platform to analyze data from its entire generation fleet. The enterprisewide agreement covers data from Exelon’s 32,700 MW portfolio of nuclear, wind, solar, hydroelectric, and natural gas power. Initially, Exelon will use Predix to help improve operational efficiency: it is targeting efficiency gains of up to 5% and operating and maintenance cost reductions of around 25%.

This was one of the most newsworthy announcements in the utility IT space in 2016, as it marks one of the first major shifts to cloud-based analytics of a large utility’s OT data. It is also GE’s largest Predix deal to date and helps validate the large bets the company has made in the OT analytics space, particularly its acquisition of Bit Stew (data ingestion) and Meridium (asset performance management).

Utilities and the Cloud

When the cloud was a hot topic in the broader IT world 4 or 5 years ago, there was little to write about in the utilities industry. This changed when utilities started to migrate back-office applications such as enterprise resource planning (ERP) and customer relationship management (CRM) to the cloud. Since then, the migration of certain IT systems to the cloud has accelerated, but utility OT applications have remained resolutely on-premise.

As utilities have become more comfortable with the cloud concept, it was only a matter of time before the OT world caught up with IT. Other IT vendors (e.g., Oracle with its Dataraker product) have already recorded numerous successes analyzing utilities’ grid data in the cloud. However, until now, the vast majority of these projects have been small-scale proofs of concept or have focused on smart meter data analytics.

Real-Time Performance

Exelon’s Predix deal marks a new era in cloud-based OT data analytics, as it involves the analysis of large volumes of operational grid data from preexisting sensors. This data has historically been fed into data historians, from which it was then extracted, cleansed, and analyzed. The process was inefficient and relied on significant IT resources. By streaming data into the Predix cloud as soon as it is created, Exelon will be able to build a real-time view of its assets’ performance. And while the project will initially focus on asset performance management, if it is a success, there is little to stop Exelon from developing new use cases, particularly in network management.
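The contrast can be sketched in a few lines. This is a generic illustration, not the Predix API: instead of batch-extracting history from a historian for later analysis, each reading updates a live view of the asset the moment it arrives.

```python
class RollingView:
    """A continuously updated view of an asset metric.

    Each streamed reading updates the running mean incrementally
    (Welford-style), so the current state is always available
    without ever re-reading stored history.
    """

    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self.peak = float("-inf")

    def update(self, value):
        """Fold one new reading into the live view."""
        self.count += 1
        self.mean += (value - self.mean) / self.count
        self.peak = max(self.peak, value)
        return self.mean
```

The historian approach would instead store every reading and compute the same statistics in a later batch job; the streaming version trades that extraction step for a small constant amount of work per reading.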

GE is excited by the deal—it is one of the three largest software deals the company has ever closed—and expects more data from critical infrastructure to go into the cloud soon. While it has obvious strengths in asset performance management (APM), it also expects network management applications to migrate to the cloud. We recommend a certain amount of circumspection (after all, one big deal does not signify a lasting trend), but if analytics start to deliver on the promise of huge cuts in operating costs, regulators will start to demand that utilities use these tools to cut customer costs.

Discussions of the deal with GE late last year highlighted yet again how vital C-level sponsorship is when driving digital transformation within a utility business. Exelon’s C-suite has been instrumental in the company’s adoption of new technologies, and GE has invested a great deal of time with executives to support them in making this change possible. There will be no digital transformation without C-level buy-in: the cultural and technological changes required are too great to be driven from individual departments.


Does a Devilish Startup Have an Answer for the Internet of Things?

— June 5, 2012

A new startup aims to advance the idea of the “Internet of Things” in a way that could have a significant impact on the energy business.  Electric Imp was formed in 2011 by former iPhone hardware engineering manager Hugo Fiennes, former Gmail designer Kevin Fox, and veteran firmware engineer Peter Hartley.

Their idea is that products from dishwashers to doorbells to blenders will include slots for “Imp cards,” which will enable users to wirelessly monitor, control, and get alerts from everyday devices.  Electric Imp acts as the web interface in the process, handling cloud services in the background.

The Imp card has an embedded processor and communicates over standard Wi-Fi protocols, including encryption.  The card itself has a familiar SD card form factor.  It’s similar to an Eye-Fi card used in digital cameras for automatically uploading photos to computers, tablets, or phones.  But in this case, the device is controlled via a cloud service that sends alerts when energy rates are cheapest for running a washer, for example, or kicks on lights when certain conditions are met, among other possibilities.
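A cloud-side rule of the kind described, such as alerting when energy rates are cheapest for running a washer, might look like the following sketch. All names here are hypothetical illustrations, not anything from Electric Imp's actual service.

```python
def cheapest_hours(rates):
    """Given (hour, price) pairs for the day, return the hours at
    the day's minimum price, i.e., when the cloud service would
    tell the washer's Imp card it is cheapest to run."""
    low = min(price for _, price in rates)
    return [hour for hour, price in rates if price == low]
```

The real service would presumably evaluate rules like this continuously against live rate feeds and push the result to the device as an alert or control signal.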

The Imp cloud service acts as the central hub for each device.  In order to set up a Wi-Fi connection, the company uses patent-pending technology called Blinkup to enter SSID and password information on iOS and Android smartphones; this data is beamed wirelessly to an Imp’s light sensor by quickly pulsing the handset’s screen on and off.
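Blinkup's actual implementation is patent-pending and undisclosed, but the general idea of transmitting bits as timed light pulses can be sketched. The encoding below is a hypothetical illustration of simple on/off keying, not Electric Imp's real scheme.

```python
def to_pulses(data: bytes):
    """Encode bytes as a train of 0/1 screen states (on/off keying),
    most significant bit first, as a pulsing screen might emit them."""
    bits = []
    for byte in data:
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    return bits


def from_pulses(bits):
    """Decode a pulse train back into bytes, as the card's light
    sensor side would after sampling the screen."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i : i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)
```

A real optical link would also need framing, timing recovery, and error checking on top of this, which is presumably where the patented part lies.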

The company, based in Los Altos, California, is in talks with equipment makers to get them to build the slots into new products.  The Imp cards themselves will retail for $25 each.  The company plans to release a developer preview bundle in June, with the first compatible devices expected later this year.

Though Electric Imp is an early startup, with little proven in the way of a business, it has a strong management team plus $7.9 million in Series A money from Redpoint Ventures and Lowercase Capital.  If Fiennes and his team can make sense of connecting devices and the Internet beyond what we see today, I wouldn’t bet against them.  The big challenge will be to get manufacturers on board quickly enough and at a scale to make a difference.  And that can be quite a demon to overcome.

