Navigant Research Blog

The Fog of Big Data

— August 14, 2013

Big Data and the analytics to extract useful information from it have great potential for smart buildings technology.  As these terms are used more broadly, though, they risk losing their original meanings.  For many people, Big Data means “a lot of data” and analytics is diluted to mean “processing a lot of data.”  These general usages risk blurring the real opportunities afforded by Big Data and analytics in the smart buildings sector.

This was reinforced at two June vendor conferences I participated in: Realcomm’s IBcon conference in Orlando and Schneider Electric’s Xperience Efficiency event in Washington, D.C.  Schneider marketing director Kent Evans’ trends presentation at Xperience Efficiency cited the original definition of Big Data, as described by META Group (now Gartner) analyst Doug Laney in a 2001 research report.  That is, Big Data has three attributes: volume, velocity, and variety.  But Evans reminded the audience that not every data problem in buildings is a Big Data problem – there’s plenty of work we need to do to make better use of the small data we have available to us.

Faster, Faster

While I generally agree with this assessment, it is still worth reviewing how smart building data volume, velocity, and variety are challenging traditional building management systems.  Certainly, data volume is growing as building control systems with hundreds of control points morph into complex systems with many thousands of points, driven by more granular sensors and controls.  The velocity of this data is also increasing as sensors are sampled more frequently and denser submetering deployments deliver more information to system operators.  The variety of the data is growing as well, especially as different systems – ranging from HVAC to lighting to security (including video) – become data sources for an increasing range of potential applications.  Building data may not pose as many structured-versus-unstructured challenges as data in other industries, but existing applications are often straining to process it.  More importantly, new types of processing offer new insights and control optimizations.
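To put rough numbers on the volume and velocity points, here is a back-of-the-envelope sketch in Python.  The point counts and sampling intervals are hypothetical round numbers chosen for illustration, not measurements from any particular building.

```python
# A rough sketch of why volume and velocity strain traditional building
# management systems. All point counts and sampling intervals below are
# hypothetical round numbers.

legacy = {"points": 300, "interval_s": 900}    # hundreds of points, 15-minute samples
modern = {"points": 10_000, "interval_s": 60}  # thousands of points, 1-minute samples

def samples_per_day(system):
    """Number of data points a system produces in one day."""
    return system["points"] * (86_400 // system["interval_s"])

print(samples_per_day(legacy))  # 28,800 samples/day
print(samples_per_day(modern))  # 14,400,000 samples/day: roughly 500x more
```

Even before adding variety (lighting, security, and video alongside HVAC), the data stream grows by orders of magnitude.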

This processing issue gets to the second part of my concern about the imprecision of the term “analytics.”  There were dozens of software vendors at the IBcon conference hawking building analytics packages, and on the surface, the marketing messages blur the distinction between them.  At the risk of oversimplification, these packages can be sorted into two buckets: those that use complex rules to sort efficiently through lots of data to discover the desired information, and those that algorithmically fuse disparate data sets to infer new actionable information and insights.  Products in both buckets can perform impressive and useful tasks, but the latter is the class of applications best described as true analytics.
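As a minimal sketch of the distinction, consider the difference between a fixed rule and a model that fuses two data sets.  Everything below is hypothetical and simplified for illustration; it is not drawn from any vendor’s product.

```python
# Bucket 1 vs. bucket 2, in miniature. All names, thresholds, and data
# are hypothetical.

from statistics import mean

# Bucket 1: a rules engine applies fixed expert logic to find known problems.
def rule_check(zone_temp_f, setpoint_f, occupied):
    """Flag a zone more than 2 degrees F above setpoint while occupied."""
    return occupied and (zone_temp_f - setpoint_f) > 2.0

# Bucket 2: "true analytics" fuses disparate data sets to infer something new.
# Regressing daily energy use against outdoor temperature yields an expected-
# consumption baseline; large deviations suggest faults no rule anticipated.
def fit_baseline(outdoor_temps, energy_kwh):
    """Ordinary least squares: energy ~ slope * temp + intercept."""
    t_bar, e_bar = mean(outdoor_temps), mean(energy_kwh)
    cov = sum((t - t_bar) * (e - e_bar) for t, e in zip(outdoor_temps, energy_kwh))
    var = sum((t - t_bar) ** 2 for t in outdoor_temps)
    slope = cov / var
    return slope, e_bar - slope * t_bar

def flag_anomaly(slope, intercept, temp, actual_kwh, tolerance=0.15):
    """Flag a day whose use deviates more than 15% from the baseline."""
    expected = slope * temp + intercept
    return abs(actual_kwh - expected) / expected > tolerance

temps = [55, 60, 68, 75, 82, 90]      # daily outdoor temperature (F)
kwh = [410, 430, 470, 520, 585, 650]  # matching daily energy use
m, b = fit_baseline(temps, kwh)
print(rule_check(74.5, 72.0, occupied=True))        # True: known condition
print(flag_anomaly(m, b, temp=78, actual_kwh=700))  # True: ~26% over baseline
```

The first function can only find conditions someone thought to encode; the second can surface problems nobody wrote a rule for.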

Perhaps the best advice for building operators trying to sort through the Big Data and analytics hype is to focus on the specific problems at hand and understand how the proposed solutions arrive at their answers.  Whether they involve Big Data or little data, analytics or advanced rules engines, solving problems is the ultimate goal.

 

Dublin Digs Deep with City Data

— June 24, 2013

Cities that want to take advantage of new technologies to improve their operations should be ready to embrace both top-down investment in new management and control systems and bottom-up innovation from a wide range of stakeholders.  Dublin provides a good example of a city that is taking advantage of both approaches to attack some critical city issues.

The Irish capital faced a serious congestion problem as its economy boomed before the credit crunch.  Some estimates suggested that congestion was costing the economy over 4% of GDP.   While the economic downturn has eased the pressure on the traffic system in the short term, the city realizes it has to get smarter at dealing with the underlying problems.

The city’s transportation managers have been working with IBM’s Smarter Cities Technology Center, which is based in Dublin, to understand how they can use data analytics to help optimize traffic management and improve the operation of the city’s bus system. Dublin has no metro, so the bus system is particularly important for transportation in the city.

Working with the IBM research team, the traffic department has combined data from bus timetables, traffic sensors, CCTV, and real-time GPS updates for the city’s fleet of 1,000 buses.  This data is used to build a digital map of the city, overlaid with the real-time position of each Dublin bus.  This allows traffic controllers to see the status of the whole network, drill down into problem areas, and make informed decisions on the best actions to reduce congestion.  The data also enables better optimization of traffic management measures and of the bus schedule.
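Here is a minimal sketch of the kind of data fusion described above: joining real-time GPS updates against the published timetable to estimate how late each bus is running.  The field names, route, and figures are hypothetical, not Dublin’s actual data model.

```python
# Hypothetical sketch: fuse a GPS feed with a timetable to compute delays.
from datetime import datetime, timedelta

timetable = {  # (route, stop) -> scheduled arrival
    ("46A", "O'Connell St"): datetime(2013, 5, 20, 8, 30),
}

gps_updates = [  # one record per bus position report
    {"bus_id": 1001, "route": "46A", "stop": "O'Connell St",
     "arrived": datetime(2013, 5, 20, 8, 41)},
]

def delays(updates, schedule):
    """Yield (bus_id, minutes late) for every update with a scheduled match."""
    for u in updates:
        scheduled = schedule.get((u["route"], u["stop"]))
        if scheduled is not None:
            yield u["bus_id"], (u["arrived"] - scheduled) / timedelta(minutes=1)

for bus, minutes_late in delays(gps_updates, timetable):
    print(f"Bus {bus} is running {minutes_late:.0f} minutes late")
```

Aggregating such delays across 1,000 buses is what lets controllers spot a congested corridor in real time rather than after the fact.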

The SPUD Effect

I spoke to Brendan O’Brien, Head of Technical Services, Roads and Traffic Department at Dublin City Council, about the impact of the system at an IBM-hosted event in the city in May.  I asked him how this data had changed the city’s approach to managing its transport network.  O’Brien said his team can now combine macro and micro levels of management much better, viewing problems in specific locations while also developing better informed strategic plans for the city.  The challenge is to find time to take advantage of these strategic insights.

Dublin is looking not only to the city’s control systems and Big Data analytics to improve insight into traffic and transport conditions, but also to the possibilities offered by open data.  Dublinked, the city’s open data platform, provides an impressive range of public data sets and enables third parties and individuals to contribute data.  Dublin City Council and other local authorities in the Dublin region are working with the National University of Ireland Maynooth to explore the opportunities for service innovation and collaboration with other agencies and suppliers.  Mapping of disabled parking spaces in the city, for example, has been done through crowdsourced information.  IBM has also been using the data to demonstrate the possibilities for data analytics on open data platforms with its Semantic Processing of Urban Data (SPUD) demonstration.

Dublin is a good example of how a smart city strategy should not depend on any single system or application, but rather on the innovative use of multiple tools and applications, shared data, and collaborative networks for innovation.

 

Utilities Flunking Big Data 101

— August 2, 2012

Oracle has released an intriguing study that includes a survey of utility executives who collectively say they are not doing so well when it comes to Big Data – i.e., the challenge of making sense of large volumes of complex data that can be transformed to help improve business operations and customer service.  In fact, the utility executives rank themselves among the least prepared to handle the data deluge, something we at Pike Research have noted as a major challenge for the industry as it transforms itself with smart grid technologies.

Here is the relevant scorecard, according to Oracle’s study:

  • Public sector (government), health care and utility industries are the least prepared, with 41% of public sector executives, 40% of health care executives, and 39% of utility executives giving themselves a grade of “D” or “F” for preparedness
  • Overall, 29% of the C-level executives surveyed from 11 different industries in North America give their firms a “D” or “F” for preparedness
  • By contrast, communications industry executives claim to be the most prepared to handle the data deluge, with 20% giving themselves an “A”

Practically all the executives surveyed are concerned about the near- to mid-term data challenges, with 97% saying their firms must improve their use of data over the next two years.  Thus, utilities are not alone; they are just behind the curve compared to others.

So why are utilities flunking out?  One reason is a lack of experience.  Few utilities have ever seen such a huge volume of data generated from utility processes.  In the past, utilities read a meter once a month and sent a bill.  That’s roughly 12 data points per year per meter.  With the latest technology, that volume expands dramatically, with the number of meter reads increasing to 35,040 per year per meter (one read every 15 minutes).  That’s an increase of 291,900%!
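For the skeptical, the arithmetic behind those figures is easy to check:

```python
# One read every 15 minutes, around the clock, for a year.
reads_per_day = 24 * 60 // 15         # 96 reads per day
reads_per_year = reads_per_day * 365  # 35,040 reads per meter per year
legacy_reads = 12                     # one manual read per month
increase = (reads_per_year - legacy_reads) / legacy_reads * 100
print(reads_per_year, f"{increase:,.0f}%")  # 35040 291,900%
```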

And that’s just the meter read.  Some advanced meters also send and receive other data related to pricing, pre-paid options, load control, tamper detection, and temperature, which can amount to hundreds of additional attributes that must be processed and correlated.  In addition, the grid itself is being outfitted with a plethora of new two-way communicating devices that report their measurements even more frequently than most smart meters.  These additional types of data create new challenges that many utilities are just now trying to comprehend and leverage for their benefit and that of their customers.

Moreover, the utility industry lacks skilled data analytics experts.  With just about everyone across the corporate spectrum seeking these people, utilities have to compete for scarce talent.  Most people with these skills are gobbled up by sexier industries, like communications or web-based businesses.  As a result, utilities are turning to third-party vendors for help with data analytics, which will help ease the burden in the near term.  However, utilities that want to leverage the data in their own way, and perhaps save the cost of outsourcing, will need to bring this expertise in-house.

The data deluge is a growing, long-term issue that will test utilities for years to come.  The message from the Oracle survey does have an upside, however: executives know they are failing, which is the first step toward making the changes necessary to improve their performance.

 

Utilities Get Smart with Smart Meter Data

— June 1, 2012

The Edison Foundation announced in May 2012 that as many as one in three households in the United States now have smart meters.  Yet, once they deploy an advanced metering infrastructure, utilities stand to know even less than they did before about some of their operations – that is, unless they find a way to turn smart meter data into actionable information.  With electro-mechanical meter reading, meter readers do more for the utility than just read consumption values off the meters; they also look for signs of meter tampering and fraud, and inspect the meters and utility assets for their condition.  Without these eyes in the field, utilities may have trouble identifying theft and knowing when other field assets need repair.  Smart grid data analytics can help address these problems, but given the volume and velocity of the incoming data, utilities must first define the business challenges they are trying to overcome.  Getting and storing the data is not the issue; making sense of it is.
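As one illustration of what can replace those eyes in the field, consider a simple consumption-based tamper screen.  This is a hypothetical sketch with made-up thresholds and data, not an actual utility algorithm; real theft detection fuses many more signals.

```python
# Hypothetical tamper screen: flag a meter whose usage drops sharply and
# stays low relative to its own recent history.
from statistics import mean

def tamper_suspect(daily_kwh, baseline_days=30, recent_days=7, drop=0.6):
    """Flag if the last week averages under 40% of the prior month."""
    if len(daily_kwh) < baseline_days + recent_days:
        return False  # not enough history to judge
    baseline = mean(daily_kwh[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_kwh[-recent_days:])
    return baseline > 0 and recent < baseline * (1 - drop)

history = [30.0] * 30 + [8.0] * 7  # usage collapses but never hits zero
print(tamper_suspect(history))     # True: worth a field visit
```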

In a field whose practitioners are said to be “reading the electronic tea leaves,” many vendors focus on technology solutions to solve smart grid Big Data problems.  And perhaps rightly so: in the discipline of data analytics, issues like data quality, usefulness of the algorithms, scalability, and performance are key factors in deployment success.  However, in choosing the right solution, successful utilities consider not only their most pressing issues, like meter-to-cash operations, demand response, and customer service, but also how to leverage their smart grid to introduce operational efficiencies, such as voltage optimization and asset protection.

Figuring out how to do this effectively requires defining the business problems where a data analytics solution can deliver a promising return on investment (ROI).  Not all solutions will.

Smart grid data analytics (stand by for a Pike Research update of this report in 2012) is part of the process of achieving a fully optimized utility in a modernized electricity delivery framework.  The optimized utility has learned how to leverage data analytics for meter operations, grid optimization, asset control, and renewables integration, for the benefit of both the utility and the consumer who wants to save money or use energy more efficiently.  Vendors may offer a variety of strategies toward these goals, including managed services and applications, enterprise applications, platforms, and visualization tools.  It’s likely that no one solution will fit all, and a flexible approach – achieved through partnerships and open systems – will be the most powerful.

As smart meters and other grid sensors make their way into the field at a rapid pace, a strategic, business-focused viewpoint from utility stakeholders will go a long way toward ensuring solid decisions that support the evolutionary nature of the smart grid and all the information that can be gleaned from it.

 
