Navigant Research Blog

Building on Big Data

— November 10, 2014

Advanced methods of interpreting large volumes of data have brought innovations in areas such as healthcare/pharmaceuticals, meteorology, marketing, e-commerce, government services, national security, and financial services.  Despite success in other areas, though, big data is only beginning to have an impact on building automation and energy efficiency.  In a 2013 blog, my colleague Bob Gohn discussed big data in the context of buildings.  In this blog, I’ll take a look at some of the solutions emerging in this area and how the buildings industry will be affected.

Continual Correction

Currently, the most common use for big data in buildings is fault detection and predictive maintenance.  Advances in sensor technology have enabled unprecedented views into the status and functionality of building systems such as heating, ventilation, and air conditioning (HVAC).  Sensors regularly measure every aspect of a system's performance, and analytics applied to that data identify equipment that needs to be replaced or may be about to fail.  Bringing technicians onsite to service equipment can be a major expense for building owners.  This type of data analytics allows a diagnosis to be made before the technician arrives, while also providing information on replacement parts and other relevant items.  Data analytics solutions can also build a list of the known problems in a building and derive each piece of equipment's usage and cost, enabling a quantitative, return on investment (ROI)-based assessment of which upgrade or investment should be implemented first.
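As a rough illustration of that ROI-based prioritization, the sketch below ranks detected faults by simple payback period.  All equipment names and dollar figures are invented for the example; they are not drawn from any real analytics product.

```python
# Illustrative sketch: rank building-equipment repairs by simple payback,
# the kind of quantitative ROI assessment a fault detection and
# diagnostics platform might produce. All data here is hypothetical.

def rank_upgrades(faults):
    """Sort detected faults so the fastest-payback fix comes first.

    Each fault is a dict with 'name', 'annual_waste_usd' (energy cost of
    leaving the fault in place), and 'repair_cost_usd'.
    """
    def payback_years(fault):
        return fault["repair_cost_usd"] / fault["annual_waste_usd"]
    return sorted(faults, key=payback_years)

faults = [
    {"name": "stuck economizer damper",
     "annual_waste_usd": 4200, "repair_cost_usd": 1500},
    {"name": "failing AHU supply fan bearing",
     "annual_waste_usd": 900, "repair_cost_usd": 2700},
    {"name": "leaking chilled-water valve",
     "annual_waste_usd": 3100, "repair_cost_usd": 800},
]

for f in rank_upgrades(faults):
    payback = f["repair_cost_usd"] / f["annual_waste_usd"]
    print(f"{f['name']}: payback {payback:.1f} yrs")
```

In practice the ranking would also weigh comfort impact and failure risk, but the core idea is the same: turn a list of known problems into an ordered repair queue.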

As building automation and data analytics continue to advance, new applications within the buildings industry are emerging.  Advanced building energy management systems (BEMSs) harness large quantities of data to provide a visualization of the overall energy consumption of a building or portfolio of buildings.  These systems also have the ability to leverage historical data to provide recommendations for how to best reduce consumption.  Next-generation BEMSs have the capability to adjust building system parameters automatically to maximize occupant comfort and energy efficiency.  One example of this type of advanced system is SHIFT Energy’s Intelligent Live Recommissioning (ILR) solution, which continually re-adjusts building systems as part of an ongoing recommissioning process.  Another cutting-edge solution is offered by Ecorithm, whose program also includes richly detailed graphics to visualize processed data across a building’s floor plan, identifying areas of waste and recommending corrections.
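One simple way to picture the comfort-versus-efficiency trade-off such systems automate is occupancy-driven setpoint adjustment.  The sketch below is purely illustrative, with invented thresholds; it does not describe SHIFT Energy's or Ecorithm's actual logic.

```python
# Hypothetical sketch: a next-generation BEMS might widen a zone's
# temperature deadband when the zone is unoccupied, trading a little
# comfort margin for energy savings. All values are invented.

def zone_setpoints(occupied, comfort_f=72.0):
    """Return (heating, cooling) setpoints in degrees F for a zone."""
    deadband = 2.0 if occupied else 8.0  # relax limits when nobody is there
    return comfort_f - deadband / 2, comfort_f + deadband / 2

print(zone_setpoints(True))   # tight band while occupied
print(zone_setpoints(False))  # wide band to cut HVAC runtime
```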

Designed with Data

Big data is also playing an increasingly important role in the design of resource efficient buildings.  Building information modeling (BIM) programs allow architects to analyze key performance metrics such as natural ventilation, daylighting, solar heat gain, overall energy usage, and even how people will likely interact with spaces.  These programs utilize vast amounts of data from existing buildings to visualize how a conceptual building may perform.  Such analysis can speed the construction of new buildings by leveraging the data-rich plans from previous projects, modified to fit the specific characteristics of the new site.  This also allows designers to cut costs by eliminating the duplication of work from past projects.  Reducing the time and cost required to construct new buildings is an essential factor in addressing rapidly growing urban populations that lack sustainable buildings and infrastructure.

Despite these achievements, the buildings industry is not yet exploiting available data to the extent that other industries are.  Looking forward, advances in building design, construction, and management can leverage big data and advanced analytics to reduce costs and improve efficiency.  As buildings and cities become increasingly automated and digitalized, data analytics will play a growing role in energy efficient buildings.

 

Coming to the Motor City: A Smarter Grid

— July 13, 2014

The smart grid in Detroit is about to get smarter – and so are utility industry executives exploring options for real-time grid data and analytics.  Distribution grid sensor developer Tollgrade Communications recently announced a $300,000 project to deploy its LightHouse sensors and predictive grid analytics solution across DTE Energy’s Detroit network.  The companies aim to demonstrate how outages can be prevented.

The 3-year program was selected as a Commitment to Action project by the Clinton Global Initiative (CGI) at the recent CGI event in Denver, where Tollgrade CEO Ed Kennedy took to the stage with former President Bill Clinton to discuss the project.  Tollgrade, Kennedy said, will publish quarterly reports on the project, beginning in 1Q 2015, identifying best practices and sharing detailed performance statistics.

Cheaper Than Building a Substation

With 2.1 million customers and 2,600 feeder circuits, DTE Energy has already begun piloting the system around Detroit, and Tollgrade says that it hopes to prevent 500,000 outage minutes over the next 3 years.  Because of the heavy concentration of auto manufacturing in the Detroit area, those saved minutes should translate into substantial economic benefits.  The system will leverage several communications protocols, including DTE’s advanced metering infrastructure communications network, reducing the startup cost and improving the return on investment.

The sensors will be placed along troublesome feeders as well as outside substations where older infrastructure increases the likelihood of outages.  Combined with the predictive analytics solution, the sensors cost just a few thousand dollars per location and could help DTE Energy avoid or defer replacing a million-dollar substation.  Both investors and regulators are sure to like those stats.
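A back-of-the-envelope comparison shows why those stats appeal to investors and regulators.  The deployment size and per-location cost below are assumed round numbers consistent with the figures quoted above, not disclosed project terms.

```python
# Rough sketch of the economics: even a few hundred sensor locations
# cost a fraction of one substation replacement. The deployment size
# and per-location cost are illustrative assumptions.

sensor_cost_per_location = 3_000   # "a few thousand dollars" per location
locations = 200                    # assumed deployment size
substation_replacement = 1_000_000 # "million-dollar substation"

deployment = sensor_cost_per_location * locations
print(f"Sensor deployment: ${deployment:,}")
print(f"Net avoided cost if one substation is deferred: "
      f"${substation_replacement - deployment:,}")
```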

Predicting Change

Predictive grid analytics has been a hot topic in the industry for the last few years, but only recently have the prices of solutions and sensors fallen to a level where utilities can justify the cost to deploy them widely throughout the distribution network.  Navigant Research expects the market for distribution grid sensor equipment to grow from less than $400 million worldwide today to 4 times that amount by 2023.  (Detailed analysis of distribution grid sensors can be found in Navigant Research’s report, Asset Management and Condition Monitoring.)
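That forecast implies steep compound growth.  Using the post's figures (roughly $400 million in 2014, quadrupling by 2023), the implied compound annual growth rate works out to about 17%:

```python
# Compound annual growth rate implied by the forecast cited above:
# ~$400M in 2014 growing to 4x that amount by 2023.

start = 400e6                # starting market size, USD
end_multiple = 4             # "4 times that amount"
years = 2023 - 2014          # forecast horizon

cagr = end_multiple ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")                       # about 16.7%/yr
print(f"2023 market: ${start * end_multiple / 1e9:.1f} billion")
```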

Since its first meeting in 2011, CGI America participants have made more than 400 commitments valued at nearly $16 billion when fully funded and implemented.  The Modern Grid was one of 10 working groups this year; others include efforts in Sustainable Buildings and Infrastructure for Cities and States.

Another CGI Commitment to Action grant announced last week will fund a market-based, fixed-price funding program for solar and renewable technologies.  The Feed-Out Program from Demeter Power will support solar-powered carports with electric vehicle charging stations at a net-negative cost to the customer.  In other words, eligible businesses pay a fixed monthly fee to Demeter Power (lower than their previous monthly electricity bill) and their employees and customers enjoy free car charging while parked there.  Demeter will own and maintain the infrastructure.

The program will initially make financing available to commercial properties located in Northern California communities participating in the California FIRST property assessed clean energy (PACE) Program, which is offered through the California Statewide Community Development Authority.  Interested participants must register with Demeter Power Group to participate in the program, which is expected to launch in the first quarter of 2015.

 

The Fog of Big Data

— August 14, 2013

Big Data and the analytics to extract useful information from it have great potential for smart buildings technology.  As these terms are used more broadly, though, they risk losing their original meanings.  For many people, Big Data means “a lot of data” and analytics is diluted to mean “processing a lot of data.”  These general usages risk blurring the real opportunities afforded by Big Data and analytics in the smart buildings sector.

This was reinforced at two June vendor conferences I participated in: Realcomm’s IBcon conference in Orlando and Schneider Electric’s Xperience Efficiency event in Washington, D.C.  Schneider marketing director Kent Evans’ trends presentation at Xperience Efficiency cited the early definition of Big Data, as described by META Group (now Gartner) analyst Doug Laney in a 2001 research report.  That is, Big Data has three attributes: volume, velocity, and variety.  But Evans reminded the audience that not every data problem in buildings is a Big Data problem – there’s plenty of work we need to do to make better use of the small data we have available to us.

Faster, Faster

While I generally agree with this assessment, it is still worth reviewing how smart building data volume, velocity, and variety are challenging traditional building management systems.  Certainly, data volume is growing as building control systems with hundreds of control points morph into complex systems with many thousands of points, driven by more granular sensors and controls. The velocity of this data is also increasing as sensors are sampled more frequently and denser submetering deployments provide more information to systems operations.  The variety of the data is growing as well, especially as different systems – ranging from HVAC to lighting to security (including video) – become data sources for an increasing range of potential applications.  Building data may not suffer from as many structured versus unstructured data challenges as experienced in other industries, but existing applications are often straining to process this data.  More importantly, new types of processing offer new insights and control optimizations.

This processing issue gets to the second part of my concern about the imprecision of the term “analytics.”  There were dozens of software vendors at the IBcon conference hawking buildings analytics packages, and on the surface, the marketing messages blur the distinction between them.  At the risk of oversimplification, these packages can be sorted into two buckets: those that use complex rules to effectively sort through lots of data to discover the desired information, and those that algorithmically fuse disparate data sets together to infer new actionable information and insights.  Products in both buckets can perform impressive and useful tasks, but the latter is the class of applications that are best described as true analytics.
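The distinction between the two buckets can be made concrete with a toy example.  The sketch below uses invented readings: a rules engine flags any value past a fixed threshold, while a simple model fused from two data streams (outdoor temperature and energy use) flags readings that deviate from the learned relationship, catching an anomaly the fixed rule misses.  This is a minimal caricature of both approaches, not any vendor's method.

```python
# Toy contrast of the two buckets, on invented data. The first energy
# reading is anomalous: high consumption on a cool day, yet still under
# any plausible fixed threshold.

from statistics import mean

outdoor_temp = [60, 65, 70, 75, 80, 85, 90]           # deg F
energy_kwh   = [180, 110, 120, 130, 140, 150, 160]    # first reading is off

# Bucket 1: a rules engine reduces to fixed thresholds.
rule_flags = [kwh > 200 for kwh in energy_kwh]        # misses the anomaly

# Bucket 2: fit energy = a + b * temp by least squares, then flag
# readings whose residual from the fused model is large.
mt, me = mean(outdoor_temp), mean(energy_kwh)
b = (sum((t - mt) * (e - me) for t, e in zip(outdoor_temp, energy_kwh))
     / sum((t - mt) ** 2 for t in outdoor_temp))
a = me - b * mt
residuals = [e - (a + b * t) for t, e in zip(outdoor_temp, energy_kwh)]
model_flags = [abs(r) > 30 for r in residuals]        # catches it

print(rule_flags)
print(model_flags)
```

Real analytics products fuse far more streams than two, but the structural difference holds: one approach checks data against rules someone wrote, the other infers what the data should look like and flags departures from it.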

Perhaps the best advice for building operators trying to sort through the Big Data and analytics hype is to focus on the specific problems at hand and understand how the proposed solutions offered arrive at their answers.  Whether they involve Big Data or little data, analytics or advanced rules engines, solving problems is the ultimate goal.

 

Dublin Digs Deep with City Data

— June 24, 2013

Cities that want to take advantage of new technologies to improve their operations should be ready to embrace both top-down investment in new management and control systems and bottom-up innovation from a wide range of stakeholders.  Dublin provides a good example of a city that is taking advantage of both approaches to attack some critical city issues.

The Irish capital faced a serious congestion problem as its economy boomed before the credit crunch.  Some estimates suggested that congestion was costing the economy over 4% of GDP.   While the economic downturn has eased the pressure on the traffic system in the short term, the city realizes it has to get smarter at dealing with the underlying problems.

The city’s transportation managers have been working with IBM’s Smarter Cities Technology Center, which is based in Dublin, to understand how they can use data analytics to help optimize traffic management and improve the operation of the city’s bus system. Dublin has no metro, so the bus system is particularly important for transportation in the city.

Working with the IBM research team, the traffic department has combined data from bus timetables, traffic sensors, CCTV, and real-time GPS updates for the city’s fleet of 1,000 buses.  This data is used to build a digital map of the city, overlaid with the real-time position of each Dublin bus.  This allows traffic controllers to see the status of the whole network, drill down into problem areas, and make informed decisions on the best actions to reduce congestion.  The data also enables better optimization of traffic management measures and of the bus schedule.
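At its core, this kind of fusion is a join between scheduled and observed data.  The sketch below shows the simplest version: matching GPS arrival reports against a timetable to compute each bus's delay at a stop.  Route numbers, stop IDs, and times are invented for the example; Dublin's actual system is far richer.

```python
# Hypothetical sketch of timetable/GPS fusion: join observed arrivals
# against the schedule to compute per-bus delay. All data is invented.

from datetime import datetime

timetable = {                       # scheduled arrival at a given stop
    ("46A", "stop_312"): "08:15",
    ("39",  "stop_312"): "08:20",
}
gps_reports = [                     # observed arrivals from the fleet
    ("46A", "stop_312", "08:22"),
    ("39",  "stop_312", "08:19"),
]

def minutes(hhmm):
    """Convert an 'HH:MM' string to minutes after midnight."""
    t = datetime.strptime(hhmm, "%H:%M")
    return t.hour * 60 + t.minute

for route, stop, observed in gps_reports:
    delay = minutes(observed) - minutes(timetable[(route, stop)])
    print(f"Route {route} at {stop}: {delay:+d} min vs schedule")
```

Aggregating such per-stop delays across the network is what lets controllers see system-wide status and drill down into problem areas.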

The SPUD Effect

I spoke to Brendan O’Brien, Head of Technical Services, Roads and Traffic Department at Dublin City Council, about the impact of the system at an IBM-hosted event in the city in May.  I asked him how this data had changed the council’s approach to managing the city’s transport.  O’Brien said his team can now combine macro and micro levels of management much better, viewing problems in specific locations while also developing better informed strategic plans for the city.  The challenge is to find time to take advantage of these strategic insights.

Dublin is not only looking to the city’s control systems and big data analytics to improve insight into traffic and transport conditions, but also at the possibilities offered by open data. Dublinked, the city’s open data platform, provides an impressive range of public data sets and enables third parties and individuals to contribute data.  Dublin City Council and other local authorities in the Dublin region are working with the National University of Ireland Maynooth to explore the opportunities for service innovation and collaboration with other agencies and suppliers. Mapping of disabled parking spaces in the city, for example, has been done through crowdsourced information.  IBM has also been using the data to demonstrate the possibilities for data analytics on open data platforms with its Semantic Processing of Urban Data (SPUD) demonstration.

Dublin is a good example of how a smart city strategy should not depend on any single system or application, but rather on the innovative use of multiple tools and applications, shared data, and collaborative networks for innovation.

 
