Navigant Research Blog

In a Connected World, Cars Talk to Buildings

— August 20, 2015

It wasn’t so long ago that communication was largely limited to living beings, whether birds, whales, dogs, or people. Our devices were mostly mute, performing functions when requested by a button or switch but otherwise isolated from each other. The development of high-bandwidth wired and wireless communications over the past two decades has led to a corresponding transformation: devices now communicate, and often act, without human intervention. While we are still a long way from Skynet, ubiquitous connectivity is enabling a wide range of possibilities that can help reduce our energy demands in the coming years.

Connected Thermostats, Cars, and More

The old-fashioned set-it-and-forget-it thermostat and even newer programmable models are being supplanted by wireless, cloud-connected versions like the Nest. In place of a simple mechanical thermocouple, these newer units include sensors that detect motion, light, and humidity, along with a Wi-Fi connection to the Nest servers, enabling big data machine learning and remote control from smartphone apps. By tracking usage patterns and local weather conditions, these thermostats automatically create custom control profiles that provide optimum comfort and minimize energy use.
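
To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of schedule learning such a thermostat might perform. The function names, occupancy threshold, and setpoints are illustrative assumptions, not Nest's actual algorithm.

```python
# Hypothetical sketch of occupancy-based schedule learning for a
# connected thermostat; thresholds and setpoints are illustrative only.
from statistics import mean

def learn_schedule(motion_log, comfort_f=70, setback_f=62):
    """motion_log maps each hour (0-23) to a list of 0/1 occupancy
    observations collected over previous days. Hours that are usually
    occupied get the comfort setpoint; the rest get an energy-saving setback."""
    schedule = {}
    for hour in range(24):
        observations = motion_log.get(hour, [0])
        schedule[hour] = comfort_f if mean(observations) > 0.5 else setback_f
    return schedule

# Example: motion mostly seen in the evening -> setback the rest of the day.
log = {h: [1, 1, 1] if 18 <= h <= 22 else [0, 0, 1] for h in range(24)}
print(learn_schedule(log)[20], learn_schedule(log)[3])  # 70 62
```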

Ever since the advent of the modern plug-in electric vehicle (PEV) at the turn of the last decade, many of these vehicles have been able to connect to remote servers to retrieve localized electric utility rates. When plugged in at night, they can delay charging until off-peak rates begin, reducing load on the grid and saving their owners money. Beginning in 2016, the first cars will roll out with vehicle-to-vehicle communications systems that broadcast safety-related messages to other nearby vehicles. According to Navigant Research’s Connected Vehicles report, about 80%–90% of new light-duty vehicles in North America and Western Europe are expected to be using the technology by 2025.
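
A rough sketch of that charging-delay logic might look like the following. The off-peak window, charger power, and energy figures are assumptions for illustration, not any particular utility's rates or automaker's implementation.

```python
# Hypothetical off-peak charging delay; rate windows and figures are made up.
from datetime import datetime, timedelta

OFF_PEAK_START_HOUR = 23   # assumed utility off-peak window: 11 pm - 7 am
OFF_PEAK_END_HOUR = 7

def charge_start_time(plug_in, needed_kwh, charger_kw, departure):
    """Start immediately if already off-peak, otherwise wait for the window,
    unless waiting would leave too little time to finish before departure."""
    hours_needed = needed_kwh / charger_kw
    if plug_in.hour >= OFF_PEAK_START_HOUR or plug_in.hour < OFF_PEAK_END_HOUR:
        return plug_in
    window_start = plug_in.replace(hour=OFF_PEAK_START_HOUR, minute=0, second=0)
    if window_start < plug_in:
        window_start += timedelta(days=1)
    # Fall back to charging right away if the off-peak window starts too late.
    if window_start + timedelta(hours=hours_needed) > departure:
        return plug_in
    return window_start

print(charge_start_time(datetime(2015, 8, 20, 19, 30), 24, 6.6,
                        datetime(2015, 8, 21, 7, 0)))  # waits until 23:00
```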

Researchers, including those at the Detroit technology incubator NextEnergy Center, are working on new ways to connect these and many other disparate systems to leverage even more information and harness the energy storage in idle PEVs for additional savings. Among the numerous projects at NextEnergy are studies of vehicle-to-building (V2B) and microgrid systems. A small house at the center has been equipped with a direct current (DC) wiring system and appliances that enable easier and more efficient energy transfer between the rooftop solar panels, the interior of the house, and the battery electric vehicle parked outside. Another study is developing a model of how many PEVs would be needed to strike an optimum balance between vehicle cost and the energy cost savings from reducing peak demand for commercial buildings.
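
A back-of-the-envelope sketch captures the basic trade-off in that fleet-sizing question. Every figure here (per-vehicle discharge power, availability, demand charges) is an illustrative assumption, not NextEnergy's actual model.

```python
# Illustrative V2B fleet sizing; all parameters are assumed values.
import math

def pevs_needed(peak_reduction_kw, per_vehicle_kw=10, availability=0.6):
    """How many PEVs are needed to shave a given peak, assuming each can
    discharge per_vehicle_kw and only a fraction are plugged in at peak time."""
    return math.ceil(peak_reduction_kw / (per_vehicle_kw * availability))

def annual_savings(peak_reduction_kw, demand_charge_per_kw_month=15):
    """Avoided demand charges from trimming the monthly billed peak."""
    return peak_reduction_kw * demand_charge_per_kw_month * 12

print(pevs_needed(120), annual_savings(120))  # 20 vehicles, $21,600 per year
```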

The NextEnergy Center will host a 1-day conference on September 9, 2015, called the V2B Mashup. The event includes panels and presentations with speakers from Cisco, General Motors, Visteon, the U.S. Department of Energy, DTE Energy, and more. Seating is limited, and online registration is available at www.nextenergy.org/v2bmashup.

 

High-Accuracy Mapping: An Opportunity for the Post Office?

— June 23, 2015

Synergy is one of the most overused and abused words in business. Whenever this word is uttered, it’s time to break out a big hunk of salt. However, at the recent TU-Automotive Detroit conference, an actual synergistic opportunity popped up in the course of discussion. The U.S. Postal Service (USPS)—and by extension, other postal services globally—could play an important role in the future of automated driving. According to Navigant Research’s Autonomous Vehicles report, nearly 95 million vehicles with some autonomous capability will be on the world’s roads by 2035.

High-Resolution and High-Accuracy Mapping

One of the most common topics during the 2-day gathering of people involved in automated driving and connectivity was the need for high-resolution, high-accuracy mapping data. Alain De Taeye, a management board member at TomTom, gave a keynote presentation on the requirements for highly automated driving systems. While onboard sensors that detect the immediate surroundings, together with a global positioning system (GPS) receiver, are clearly critical components, they are insufficient for robust automated control. Maps can extend visibility well beyond the line of sight of either the driver or the sensor system.

More importantly, the combination of high-definition 3D maps and sensors enables greater capability than either on its own. For example, GPS sensors are notoriously unreliable in the urban canyons where automated vehicles offer some of their most important potential benefits. As satellite signals bounce off tall buildings set closely together, a GPS-only system often places the user far from their actual location. On the other hand, cameras and LIDAR sensors can contribute to a fused real-time map of the surroundings that can be correlated with stored maps for validation, providing more accurate and precise location information.
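
As a toy example of that correlation step, the sketch below slides a sensor-derived occupancy grid around the rough GPS fix over a stored map and keeps the offset that agrees best. Production localization systems are far more sophisticated; this only illustrates how a stored map can correct a noisy GPS position.

```python
# Toy grid-matching localization: correct a noisy GPS fix using a stored map.
import numpy as np

def refine_position(stored_map, local_scan, gps_cell, search=5):
    """stored_map: 2-D 0/1 occupancy grid; local_scan: smaller 0/1 grid built
    from camera/LIDAR data, centered on the vehicle; gps_cell: (row, col)
    suggested by GPS. Returns the nearby cell whose map window agrees best."""
    h, w = local_scan.shape
    best_score, best_cell = -1, gps_cell
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = gps_cell[0] + dr, gps_cell[1] + dc
            top, left = r - h // 2, c - w // 2
            if top < 0 or left < 0:
                continue                      # candidate falls off the map
            window = stored_map[top:top + h, left:left + w]
            if window.shape != local_scan.shape:
                continue
            score = int(np.sum(window == local_scan))  # cells that agree
            if score > best_score:
                best_score, best_cell = score, (r, c)
    return best_cell

# Example: a 3x3 scan matched against a 20x20 stored map near a noisy GPS fix.
stored = np.zeros((20, 20), dtype=int)
stored[10, 8:11] = 1                          # a wall the sensors can also see
scan = np.array([[0, 0, 0], [1, 1, 1], [0, 0, 0]])
print(refine_position(stored, scan, gps_cell=(12, 11)))  # -> (10, 9)
```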

De Taeye discussed the sources of data used by TomTom and other map providers, including HERE and Google. By blending data from satellite imagery, government data, real-time crowdsourced information, and fleets of vehicles that traverse the actual roads, maps are constantly updated. De Taeye emphasized the need for continuous updates on road information to ensure accuracy as well as precision, which is where the USPS could come to the rescue. Even companies as large as Google have practical limits on how frequently they can drive down each road.

Capturing Data with Future USPS Vehicles

Ryan Simpson, an electrical engineer with the USPS, attended the conference to learn about new technologies that could potentially be put to use in future service vehicles. With more than 150,000 daily delivery vehicles and another 100,000 vehicles of various form factors, the USPS has the largest commercial vehicle fleet in the world. Those 150,000 delivery vehicles traverse a huge proportion of the roads in the United States 6 days a week, 52 weeks a year. The USPS is currently defining a next-generation delivery vehicle to replace its rapidly aging fleet. If the new vehicles were equipped with cameras and sensors, they could capture data far more frequently than any of the existing mapping companies. Real-world data about everything from road construction to bridge problems to potholes could be updated daily.

Given the persistent financial difficulties of the USPS, providing fresh and reliable navigational data to mapping companies could provide a significant revenue stream that helps support a very important service to the U.S. population. At the same time, such data would also help to enable automated driving systems. This would be genuine synergy.

 

The Information Reality behind the Intelligent Building

— June 9, 2015

Big data and the Internet of Things (IoT) are the buzz when it comes to intelligent buildings. A slew of vendors are labeling their solutions with these terms and coming to market with a message of cost-effective intelligence that will redefine how we live and work in buildings. But are we ready?

In mid-May, I attended Haystack Connect, an event that brought together a vibrant vendor community tackling the realities of developing intelligent buildings. The panels and conversations centered on a vision for open-source data modeling via Project Haystack. According to the project’s website, Project Haystack is an open source initiative to streamline working with data from the IoT and to standardize semantic data models and web services, with the goal of making it easier to unlock value from the vast quantity of data generated by the smart devices that permeate our homes, buildings, factories, and cities. The project focuses on applications including automation, control, energy, HVAC, lighting, and other environmental systems.

Two lessons stood out. First, big data is a marketing tagline, but building owners want to know what it does for them. Second, the IoT can generate a whole lot of information, but the key is accuracy and action.

Big Data: More Isn’t Necessarily Better

The demand for intelligence is ubiquitous, from smartphones to smart watches, and the notion of data-driven decision-making is helping to accelerate customer demand for smart buildings. Getting the data out of large existing buildings and making sense of what it means across an enterprise is no small feat. As one speaker put it, “This problem is not the domain of the data scientist.” In other words, building technology and engineering expertise has to be a part of the equation. In the Project Haystack world, this means cleaning and processing system information with consistent approaches via tags that speak the same language, as illustrated in the sketch below. Without common naming, analytics can hit a wall.
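
The tag names in this sketch loosely follow Project Haystack conventions (point, sensor, temp, zone, air), but the code is an illustration of the tagging idea, not the Haystack reference implementation or its API.

```python
# Illustration of tag-based semantic modeling for building points.
points = [
    {"id": "bldgA.AHU1.ZN-T",    "tags": {"point", "sensor", "temp", "zone", "air"}},
    {"id": "bldgB.RTU3.ZoneTmp", "tags": {"point", "sensor", "temp", "zone", "air"}},
    {"id": "bldgA.AHU1.SF-C",    "tags": {"point", "cmd", "fan"}},
]

def find(points, *required_tags):
    """Tag-based query: vendors can name points however they like, yet
    analytics can still find every zone temperature sensor the same way."""
    return [p["id"] for p in points if set(required_tags) <= p["tags"]]

print(find(points, "zone", "temp", "sensor"))
# -> ['bldgA.AHU1.ZN-T', 'bldgB.RTU3.ZoneTmp']
```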

The Promise of Data Granularity

The trajectory for device connectivity is impressive, and underlying this evolution in technology adoption is the maturation of cost-effective tools that make actionable building intelligence accessible to an ever-growing audience. Wireless sensors and controllers not only add granularity to the assessment of building performance, but also open the door to smaller facilities that have been out of reach for the legacy building controls industry. The exposure of new applications to a wider audience is a critical step in the market maturation process for smart buildings. As these solutions are adopted across customer segments, market awareness and business value will only increase.

 

In Profit Crunch, Oil Firms Look to Big Data

— June 5, 2015

As the price of crude continues to fall and the availability of places for oil companies to store oil shrinks, oil and gas companies are looking for ways to reduce costs and preserve profits. Operational efficiency is a familiar path—one that leads to layoffs, up to 75,000 coming at companies big and small, as reported by Continental Resources. At the same time, some companies are turning to big data as a way to make operations and exploration more efficient. We’ve written about the large potential for big data to make buildings, for example, more efficient. And it’s clear that the value of big data lies in its context.

In the case of oil and gas, it is important to keep in mind how diverse this industry is. The use of data in oil exploration and production is wholly different from its employment in oil refining, distribution, and marketing.

Down the Stream

According to panelists at a recent Cleantech Forum session on the digitization of the oil and gas industry, there’s scant consensus on data models and formats within single business units, let alone across an entire company or the industry. Nor is digitization itself universal. This presents a clear challenge to the industry: data analytics are only useful when the data is consistently collected and, well, analyzed. But it also presents an opportunity. Any company that can figure out how to collect, integrate, and analyze data across the oil and gas stream—from wellhead to gas pump—will be able to unlock the potential of both operational efficiency and optimization. Those gains in efficiency will save money and help companies achieve their sustainability goals.

A few companies are already testing that promise. WellAware is looking to build a new Internet of Things (IoT) network for oil and gas, giving customers a view into the production, conveyance, and processing of petroleum products. The Texas-based firm deploys sensors and gathers data from existing monitors to provide visualizations and analytics on system performance. To compete with OSIsoft, an incumbent in oil and gas data collection and historian services, WellAware will provide hardware and advanced analytics—two capabilities that OSIsoft either does not offer or outsources.

Human Input

A different approach, based on analysis of large time series datasets, is offered by Mtelligence Corporation and MapR, a provider of a Hadoop-based data platform. Called Mtell Reservoir, the solution focuses on real-time and historical sensor data analysis to provide system managers with operational insight. Given the large volume of data gathered in a drilling operation and the time it takes to load and analyze it, an in-stream solution would have great value.
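
As a generic illustration of in-stream analysis (not Mtell Reservoir's actual method), a monitor can flag readings that drift far from their recent history as they arrive, rather than waiting for a batch load. The sensor, window size, and threshold below are all assumptions.

```python
# Generic rolling z-score anomaly detection on a streaming sensor feed.
from collections import deque
from statistics import mean, stdev

class StreamMonitor:
    def __init__(self, window=120, threshold=3.0):
        self.history = deque(maxlen=window)   # recent readings only
        self.threshold = threshold

    def update(self, reading):
        """Return True if the new reading looks anomalous vs. recent history."""
        anomalous = False
        if len(self.history) >= 30:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) > self.threshold * sigma:
                anomalous = True
        self.history.append(reading)
        return anomalous

monitor = StreamMonitor()
for psi in [3000 + i % 5 for i in range(100)] + [3450]:
    if monitor.update(psi):
        print("alert: wellhead pressure deviates from recent baseline:", psi)
```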

These big data solutions are poised to give oil and gas operators greater intelligence and insight into operations. However, they don’t close the loop on operations or eliminate the need for people to make decisions. This is due in part to the complex nature of drilling through multifaceted substrates and processing materials of varying quality. Production technologies like directional drilling and fracking have changed the oil and gas business and are partly responsible for the current low oil prices. Data analytics may help stem the profit losses in the near term.

 
