Navigant Research Blog

High-Accuracy Mapping: An Opportunity for the Post Office?

— June 23, 2015

Synergy is one of the most overused and abused words in business. Whenever this word is uttered, it's time to break out a big hunk of salt. However, at the recent TU-Automotive Detroit conference, an actual synergistic opportunity popped up in the course of discussion. The U.S. Postal Service (USPS)—and by extension, other postal services globally—could play an important role in the future of automated driving. According to Navigant Research's Autonomous Vehicles report, nearly 95 million vehicles with some autonomous capability will be on the world's roads by 2035.

High-Resolution and High-Accuracy Mapping

One of the most common topics to arise during the two-day gathering of people involved in automated driving and connectivity was the need for high-resolution, high-accuracy mapping data. Alain De Taeye, management board member at TomTom, gave a keynote presentation on the requirements for highly automated driving systems. While a global positioning system (GPS) receiver and sensors that detect the immediate surroundings are clearly critical components, they are insufficient on their own for robust automated control. Maps can extend visibility well beyond the line of sight of either the driver or the sensor system.

More importantly, the combination of high-definition 3D maps and sensors enables greater capability than either offers on its own. For example, GPS receivers are notoriously unreliable in the urban canyons where automated vehicles offer some of their most important potential benefits. As satellite signals bounce off closely spaced tall buildings, a GPS-only system often places the user far from their actual location. Cameras and LIDAR sensors, on the other hand, can contribute to a fused real-time map of the surroundings; correlating that map against the stored map validates the sensor data and yields more accurate and precise location information.
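
To make the map-plus-sensor fusion concrete, here is a deliberately minimal sketch (all names and numbers are hypothetical, and production systems use full registration algorithms such as ICP plus a Kalman filter): landmarks detected by LIDAR, matched against their surveyed positions in the stored map, pin down the vehicle far more tightly than a bounced GPS signal can.

```python
import numpy as np

def correct_position(gps_fix, lidar_offsets, map_landmarks):
    """Refine a noisy GPS fix by aligning LIDAR landmarks to a stored map.

    gps_fix       : (x, y) from GPS; in practice used to look up nearby
                    map landmarks, and the fallback if nothing matches
    lidar_offsets : landmark positions measured relative to the vehicle
    map_landmarks : the same landmarks' surveyed positions in the HD map
    """
    if len(map_landmarks) == 0:
        return np.asarray(gps_fix, dtype=float)   # nothing to correct with
    # Each matched landmark implies a vehicle position: surveyed position
    # minus the vehicle-relative offset at which the landmark was seen.
    implied = np.asarray(map_landmarks, dtype=float) - np.asarray(lidar_offsets, dtype=float)
    return implied.mean(axis=0)   # average out per-landmark measurement noise

# GPS bounced off buildings and reports (105, 42); two mapped landmarks
# seen by LIDAR both imply the vehicle is actually at (100, 50).
fix = correct_position(
    gps_fix=(105.0, 42.0),
    lidar_offsets=[(3.0, 10.0), (-5.0, 12.0)],
    map_landmarks=[(103.0, 60.0), (95.0, 62.0)],
)
print(fix)   # -> [100.  50.]
```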

De Taeye discussed the sources of data used by TomTom and other map providers, including HERE and Google. By blending satellite imagery, government data, real-time crowdsourced information, and observations from fleets of vehicles that traverse the actual roads, maps are constantly updated. De Taeye emphasized the need for continuous updates on road information to ensure accuracy as well as precision. Even companies as large as Google have practical limits on how frequently they can drive down each road, which is where the USPS could come to the rescue.
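
One way to picture the blending step, purely as an illustration (the providers' actual pipelines are proprietary, and these field names are invented): each source reports an observation of a road segment with a timestamp and a per-source confidence, and the freshest trusted report wins.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    segment_id: str     # which stretch of road
    attribute: str      # e.g. "speed_limit", "lane_count"
    value: object
    source: str         # "satellite", "government", "crowd", "survey_fleet"
    timestamp: float    # when the observation was made (Unix time)
    confidence: float   # 0.0-1.0, assigned per source

def blend(observations, min_confidence=0.6):
    """Keep, for each (segment, attribute), the freshest observation we trust."""
    best = {}
    for obs in observations:
        if obs.confidence < min_confidence:
            continue            # too noisy to apply on its own
        key = (obs.segment_id, obs.attribute)
        if key not in best or obs.timestamp > best[key].timestamp:
            best[key] = obs
    return best
```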

Capturing Data with Future USPS Vehicles

Ryan Simpson, an electrical engineer with the USPS, attended the conference to learn about new technologies that could potentially be put to use in future service vehicles. With more than 150,000 daily delivery vehicles and another 100,000 vehicles of various form factors, the USPS has the largest commercial vehicle fleet in the world. Those 150,000 delivery vehicles traverse a huge proportion of the roads in the United States 6 days a week, 52 weeks a year. The USPS is currently in the process of defining a next-generation delivery vehicle to replace its rapidly aging fleet. If the new vehicles were equipped with cameras and other sensors, they could capture data far more frequently than any of the existing mapping companies. Real-world data about everything from road construction and bridge problems to potholes could be updated daily.
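
A toy illustration of why that traversal frequency matters (segment IDs and attributes here are invented): each daily pass is diffed against the stored map, so a pothole that a dedicated survey fleet might not revisit for months gets flagged within a day.

```python
def diff_against_map(stored_map, todays_pass):
    """Compare one day's sensor capture against the stored map.

    Both arguments map a road-segment ID to a dict of observed attributes,
    e.g. {"surface": "pothole", "construction": True}. Returns only the
    segments whose attributes changed since the map was last updated.
    """
    changes = {}
    for segment, observed in todays_pass.items():
        known = stored_map.get(segment, {})
        delta = {k: v for k, v in observed.items() if known.get(k) != v}
        if delta:
            changes[segment] = delta
    return changes

stored = {"US-1:mile12": {"surface": "good", "construction": False}}
seen   = {"US-1:mile12": {"surface": "pothole", "construction": False}}
print(diff_against_map(stored, seen))
# -> {'US-1:mile12': {'surface': 'pothole'}}
```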

Given the persistent financial difficulties of the USPS, selling fresh and reliable navigational data to mapping companies could generate a significant revenue stream that helps support a vital service to the U.S. population. At the same time, such data would help enable automated driving systems. This would be genuine synergy.

 

The Information Reality behind the Intelligent Building

— June 9, 2015

Big data and the Internet of Things (IoT) are the buzz when it comes to intelligent buildings. A slew of vendors are branding their solutions with these terms and coming to market with a message of cost-effective intelligence that will redefine how we live and work in buildings. But are we ready?

In mid-May, I attended Haystack Connect, an event that brought together a vibrant vendor community tackling the reality of developing the intelligent building. The panels and conversations circled around a vision for open-source data modeling via Project Haystack. According to Project Haystack's website, the project is an open-source initiative to streamline working with data from the IoT by standardizing semantic data models and web services, with the goal of making it easier to unlock value from the vast quantity of data generated by the smart devices that permeate our homes, buildings, factories, and cities. The applications the project focuses on include automation, control, energy, HVAC, lighting, and other environmental systems.

Two lessons learned: First off, big data is a marketing tagline, but building owners want to know what it does for them. Second, the IoT can generate a whole lot of information, but the key is accuracy and action.

Big Data: More Isn’t Necessarily Better

The demand for intelligence is ubiquitous, from smartphones to smart watches, and the notion of data-driven decision-making is helping to accelerate customer demand for smart buildings. Getting the data from large existing buildings and making sense of what it means across an enterprise is no small feat. As one speaker put it, “This problem is not the domain of the data scientist.” In other words, building technology and engineering expertise has to be part of the equation. In the Project Haystack world, this means cleaning and processing system information with consistent approaches, via tags that speak the same language. Without common naming, analytics can hit a wall.
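
Haystack's answer is marker tags: every point carries a set of semantic tags, so analytics can query by meaning rather than by each vendor's naming scheme. A simplified sketch of the idea (the real Project Haystack specification defines a richer type system and REST services):

```python
# Each point is described by marker tags (presence = true) and value tags.
points = [
    {"id": "ahu1-dat", "point": True, "sensor": True,
     "discharge": True, "air": True, "temp": True, "unit": "°F"},
    {"id": "ahu1-fan", "point": True, "cmd": True, "fan": True},
]

def find(points, *markers):
    """Return the points carrying all of the requested marker tags."""
    return [p for p in points if all(p.get(m) is True for m in markers)]

# "All discharge-air temperature sensors," regardless of vendor naming:
for p in find(points, "discharge", "air", "temp", "sensor"):
    print(p["id"])   # -> ahu1-dat
```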

The Promise of Data Granularity

The trajectory for device connectivity is impressive, and underlying the evolution in technology adoption is the maturation of cost-effective tools that make actionable building intelligence accessible to an ever-growing audience.  Wireless sensors and controllers can not only add granularity to the assessment of building performance, but also open the door to smaller facilities that have been out of reach for the legacy building controls industry.  The exposure of new applications to a wider audience is a critical step in the process of market maturation for smart buildings. As these solutions become adopted across customer segments, market awareness and business value will only increase.

 

In Profit Crunch, Oil Firms Look to Big Data

— June 5, 2015

As the price of crude continues to fall and the availability of places for oil companies to store oil shrinks, oil and gas companies are looking for ways to reduce costs and preserve profits. Operational efficiency is a familiar path—one that leads to layoffs, up to 75,000 coming at companies big and small, as reported by Continental Resources. At the same time, some companies are looking to big data as a way to make operations and exploration more efficient. We've written about the large potential for big data to make buildings, for example, more efficient. And it's clear that the value of big data lies in its context.

In the case of oil and gas, it is important to keep in mind how diverse this industry is. The use of data in oil exploration and production is wholly different from its employment in oil refining, distribution, and marketing.

Down the Stream

According to panelists at a recent Cleantech Forum session on the digitization of the oil and gas industry, there's scant consensus on data models and formats within single business units, let alone across an entire company or the industry. The spread of digitization is not universal, either. This presents a clear challenge: data analytics are only useful when the data is consistently collected and, well, analyzed. But it also presents an opportunity. Any company that can figure out how to collect, integrate, and analyze data across the oil and gas stream—from wellhead to gas pump—will be able to unlock the potential of both operational efficiency and optimization. Those gains in efficiency will save money and help the companies achieve their sustainability goals.
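
The integration problem in miniature (record shapes, field names, and values here are invented for illustration): two business units log the same wellhead pressure in different formats and units, and neither feed is analyzable until both are normalized into one schema.

```python
PSI_PER_KPA = 0.145038

def normalize(record):
    """Map two unit-specific record shapes onto one common schema."""
    if "well_id" in record:                     # upstream format, psi
        return {"asset": record["well_id"],
                "pressure_psi": record["psi"],
                "ts": record["time"]}
    if "assetTag" in record:                    # downstream format, kPa
        return {"asset": record["assetTag"],
                "pressure_psi": record["kPa"] * PSI_PER_KPA,
                "ts": record["timestamp"]}
    raise ValueError("unknown record format")

readings = [
    {"well_id": "TX-0042", "psi": 1450.0, "time": 1433500000},
    {"assetTag": "TX-0042", "kPa": 9997.0, "timestamp": 1433503600},
]
# Both readings come out near 1,450 psi under the shared schema.
print([normalize(r) for r in readings])
```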

A few companies are already testing that promise. WellAware is looking to bring a new Internet of Things (IoT) network to oil and gas, providing customers a view into the production, conveyance, and processing of petroleum products. The Texas-based firm deploys sensors and gathers data from existing monitors to provide visualizations and analytics on system performance. To compete with OSIsoft, an incumbent in oil and gas data collection and historian services, WellAware will provide hardware and advanced analytics—two capabilities that OSIsoft either lacks or outsources.

Human Input

A different approach, one based on analysis of large time series datasets, is offered by Mtelligence Corporation and MapR, which provides a distribution of the open-source Hadoop platform. Called Mtell Reservoir, the solution focuses on real-time and historical sensor data analysis to give system managers operational insight. Given the large volume of data gathered in a drilling operation and the time it takes to load and analyze that data after the fact, an in-stream solution will have great value.
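
The value of in-stream analysis can be sketched in a few lines (this rolling-baseline check is a stand-in for illustration, not Mtell's actual method, which is not public): each reading is scored the moment it arrives, avoiding the load-then-analyze delay entirely.

```python
from collections import deque
import math

class StreamMonitor:
    """Flag sensor readings that drift far from a rolling baseline."""

    def __init__(self, window=100, threshold=3.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold          # in standard deviations

    def observe(self, value):
        """Score one reading as it arrives; True means anomalous."""
        anomalous = False
        if len(self.readings) >= 10:        # need a minimal baseline first
            n = len(self.readings)
            mean = sum(self.readings) / n
            std = math.sqrt(sum((x - mean) ** 2 for x in self.readings) / n)
            if std > 0 and abs(value - mean) > self.threshold * std:
                anomalous = True
        self.readings.append(value)
        return anomalous

monitor = StreamMonitor()
for pressure in [101, 100, 102, 99, 101, 100, 98, 101, 100, 102, 250]:
    if monitor.observe(pressure):
        print("alert:", pressure)           # -> alert: 250
```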

These big data solutions are poised to give oil and gas operators greater intelligence and insight into operations. However, they don't close the loop on operations: people are still needed to make decisions. This is due in part to the complex nature of drilling through multifaceted substrates and processing materials of varying quality. Production technologies like directional drilling and fracking have changed the oil and gas business and are partly responsible for the current low oil prices. Data analytics may help stem the profit losses in the near term.

 

Novel Microgrid Architectures Face Regulatory Hurdles – Even in New York and California

— June 4, 2015

If I had to pick two states that are leading the charge on reinventing electric utilities, they would be New York and California. Yet, even in these state laboratories of regulatory reform, novel forms of distribution networks (often referred to as microgrids) that rely upon the inherent advantages of direct current (DC) are facing obstacles.

The core challenge facing DC distribution networks lies with the need for standards and open grid architectures that can help integrate the increasing diversity of resources being plugged into retail power grids. This, among other issues, is the focus of the first major conference sponsored by the Institute of Electrical and Electronics Engineers (IEEE) on DC distribution networks. The conference will take place in Atlanta, Georgia, from June 7 through June 10.

In New York, Pareto Energy of Washington, D.C., obtained preliminary engineering approval from Consolidated Edison (and a $2 million grant from the New York State Energy Research and Development Authority [NYSERDA]) to install its patented GridLink microgrid controller at the 12.8 MW combined heat and power (CHP) plant that serves Kings Plaza Shopping Center on the Brooklyn waterfront. GridLink converts power from each generation source (including grid power) from alternating current (AC) to DC, collects all the power on a common DC bus, converts that DC power back to AC, and distributes it to any load (including those on the utility grid). All the while, each power source remains electrically isolated. In short, GridLink creates a non-synchronous, plug-and-play microgrid.
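
A simplified numerical model of that topology (the conversion efficiencies are illustrative guesses, not Pareto's figures): each source is independently rectified onto the shared DC bus, so no source ever synchronizes with another, and the bus total is inverted once for the AC loads.

```python
RECTIFIER_EFF = 0.97   # AC -> DC stage, illustrative value
INVERTER_EFF  = 0.97   # DC -> AC stage, illustrative value

def bus_output_kw(sources_kw):
    """Power delivered to AC loads after the AC->DC->AC round trip.

    Each source (CHP, grid tie, etc.) is rectified onto a common DC bus
    on its own, so no source ever needs to synchronize with another.
    """
    dc_bus_kw = sum(kw * RECTIFIER_EFF for kw in sources_kw.values())
    return dc_bus_kw * INVERTER_EFF

sources = {"chp": 12800.0, "utility_tie": 2000.0}   # kW, hypothetical dispatch
print(round(bus_output_kw(sources)))                # -> 13925 kW after losses
```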

Although Kings Plaza has never been connected to Consolidated Edison's grid, its CHP plant provides electric and thermal energy to the center at less than half the cost of equivalent utility services. Under the plan, 8 MW of low-cost power from the CHP unit will be exported to the utility grid and could be used to serve nearby low-income communities during a major power outage. Despite these potential benefits, regulatory snags have delayed the project. Pareto has also filed a petition with the New York Public Service Commission, claiming discrimination against its lower-cost alternative to traditional power delivery infrastructure for meeting contingency reliability requirements within the Consolidated Edison service territory.

The View from the Other Coast

In California, the issues are different, but they also involve DC. One case involves Bosch, which was awarded a $2.8 million California Energy Commission grant to develop a high-penetration solar PV DC microgrid at an American Honda Motor Co. parts distribution center in Southern California. The project is designed to validate the efficiency benefits of a patented system that directly connects DC power flowing from solar PV to LED lighting and DC ventilation systems within the building, as well as to a DC energy storage device. The benefits of DC in this project include lower installation and operating costs. In addition, the project is pioneering the application of a DC distribution network within existing building codes in order to boost reliability.
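
Rough numbers show why the direct DC connection matters (efficiencies are illustrative, not Bosch's measured figures, and DC-DC regulation losses are ignored for simplicity): serving a DC load straight from PV skips both the PV inverter and the rectifier inside the load.

```python
INVERTER_EFF  = 0.96   # PV DC -> building AC, illustrative
RECTIFIER_EFF = 0.94   # AC -> DC inside an LED driver, illustrative

def delivered_kw(pv_kw, direct_dc):
    """kW actually reaching a DC load (e.g. LED lighting) from pv_kw of PV."""
    if direct_dc:
        return pv_kw   # DC stays DC: no inverter, no rectifier in the path
    return pv_kw * INVERTER_EFF * RECTIFIER_EFF   # conventional AC detour

pv = 100.0
print(round(delivered_kw(pv, direct_dc=False), 2))  # -> 90.24 kW via AC
print(round(delivered_kw(pv, direct_dc=True), 2))   # -> 100.0 kW via DC bus
# The AC round trip wastes roughly 10% of PV output in this illustration.
```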

While Bosch says it has not run into any problems with building codes or other such potential obstacles to its DC building grid business model, it has identified an interesting dilemma. Since state subsidies for both solar PV and energy storage are linked to the size of the inverter interconnecting with the AC grid, DC technologies appear to be discriminated against, despite the fact that they are more efficient and reliable.

In both cases, the status quo is being challenged by new technology revolving around a non-synchronous microgrid that incorporates the advantages of DC. This is the subject of my next report, Direct Current Distribution Networks, expected to be published later this month.

 
