Navigant Research Blog

When in Doubt, Take a Survey

— May 28, 2013

Cecil Adams of The Straight Dope once supposedly wrote, “Around here, we don’t vote on the facts.”  That was before the age of online surveys.  Once again I have in my inbox a request to participate with other executives in a survey of the current sentiment and outlook of the smart grid industry.

When I see these surveys I wonder, “Who really cares what we think?”  The electrons don’t care.  The untrimmed trees under the high voltage lines don’t care.  The hostile nation-state hackers certainly don’t care.  The ratepayers – sorry, I meant to call them customers – don’t care.  And if someone thinks that I am a “smart grid executive,” then I hate to think who else has been identified as an “executive.”

As a market research professional, I admit that I look down my nose at surveys.  They are not primary research, as many losers in last November’s U.S. elections are now aware.  My research involves a lot of time on the telephone, asking questions of key stakeholders in a given research area, then synthesizing diverse responses into one or two theses.  This is rarely straightforward.  One slide from my conference presentation deck asks, “What is the No. 1 cyber security problem facing utilities?”  During one research project I asked 33 people this question and got 28 distinct answers.  It takes three slides to answer what you’d think is a really simple question.

Not Another Monkey

That is what research looks like.  The notion that you can just blast out another SurveyMonkey questionnaire to an anonymous audience, arbitrarily designate that audience as executives, and from there develop conclusions about the industry… just doesn’t sit well with me.

But surveys produce numbers.  Numbers can be analyzed, operated upon, correlated, summarized.  And no matter the source, numbers somehow convey an air of certainty.  Especially if you have a large enough sample size and can claim a statistical error margin of +/- 3%.  That’s just got to be right, doesn’t it?

Not always.  Surveys of sentiment are qualitative.  This particular survey asks questions such as whether my company’s smart grid investment is going to increase, decrease, or stay the same.  Whether the increase is by $100 or by $1 billion, I tick the same box.  There is nothing quantitative going on here.  Yet we often ascribe to survey results the same strength as weather measurements or time signals.
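For what it’s worth, that +/- 3% badge is just arithmetic, and it depends on sample size alone.  A minimal sketch of that arithmetic (my own illustration, not anything from the survey in question), assuming a 95% confidence level and the worst-case 50/50 split:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Half-width of an approximate 95% confidence interval
        # for a proportion estimated from a simple random sample of size n.
        return z * math.sqrt(p * (1 - p) / n)

    # Roughly 1,067 respondents is all it takes to claim +/- 3%.
    print(round(margin_of_error(1067), 3))  # 0.03

The precision claim falls out of the headcount alone; it says nothing about whether the boxes being ticked measure anything real.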

In the spirit of full disclosure, Navigant Research does publish and sell an annual Smart Grid Consumer Survey.  We are upfront that it measures consumer sentiment, nothing more.  And I have used some of the survey results as supporting data for my research.  But I would never draw conclusions solely from anonymous surveys.

For us, research begins with lots of telephone time discussing issues with key stakeholders.  That research continues with our all-star research associates who spend their days tracking down untold quantities of obscure but useful information.  That is how you begin to understand the direction of a market.

 

Is ‘Strategic Intelligence’ an Oxymoron?

— April 11, 2012

An essay on TheAtlantic.com by Eric Garland, a former strategy analyst and the author of Future Inc: How Businesses Anticipate and Profit from What’s Next and of How to Predict the Future…and WIN!, recounts Garland’s growing disenchantment with the field in which he’s made a living for 15 years: strategic intelligence.

I don’t particularly trust anyone who writes books with titles like that (especially ones with totally ungrammatical subtitles), but Garland’s indictment is stinging and persuasive. “The market for intelligence is now largely about providing information that makes decision makers feel better, rather than bringing true insights about risk and opportunity. … Our future is now being planned by people who seem to put their emotional comfort ahead of making decisions based on real — and often uncomfortable — information.”

Garland is mostly talking about strategic intelligence at the corporate and nation-state level, but his definition of “strategic intelligence” (“researching trends, analyzing their potential impact, and reporting the possibilities to decision-makers”) could certainly apply to the field of clean technology research and analysis that we inhabit at Pike Research, as well.  He identifies three trends that are making it harder for empirical evidence and clear-eyed analysis to overcome institutional biases, internal politics, and short-term thinking.  First, “the explosion of cheap capital from Wall Street has led major industries to consolidate,” leaving a smaller pool of firms, many of which operate in markets distorted by politics, protectionism, and government handouts.  Second, this concentration of capital and economic clout has created giant bureaucracies in which “conventional thinking and risk avoidance become paramount.”  When you’re part of a large bureaucracy far removed from the real-world consequences of individual decisions and actions, it’s harder, and less rewarding, to base your thinking on strategic intelligence.

Finally, the influence of policy-makers is stronger than ever before.  This may seem counter-intuitive at a time when the United States can’t even craft a national energy policy, but Garland makes the case that national governments are now in the business of shielding large corporations – GM, Verizon, big banks – from the turbulent forces of globalized capitalism.

“How can you use classical competitive analysis to examine the future of markets when the relationships between firms and government agencies are so incestuous and the choices of consumers so severely limited by industrial consolidation?”

Watch Out for the Elephants

I have a couple of responses to this lament. One is that, although many cleantech sectors (electric vehicles and solar power, to name two) are certainly influenced by – many would say “distorted by” – government policy and government handouts, I have not found that this limits the usefulness of evidence-based analysis and quantitative market sizing and forecasting.  Quite the opposite: the companies we talk to every day need independent intelligence more than ever, in large part because the actions of governments can be so unpredictable and so market-changing.  When you’re trying to run through an elephant herd it helps to know which way the trunks are swinging, as it were.

Second, the lamentable state of strategic intelligence is not news.  Garland never refers to the invasion of Iraq or to the intelligence failures (or misuses) that led up to it, but his critique certainly springs from the dark days of 2002, when an entire generation of CIA intelligence gatherers and analysts saw their work distorted and repurposed to further a predetermined foreign policy objective: the invasion of Iraq.  In 2007 John Heidenrich wrote a long essay on the CIA’s official website called “The State of Strategic Intelligence,” which made the same complaint that Garland makes today: “The architects of the National Security Act of 1947 would be greatly surprised by today’s neglect of strategic intelligence in the Intelligence Community.”

Last year former Fortune managing editor Walter Kiechel III published an essay on the Harvard Business Review site in which he noted that the entire business model of corporate strategic analysis has shifted: “Behemoths such as McKinsey and BCG … have broadened what they do and moved down the food chain. McKinsey teams are beavering away in places like the United Arab Emirates and the ‘Stans — Turkmenistan, say, or Tajikistan — but they’re as likely to be doing operations projects as pure strategy work.”  These days the real money, Kiechel notes, lies not in corporate strategy but in “semi-permanent, year-in, year-out relationships with companies rich enough to pay scores of millions annually for help and advice.”

That reminds me of the old Woody Allen joke: “I know therapy works – I’ve been doing it for 30 years!”  (To be sure, though, ongoing customized client relationships are often not only more lucrative to the consultant but more valuable to the client than one-off, high-level strategic studies.)

Garland’s overall point is inarguable.  “The study of the future used to be easier to sell, maybe because the analysis usually predicted the growth of the consumer economy or the next great gadget,” he writes.  “But the future is no longer nearly as palatable, and the customers are less interested.”

But the customers who aren’t interested in hard truths about an unpalatable future aren’t good customers, anyway, because they’re not going to be around for very long.

 
