Want to know what $10 and access to Google Survey Tools will get you? How about a quick and dirty survey about crazy Canadian consumer purchasing habits? If you don't believe me, check out the chart below:
Yes, these are real survey results. Yes, they suggest that more people would buy a banana than a hockey puck worth a lot more than the $0.99 investment. No, it does not bode well for old iPhones or the longevity of everyone's favorite novelty tune from Ylvis.
You got a lot out of that, didn't you?
Beyond the fact that Canadians are just plain strange, there is a point here. Operator surveys seem to have become super-popular in the mobile telecom space over the last year or so. Analyst shops launch them. Vendors launch them. Media outlets quote them. People just seem to love them. This makes sense. There are lots of new network technologies out there – from small cells to carrier WiFi to SDN and NFV – and everyone wants to figure out how they'll evolve. What do vendors need to do in order to meet carrier demands? How can they best time product development? Are there best practices for operators when it comes to deployment? With these types of questions out there, it just makes sense to get input from the people who are actually shaping the market. This input, after all, should tell us what the future will look like!
And yet, returning to my survey of Great White North inhabitants, it should be clear that not all surveys are created equal. Garbage in, as they say, yields garbage out, and that definitely holds when running a survey. If you want some interesting cocktail-party conversation, any survey might do. If you want to actually shape your sales, marketing, R&D, or deployment strategies, then survey design and analysis are much bigger concerns. And this is what worries me about the rash of surveys I'm seeing. Every one purports to paint a detailed, accurate view of the world. Some of them might actually do that, but only if they're paying attention to a number of key design issues. Things like…
· Sample. We've all been trained to believe that the more people we survey, the more accurate our results will be. If I'm trying to characterize the entire service provider market for LTE-Advanced upgrades, for example, talking to 100+ service providers is a lot better than talking to 30. Simple enough. Beyond sample size, however, there's also the question of who you talked to. Executives in charge of network operations? Strategy and planning executives? CMOs? CTOs? People with no decision-making authority whatsoever? Did you hit big operators and small operators? How about a good distribution across geographies? Enough of each to get meaningful results? The answers will speak to whether or not survey results map to reality. After all, as much as we'd like to think the world is a straightforward place where everyone thinks the same, we all know it's much more complex. If you want to understand adoption curves – much less forecast demand – knowing the details counts.
· Questions. It should be obvious, but the questions that get asked define the answers you get. Think about a survey where you were asked to assign a value to something. "On a scale of 1-to-10, how much do you like peanut butter, ice cream, olives, and liver?" How much thought are you going to put into your responses? Would you give a more accurate answer if you were forced to rank your interest in everything that's currently in my fridge? Here's another example. What if you want to figure out how many operators are planning to deploy Software Defined Networking (SDN) or Network Functions Virtualization (NFV) technologies in the next 12 to 18 months? Did you ask whether they were already deploying those technologies? It's unlikely that many are in the middle of deployment yet, right? Yet knowing that some (or many) operators think they are already deploying SDN and NFV would be particularly telling. You get the idea. Asking the right questions, in the right way, is just as critical as who you're talking to if you're really trying to understand a market.
· Analysis. On the topic of SDN/NFV surveys, we ran one at the tail end of last year (SHOCKER, I know). While the survey was in the field, I had an opportunity to talk with the CTO of a major US cellco. I mentioned that I was pretty excited about the survey. I'll never forget the CTO's response: "Sounds interesting, but I'm sure that if you talked to five different people in my organization, you'd get five different sets of responses." There are two ways to interpret the story. One is as an indictment of the survey. The other is as a reminder that we need to be careful in how we read the data. Talking to 30 service providers and extrapolating the findings to an entire market doesn't make any more sense than believing that, in a very young market, those service providers actually have the answers you're looking for. That's just bad analysis. The same holds for any concrete rankings or statements about a market that those service providers don't fully understand. Looking across the data for trends, or having "directional" faith in the answers at the extremes, is a better bet.
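The sample-size point above can be made concrete with a back-of-the-envelope calculation. A minimal Python sketch, assuming simple random sampling from a large population and the textbook 95% confidence formula (both generous assumptions for an operator survey, where the population is small and respondents are rarely random):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample of size n.

    p=0.5 is the worst case (maximum variance); z=1.96 is the
    z-score for 95% confidence. Assumes simple random sampling
    from a large population -- a shaky assumption for surveys
    of a market with only a few hundred operators.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (30, 100, 300):
    print(f"n={n}: +/- {margin_of_error(n):.1%}")
# n=30: +/- 17.9%
# n=100: +/- 9.8%
# n=300: +/- 5.7%
```

In other words, a 30-respondent survey claiming "42% of operators plan to deploy X" is really saying "somewhere between roughly 24% and 60%" – which is why directional reads beat precise-sounding percentages in small samples.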
There is a lot of survey data floating around the telecom world right now – surveys pertaining to technology updates, service uptake, spectrum usage, etc. This is generally a good thing; with a myriad of technology transitions facing vendors and operators alike, meaningful insights should help ensure those technologies get commercialized and deployed well. A problem arises, however, when survey data isn't meaningful or isn't interpreted correctly. Then the decisions those surveys are being used to support (and the money spent on them) are at risk.
This isn't a new problem. Surveys have been used for years to gauge service provider and consumer sentiment, and there has always been the risk that poorly executed surveys drive misguided strategies. Circa 2014, however, with so many different technology and service transitions facing operators, the issue matters more than ever. This, in turn, elevates the importance of "spending." Spending time, up front, to make sure you're asking the right questions. Spending the money necessary to reach an audience that makes sense. Spending time once the results come in to actually understand them vs. just looking for sound bites.
Peter Jarich is the VP of Consumer and Infrastructure at Current Analysis. Follow him on Twitter: @pnjarich.