
Annette Franz

Surveys: Don’t Believe Everything You Read

Valuable insights vs. misinformation

Published: Wednesday, August 21, 2019 - 11:02

Being a customer experience (CX) professional is hard enough; misinformation just makes our work more challenging. And misinformation or confusing information from someone with a huge following and a lot of influence makes it more challenging still.

Recently, Seth Godin published a post on his site titled “Sneaky Surveys (and Push Polls).” I’m a big Seth Godin fan, but this post made me pause. As of today, it’s already received almost 4,000 likes (stars) on his site. In his post, he makes six points about surveys. I’ll address each one.

Open access online surveys

His comment on this topic is: “All open access online surveys are essentially inaccurate, because the group that takes the time to answer the survey is usually different from the general public.”

I’ve seen very few truly open access online surveys. I have to assume he’s talking about polls posted on social media or on media sites (given some of his comments on the other five items). And polls, as you know, are different from surveys. If someone posts a poll on social media, I’m not sure they’re looking for feedback from the general public anyway; they are looking for (or will get) responses from people who care, negatively or positively, about the topic in question. If you’re not invested in the topic, or if it’s not relevant to you, the likelihood that you’ll respond to this type of survey—or any survey—is pretty slim. The critical thing is that, as the author of the survey, you must recognize this.

Now, if you’ve got a site intercept or a static survey on your site (not behind an account login), these are open access because there are no restrictions to access; anyone can come to your site to respond. But no one (general public) goes to your site just to take a survey. And you’re not looking for feedback from the general public, either; you’re looking for feedback from people who have come to your site to search, research, purchase, get support, etc. And those are the only people who will respond. Does that make the survey inaccurate? Um, no.

This is a good reminder to always define your objectives, know your audience, ask questions relevant to your audience, and present the survey in a way that gets you feedback from that audience.

Survey vs. census

Godin states: “Don’t confuse a survey with a census. A survey asks a randomized but representative group some questions and then seeks to extend the answers to the entire group as a whole. A census seeks to ask everyone in the group, so that no generalization is required.” He goes on for a couple more paragraphs about this one.

A survey is not always administered to a randomized group; sometimes it goes to your entire population. That’s what a census is: a type of survey (e.g., the U.S. Census) whose sample is the entire population. For example, a point-of-sale survey goes to your entire population, unless you only serve it up on every nth receipt, to every nth customer.
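The "every nth receipt" idea is systematic sampling, and it can be sketched in a few lines. This is a minimal illustration, not anything from Godin's post or the article; the step size and receipt labels are assumptions for the example.

```python
def every_nth(items, n):
    """Systematic sample: return every nth item from an ordered list
    (the nth, 2nth, 3rd-nth, ... items, 1-based)."""
    return items[n - 1 :: n]

# Hypothetical example: 100 point-of-sale receipts, survey invite on every 10th.
receipts = [f"receipt-{i}" for i in range(1, 101)]
sampled = every_nth(receipts, 10)
print(len(sampled))  # 10 of the 100 receipts carry the survey invite
```

With n = 1 this degenerates to the census case: every customer gets the survey.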

He goes on to say: “The huge mistake is believing that you need to survey more and more people. You don’t. And your work to reach more people actually makes your survey less accurate not more (see the first thing).”

I’m not sure how the survey becomes less accurate with more and more responses. He seems to be linking that to “open access,” but very few surveys are truly open access. No well-designed market research or voice of the customer (VoC) program will just lob a survey over the fence for anyone to respond. You have a sampling plan and want specific people, e.g., prospects and customers, to respond, not the general population. Otherwise, that’s a waste of time and money. I do agree with him on this: “What you need is a correctly representational group, which can be dramatically smaller than the entire population.”
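The point that a "correctly representational group ... can be dramatically smaller than the entire population" follows from basic sampling arithmetic: for a simple random sample, the margin of error on a proportion depends on the sample size, not (for large populations) on the population size. A back-of-the-envelope sketch, using the standard 95-percent formula as an assumption:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sampled proportion p
    with a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Margin shrinks with sample size, regardless of whether the
# population is 10,000 or 10 million.
for n in (100, 400, 1000):
    print(n, round(margin_of_error(n), 3))
```

Roughly 400 responses already put you within about five percentage points, which is why chasing "more and more people" buys little; a representative sample matters far more than a huge one.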

As customer experience professionals, we prefer to hear from more customers rather than fewer. The feedback we get is generally spread out over time, not captured at a single point in time, so responses accumulate. He makes a generalization about surveys, but there are ongoing transactional and relationship surveys, for example, for which you will get more and more feedback over time. And that does not make them inaccurate.

Survey anonymity

He starts off this point with: “You might believe the survey someone just emailed you to fill out is anonymous. It probably isn’t.” He is correct. He cites information from SurveyMonkey’s site about tracking IP addresses and email addresses (both built-in features). He then says: “If you get a survey link by email or even as you browse a site, it’s a safe guess to imagine that your answers are tied in some way to your other interactions with the organization that posted the survey. Respondent beware.”

Ouch. Respondent beware. Let’s scare the public. Guess what? You’re being tracked in your everyday life, everything you do. If people want personalized experiences, they’re going to have to give up data. That doesn’t mean surveys are sneaky. What’s sneaky is Amazon or Apple recording your conversations (without you knowing) and then using those data to present you with offers for the products you were talking about. Yeah, that’s happened to me. But then—all expectations of privacy are pretty much gone at this point.

Back to surveys. I don’t know of many customer experience surveys that are anonymous. If you follow some of the rules of VoC surveys, you’re going to personalize your email invitation and reminder. That’s the first sign that the survey isn’t anonymous. Not promising anonymity allows you to tie the customer response to the customer’s data so that you can close the loop with the customer, and so that the analysis is far more robust and insightful.

Push polls

Well, here’s a big problem. You can’t put surveys and push polls into the same bucket. They are very different and have very different purposes. His comment here is: “Asking someone a question can change the way they feel. Done crudely, this is called a push poll (‘Did you know that Bob was indicted last year?’) but even asking someone a thoughtful question about their satisfaction can increase it.”

Push polls are most commonly used in political campaigns to sway the respondent/voter. Political polls. Not your surveys. As you know, this approach is actually a huge no-no in surveys.

I’ve got two thoughts top of mind here, two rules you must adhere to: don’t ask leading questions, and don’t try to sell with your surveys! Well-designed surveys do neither of these.

Open-ended question

His next point is a painful one: “At the conclusion of the endless surveys when they ask you if you have anything else to add, don’t bother. It’s not like the CEO is busy reading your comments.”

While I can’t disagree 100 percent because there are still companies out there that do nothing with their feedback, this is such a broad, inaccurate blanket statement that it ought to either piss you off or motivate you to move. Don’t throw out the baby with the bath water! In many cases, the CEO is not reading the comments (though some do!), but there are other people in the organization—the ones who are actually going to act on them—who are reading (or using text analytics to glean insights). Insights from customer feedback are socialized throughout the organization in a variety of ways. In customer-centric organizations, the CEO is in the loop, getting a summary of findings in her dashboard and briefings.

Focus-group survey

Oh dear. What’s a focus-group survey? He says: “The single best way to figure out how people feel isn’t to ask them with some focus-group survey. It’s to watch what they do when given the choice. ‘This or that?’ is a great way to get to the truth of our preferences.”

Yes. Ethnographic research and other approaches for observing customers in their natural habitats are a great way to learn about customers and their preferences. As a matter of fact, focus groups are a useful tool for this, as well!

But observations don’t get at the why, and surveys aren’t just about preferences. Here’s the rub: Unless you ask, you won’t understand. Surveys are about understanding: yes, customer preferences, but also expectations, reasons, how customers felt about the experience, pain points, problems to solve, what went well and what didn’t, what they liked and didn’t like about the experience, and more.

Taking a broad brushstroke to surveys and putting a negative spin on them does no one any good. And mixing topics doesn’t, either. If you buy into, or aren’t sure about, any of the things Godin wrote about, please be sure to read these articles about VoC programs, surveys, survey invitations, closing the loop, socializing insights, and more.

It’s also a good time to remind you that the survey is a touchpoint, so you must consider the respondent experience—and make it a good one!

And remember that you can listen to customers in other ways, not just via surveys.

Is Godin’s post a kick in the pants for customer experience professionals and market researchers to do things better? Or has he simply confused a bunch of topics (that address a bunch of different audiences) that shouldn’t be tied together—or even written about at all? As always, I’d love to get your thoughts on this topic.

First published July 31, 2019, on the CX Journey blog.


About The Author

Annette Franz

Annette Franz, CCXP, is founder and CEO of CX Journey Inc. She has 25 years of experience helping companies understand their employees and customers and identify what drives retention, satisfaction, engagement, and the overall experience, so that together they can design a better experience for all constituents. She’s an author (she wrote the book on customer understanding!), a speaker, and a customer experience thought leader and influencer. She serves as vice chairwoman on the board of directors of the Customer Experience Professionals Association (CXPA), is an official member of the Forbes Coaches Council, and is an advisory board member for CX@Rutgers.