We’re knee deep in a fairly significant research project for our client, Red e App. It’s an internal communications platform that serves enterprise companies that have large numbers of hourly and non-desk workers. It turns out that more than 50 percent of hourly workers don’t have company email, so internal communications is often a challenge.
But Red e App wanted to know more about the hourly worker’s experience, so we, along with Edison Research, are in the process of discovery.
The process reminded me how different research is when it’s done by professionals rather than thrown together by brands that think they understand market research enough to cut corners on the investment.
Ponder this question for a moment, and be honest with yourself: How many times have you created and launched a questionnaire using Google Docs or SurveyMonkey? Those are surveys. They are not research.
There are three core differences, from my perspective. Each one individually, and certainly all three together, creates the value of working with a credible research firm to survey consumers or a target audience. Keep them in mind the next time you want to survey a group of people.
Surveys ask questions; research asks the right questions.
As much as I hate to break it to you, unless you are a trained market researcher, you very likely don’t write good survey questions. I know, it’s unfathomable, right? But it’s true.
People are more or less honest depending on how you ask. They answer with the truth, or with their projection of the truth, based on how the question is worded. In short, the way you write a question may completely invalidate the insights you think you've discovered.
Think about how you would answer the following:
- How helpful was your customer service call?
- How frustrating was your customer service call?
- How would you rate your satisfaction with this customer service call?
The first two completely bias the answer set. There’s a method to the madness. And you probably don’t know the method.
Surveys are random; research is representative.
You may think you can go out and ask 250, 500 or even 1,000 people your questions and claim it is representative of the U.S. population or the demographic you intend to query, but unless you’re a real market researcher, you’re just fooling yourself. Real market researchers make sure that the percentage of males and females, various races, income levels and the like match the percentage represented in the broad population you’re surveying. They are truly representative samples, not just assumptions you’ve made.
Performing this “weighting” of the data means you won’t survey exactly 500 people. You’ll most likely wind up with a number like 547 or similar. You have to keep surveying respondents across genders, races and the like until your sample matches the population. And it’s hard, tedious work to get it right.
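To make the idea concrete, here is a minimal sketch of post-stratification weighting in Python. The demographic shares and counts are invented for illustration, not real census or survey figures; real firms weight across many variables at once, not just one.

```python
# Post-stratification sketch: each respondent's answers are weighted by
# how under- or over-represented their group is in the sample versus
# the target population. All numbers below are illustrative.

population_share = {"male": 0.49, "female": 0.51}   # target shares
sample_counts = {"male": 300, "female": 200}        # who actually answered

n = sum(sample_counts.values())
weights = {
    group: population_share[group] / (count / n)
    for group, count in sample_counts.items()
}

# Men were over-sampled here (60% of respondents vs. 49% of the
# population), so each male response counts for a bit less than 1.
print(weights)
```

A weight above 1 boosts an under-represented group; a weight below 1 shrinks an over-represented one, so the weighted totals mirror the population.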
Surveys answer a series of questions; research answers them by factorials.
Remember factorials from math class? Five factorial, for example, is 5×4×3×2×1. When you do a survey with your little monkey, you get one answer set for every question you ask. A market researcher cross-tabulates the data, giving you each question’s answers broken down by every other question you asked.
What that means is if I ask for gender, age, income and education level, I not only get those answers as percentages of my total, but I also get gender broken down by age, income and education level; age broken down by gender, income and education level; income level broken down by gender, age and education level, and so on.
This becomes infinitely powerful when you ask questions like, “How satisfied are you with your current job?” and can cross-tab that with questions like, “How often does your supervisor communicate with you?” or “How strongly do you feel your input is valued by your supervisor?” And then you can break that down by industry, level of experience, income, gender, age and so on.
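Cross-tabulation itself is simple to picture in code. Here is a toy sketch in plain Python with invented responses (not data from the Red e App study): a one-way count is what a basic survey tool gives you, while the cross-tab breaks one question down by another.

```python
from collections import Counter

# Invented toy responses, for illustration only.
responses = [
    {"gender": "female", "satisfied": "yes"},
    {"gender": "female", "satisfied": "no"},
    {"gender": "male",   "satisfied": "yes"},
    {"gender": "male",   "satisfied": "yes"},
]

# One-way totals: the single answer set a basic survey tool reports.
totals = Counter(r["satisfied"] for r in responses)

# Cross-tab: satisfaction broken down by gender.
crosstab = Counter((r["gender"], r["satisfied"]) for r in responses)

print(totals)    # overall yes/no counts
print(crosstab)  # yes/no counts per gender
```

With real data you would repeat this for every pair (or triple) of questions, which is where the answer sets multiply the way the factorial analogy suggests.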
Self-administered surveys most certainly serve a purpose. They can be very cheap, easy and fast to implement, and they can give you good information depending on how many people answer and what kind of questions you ask. But when you really want to dig for consumer insights, you need more than a monkey to make that happen.
Think of an aspiring singer. Let’s call her Jenny. Should she record her demo with an iPhone or a recording studio? That’s the difference.