Online Survey Panels: What Is Going Wrong?
Online survey panels were once an innovative game-changer for the market research industry. They introduced a new cost-effective and efficient way to gather consumer insights. In an era where data drives decisions, they held great promise, providing access to large and diverse groups of consumers at the click of a button. Over the years, however, the quality of these panels has deteriorated so much that the initial benefits are now overshadowed by the laughable quality of the data. Does this mean online survey panels are becoming irrelevant and obsolete?
The Decline of Online Survey Panels:
You might be thinking I’m being provocative or biased here. And that would be understandable, until you look at the stats…
From Promise to Problem: Tracing the Downfall
A recent study was conducted among panel participants from five of the ten largest online survey panel providers. It revealed that almost half (46%) of respondents who completed the survey had to be removed because they failed multiple quality-control measures, such as incoherent verbatim responses or speed-throughs. The issues were so evident that the responses clearly had to be discarded.
Another study, conducted by Kantar, has found that researchers discard an average of 38% of the data they collect due to quality concerns and panel fraud, with some reporting having to discard up to 70% of their data!
The fact that solutions like Data Scrubbers exist is a testament to how bad this problem is. Bad data is rampant, and it is only getting worse with the addition of bots and AI saturating the industry. In fact, this problem has created and sustains an entirely separate industry dedicated to removing bad respondents – after the study!
And this assumes the data scrubbers can find all of the bad and fraudulent respondents and effectively clean them out. Rep Data has found that only one third of fraudulent responses are caught by traditional data cleaning methods – giving rise to the ‘fraud mirage’, or the ‘garbage in, garbage out’ era we’re living in. We’re paying to collect bad data, and we can’t even be sure we’re cleaning it all out. How can we say with a clear conscience that the insights we deliver to end clients are good when the root source of the data is so muddy? We’re attempting to produce treasure from mire.
researchers discard an average of 38% of the data they collect due to quality concerns and panel fraud, with some reporting having to discard up to 70% of their data!
The Impact of Poor Data Quality:
The result – massively biased survey results. A study conducted by Ron Sellers revealed a 287% inflation gap for brand awareness between ‘bogus’ respondents and ‘valid’ respondents. This is the type of skewed research that in many cases provides insights that influence ‘data-driven’ decisions for multi-million dollar companies.
The Core Issues at Play with Online Survey Panels:
Eternal Trade-offs: The Quest for Quantity Over Quality
Consumer research panels are under constant pressure to deliver data faster and cheaper, and this does nothing to help them foster quality. Research suppliers and clients face limited budgets but are still expected to deliver high-quality insights. Demanding more from panel companies has gotten them nowhere. In fact, panel companies take no responsibility for the quality of their panelists. They deliver as quickly as they can, pay these people little to nothing, and don’t seem to care that retention rates are declining rapidly. The panel companies are not the ones hurt in these situations; it’s the research agencies and clients that are put under immense pressure. They in turn try to make the best of it by adding data quality questions, using fraud tools, and embracing rigorous cleaning processes on the back end. But the moral of the story is that all sides are being pushed to choose quantity and speed over quality.
The Rise of Sophisticated Fraud: Bots and Survey Farms
Survey farms are getting harder to detect and are growing by the day. Online articles and blogs are even teaching people how to cheat the system, with the infamous example of Paid For Your Say leading the way. They teach participants exactly what to do to “shape-shift” into whatever is needed for acceptance into a study with online panels.
The increasing use of AI bots built on large language models is adding to the challenge of detecting and preventing fraudulent behavior. Studies have demonstrated how hard it is to tell human and AI writing apart. One peer-reviewed study found that experts correctly identified AI writing in only 39% of cases. Another evaluation of AI detection tools revealed that these tools frequently produce false positives and false negatives, failing to meet their own accuracy claims. Even OpenAI says AI detectors do not work.
Digital Advertising Buys Misleading Panel Recruitment Efforts
Many panel companies today buy ads programmatically, meaning the ads are placed by algorithms to reach audiences in the most lucrative spaces. This seems smart on the surface, but look at what it actually means and it is quite scary for the quality of these panels. Google ‘how to cheat surveys’ or ‘the best survey bot programs’ and you will find plenty of articles and blogs. And on those articles and blogs, you are likely to see some of the biggest panel companies’ programmatic ads popping up.
The Plight of the Respondent: Low Pay and Long Surveys
A quick look at the low incentive rates and average length of surveys in our industry tells us all we need to know about why survey data quality is going through the floor.
Incentives:
If we are being honest, researchers themselves are partly to blame for this. I have seen projects pushed through with an LOI of 25 minutes or more, while clients expect to pay a maximum of $1.50 per survey response. Yet when asked, none of those researchers said they would take the survey for that incentive themselves. For some reason we have it in our heads that ‘normal’ people will do what we ourselves wouldn’t consider for a second. When we pay people next to nothing for their time, how do we expect quality responses, let alone a ‘representative’ sample? The representative portion of the population feels far too undervalued to take part in something so poorly compensated. Of course, fraudulent respondents are moving in to cheat the systems and fill the gap.
When we pay people next to nothing for their time, how do we expect quality responses, let alone a ‘representative’ sample?
Survey Length:
Survey research has been fixated for years on stuffing as many questions as possible into each survey, leading to extremely long questionnaires; 15-minute surveys are now considered ‘short’. Tech titans such as TikTok and Meta have spent years optimizing their apps to capture people’s attention, yet it still amazes me that researchers expect to compete for that attention and keep the average participant focused on a survey for longer than ten minutes. Compared to Insta reels, surveys could not be more boring.
SurveyMonkey has shown that for surveys longer than 30 questions (~10 minutes), respondents spend half as much time per question as they do on surveys with fewer than 30 questions. The average human attention span is now 8.25 seconds – down 4.25 seconds from the year 2000 (no doubt in part due to the likes of Meta and TikTok). By the time a person is 10 minutes into a survey, they are simply skimming the questions and giving the first ‘satisficing’ response they find. Quality left the chat by question 10.
The average human attention span is now 8.25 seconds – down 4.25 seconds from the year 2000
So … What Are Our Solutions??
Deal with it … Become Obsolete
We always have the option not to change and simply live with the issues at hand. But we think this is the absolute wrong answer, the one that will eventually put you out of business, rendering your data and insights garbage and you irrelevant and obsolete. Instead, we recommend embracing a combination of the following solutions in your research.
Embracing Behavioral Data: A Genuine Path to Authentic Insights
Our answer to these problems is what we call behavioral panels: market research panels that combine survey data with behavioral data to get the best of both worlds. Respondents opt in to the consumer panel and grant access to their digital behaviors and location data (collected through a meter). In addition, respondents can be reached for surveys, targeted based on the behaviors they exhibit or triggered when a certain event happens, e.g., when a panelist is exposed to a selected ad on social media.
Your run-of-the-mill questionnaire design, with its traps and measures to determine whether a respondent is valid, is not enough. Again, Rep Data shows only one third of bad data is caught. And with generative AI answering some of the questions, your standard open-end analysis won’t cut it either.
The use of behavioral data integrated with surveys is a big source of hope for us. Unlike traditional survey methods, which often rely on self-reported data that can be easily manipulated or inaccurately recalled, behavioral data offers a more objective and reliable source of information. It is derived from actual user activities, such as web browsing patterns, app usage, shopping behavior, and media consumption, providing a real-time, holistic view of a respondent’s actions and preferences.
Ensure the people you are talking to are real people:
Because behavioral panels collect actual device-level information on consumers’ behaviors, fraudulent activity can be identified immediately. Panelists who use survey bot tools, apps, or websites are automatically removed. Panelists who search for how to cheat surveys or how to automate survey open-ends are removed. By the time behavioral panelists reach your survey, you at least know they are real people. That said, a behavioral panel doesn’t do this by default; make sure these fraud processes are actually being followed.
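To make the idea concrete, here is a minimal sketch of what such a screen might look like. The app names, log fields, and search phrases are invented for illustration; a real panel provider’s pipeline would look different:

```python
# Hypothetical sketch of a behavioral fraud screen. The log fields,
# app names, and search phrases below are illustrative assumptions,
# not any panel provider's real pipeline.

SUSPECT_APPS = {"survey_bot_pro", "auto_answer"}           # assumed names
SUSPECT_SEARCHES = ("how to cheat surveys", "survey bot")  # assumed phrases

def is_fraudulent(panelist):
    """Flag a panelist whose metered device data shows bot tools or cheat research."""
    if SUSPECT_APPS & set(panelist.get("apps_used", [])):
        return True
    return any(phrase in search.lower()
               for search in panelist.get("searches", [])
               for phrase in SUSPECT_SEARCHES)

panel = [
    {"id": 1, "apps_used": ["instagram"], "searches": ["best running shoes"]},
    {"id": 2, "apps_used": ["survey_bot_pro"], "searches": []},
    {"id": 3, "apps_used": [], "searches": ["How to cheat surveys fast"]},
]
clean_panel = [p for p in panel if not is_fraudulent(p)]  # keeps only panelist 1
```

The key point is that the screen runs on observed device data before anyone reaches a survey, rather than on answers given inside it.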
Target the people you want to take your survey:
By integrating behavioral data into survey research, researchers can significantly enhance the screening process for consumer panel participants. Because genuine respondents can be identified and targeted by their digital footprints, you can effectively weed out participants who select everything or lie about what they do in order to qualify. Typically you can catch these people by asking something specific about the topic of the study, but why not keep them out of the survey from the start? For example, say you want to talk to people who are engaging with a specific mobile game app – done. Behavioral data lets you target the individuals who are spending time daily or weekly in that app, and in turn your responses and feedback on the app and its functionality will be spot on, because you can feel confident these are the right people. As another example, if a study is targeting individuals interested in health and fitness, researchers can verify participants’ eligibility by analyzing their interaction with fitness apps, visits to health-related websites, and online purchases of fitness equipment or health supplements. For surveys looking for category shoppers – say, people shopping for electric razors – you can identify people by what they have shopped for on Amazon or by visits to websites owned by brands in the category. Bots cannot mimic this type of online and location-based behavior. This method also ensures proper targeting of participants who truly fit the demographic or segment of interest, leading to more accurate and insightful research outcomes.
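At its core, behavior-based targeting boils down to filtering on observed usage rather than a screener answer. A rough sketch, where the field names and the sessions-per-week threshold are assumptions for illustration:

```python
# Hypothetical sketch of behavior-based targeting. The field names and
# the three-sessions-per-week threshold are illustrative assumptions.

def target_by_behavior(panelists, app, min_weekly_sessions=3):
    """Select panelists whose metered data shows real engagement with `app`,
    rather than trusting a self-reported screener answer."""
    return [p for p in panelists
            if p.get("weekly_sessions", {}).get(app, 0) >= min_weekly_sessions]

panelists = [
    {"id": "a", "weekly_sessions": {"fit_tracker": 5}},  # genuine engaged user
    {"id": "b", "weekly_sessions": {"fit_tracker": 1}},  # barely engaged
    {"id": "c", "weekly_sessions": {}},                  # would have to lie to qualify
]
sample = target_by_behavior(panelists, "fit_tracker")  # only panelist "a" qualifies
```

Panelist “c” could claim daily use in a screener; the metered data says otherwise, so they never see the invite.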
Reduce the questions you ask but collect more data?
How, you might ask? When you leverage a behavioral panel, you can append panelists’ behavioral data to your surveys, so you don’t have to ask them to recall those behaviors in the survey. This could include which social media apps they use, which news sites they read, their hobbies and interests, how much time they spend on TikTok each day, which products they were shopping for, which ads they were exposed to, and more. All of these behavioral variables can be combined with attitudinal and perception survey data points. And by analyzing what people ACTUALLY do instead of what they say they do, you can ensure you are getting higher-quality, more accurate data. People can’t reliably recall what they do or how much time they spend doing it; while some answers are closer to reality than others, we’ve seen reported behavior miss reality by as much as 80%, even when respondents are given a wide range to choose from. Think of behavioral data in this use case as similar to a third-party data append, minus the need for a PII match and minus the modeled and inferred portion of the data. Behavioral enrichment of a survey provides deterministic, directly observed data points that can easily be analyzed alongside your survey data.
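In code terms, this enrichment is just a join on panelist id. The sketch below uses invented field names to illustrate the idea:

```python
# Hypothetical sketch of behavioral enrichment: joining metered behavior
# onto survey responses by panelist id, so respondents never have to
# recall the behavior themselves. All field names are illustrative.

metered_behavior = {
    101: {"tiktok_mins_per_day": 74, "news_sites": ["site_a", "site_b"]},
    102: {"tiktok_mins_per_day": 12, "news_sites": []},
}

survey_responses = [
    {"panelist_id": 101, "brand_perception": "positive"},
    {"panelist_id": 102, "brand_perception": "neutral"},
]

# One row per response, with observed (not recalled) behavior appended.
enriched = [{**row, **metered_behavior.get(row["panelist_id"], {})}
            for row in survey_responses]
```

No PII match is needed because both sides of the join live inside the panel, keyed by the same internal id.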
Shortening Surveys: Enhancing Engagement and Accuracy
In addition to leveraging behavioral data, there are traditional practices that must be embraced too. We need to make sure the data we collect via survey questions is high quality, and eliminating respondent fatigue is a necessary step in this process. While 10 minutes is still perceived as the gold standard for survey length, a study conducted by Tap Research showed that the abandonment rate for a 10-minute survey rose from 23% in 2017 to 53% in 2022. It is time to recognize that today’s respondent has a much shorter and less tolerant attention span.
Researchers need to abandon their inclinations towards 20-30-minute-long surveys. Remember behavioral data can help with this but it’s not the only answer.
A few personal tips for shorter, more engaging surveys include:
- Leverage behavioral data variables as much as possible
- Keep your questions focused on the main objective of the study
- Consider modules or shorter surveys fielded over a period of time instead of one long survey
- Include data quality questions but don’t trick respondents – that’s just not fair
- Always include at least one open-end question focused on the topic of the study to help you verify that respondents know the category and are paying attention
- Don’t re-ask questions in various ways – determine the best way to ask and the best question for your analysis
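The data quality checks mentioned above can also run as a simple back-end pass. A minimal sketch, where the one-third-of-median cutoff and the grid-length threshold are illustrative assumptions rather than industry standards:

```python
# Hypothetical sketch of a back-end quality pass flagging speeders and
# straight-liners. The one-third-of-median cutoff and grid-length
# threshold are illustrative assumptions, not industry standards.

def flag_low_quality(response, median_seconds):
    """Flag a response completed in under a third of the median time
    (speeder) or with identical answers across a rating grid (straight-liner)."""
    speeder = response["seconds"] < median_seconds / 3
    grid = response["grid_answers"]
    straight_liner = len(grid) > 3 and len(set(grid)) == 1
    return speeder or straight_liner

responses = [
    {"id": 1, "seconds": 540, "grid_answers": [4, 2, 5, 3, 4]},
    {"id": 2, "seconds": 95,  "grid_answers": [3, 4, 2, 5, 1]},  # speeder
    {"id": 3, "seconds": 600, "grid_answers": [3, 3, 3, 3, 3]},  # straight-liner
]
flagged = [r["id"] for r in responses if flag_low_quality(r, median_seconds=500)]
```

Checks like these complement, rather than replace, keeping bad respondents out at the recruitment stage.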
Fair Compensation: Respecting and Valuing Panelists
A lot of our quality problems stem from the fact that many market research panels treat their panelists like numbers, not like actual people. Paying people more motivates them to work harder; the inverse is also true. When survey takers are answering your questions and still ending up below minimum wage for the time they put in, of course the data is going to be garbage. The people willing to take surveys for pennies generally do not accurately represent the overall population, and they are not going to be motivated to provide thoughtful answers. It is unreasonable to expect a respondent to take part in the kinds of studies being run every day for the incentives on offer, yet this is exactly what is happening.
The simple conclusion is that we need to pay people fairly for the time that they put in. It’s as simple as that.
Conclusion:
As we stand at the crossroads of technological innovation and market research methodologies, the urgency to address the deteriorating quality of online survey panels cannot be overstated. The emergence of sophisticated fraud, misaligned incentives, and the undervaluing of respondent contributions have eroded the foundation upon which the credibility of survey panels rests. Yet, within this challenge lies an opportunity—a chance to revolutionize the way we gather, analyze, and interpret consumer insights. The adoption of behaviorally-enriched surveys and other advanced methodologies is key to ensuring the relevance and efficacy of market research in the digital age. The path forward is clear: to safeguard the future of market research, we must commit to a paradigm that values quality, integrity, and respect for the respondent. Only then can we unlock the true potential of survey panels as a tool for insightful, actionable, and reliable market research.
About Qrious Truths:
Qrious Truths: Behaviorally-Enriched Surveys
Our team has been pioneering behaviorally-enriched surveys for the past 3 years to build one of the most comprehensive Behavioral Data Networks on the market. We collect the digital actions people are taking from our data network of over 100,000 opted-in US consumers, including web, app, shopping, media, social ad exposure, and location data. The core of our data collection at Qrious is our proprietary ValueMe App, which passively gathers first-party behavioral data to provide insights into consumer habits and boasts a 4.5 star rating with over 3,000 reviews and 50,000 downloads.
At Qrious, we believe that the best insights come from understanding observed behaviors. What people actually do, not what they say they do or think they do. That’s why we have created Qrious Truths, our behavioral-enhanced surveys – supercharging your survey precision and targeting.
What if I am a panel provider?
We also help other panel companies evolve into behavioral panels using our technology. If this is of interest to you, request access here.