It’s natural for researchers, fieldwork managers, sample managers and sample providers to want the highest possible completion rate for every online survey. And while a certain number of dropouts is to be expected, alarm bells can start ringing when the percentage appears to be too high.
If there’s no obvious explanation, such as the survey being very time-consuming or poorly presented, the reason may be a technical one. Having investigated a number of reported survey dropout cases at NIPO, we have identified five likely technical causes, along with ways to mitigate them.
If you’re concerned about overly high dropouts in any of your surveys, it could be worth looking into these possible causes.
#1 Bots hitting on ‘anonymous link’ studies
When an ‘anonymous link’ to a survey is shared on a social network, allowing anyone who finds it to enter the survey, you can expect bots to pick up on it and ‘give it a try’. We have observed surveys with literally millions of interview starts that can be attributed to bots rather than human respondents.
These starts can be recognized in your survey data, as there will be no answers to any of the survey questions. You may still see automatically generated ‘respondent data’, such as device detection, which has been run before the first question is shown. But no data other than that.
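The pattern described above can be flagged in an exported data set with a simple filter. The sketch below is illustrative only; the field names (`device`, `answers`) are hypothetical and do not reflect Nfield’s actual export format.

```python
# Sketch: flag interview starts that look like bots - device data was
# captured, but not a single survey question was answered.
# Field names ('device', 'answers') are hypothetical, not Nfield's format.
def is_likely_bot_start(record):
    """True when the record has no answers to any survey question."""
    return len(record.get("answers", {})) == 0

records = [
    {"id": 1, "device": "Chrome/Windows", "answers": {}},         # bot-like
    {"id": 2, "device": "Safari/iOS", "answers": {"q1": "Yes"}},  # human
]
bot_ids = [r["id"] for r in records if is_likely_bot_start(r)]
print(bot_ids)  # [1]
```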
How to prevent it
Add a simple landing page between the link and the actual interview start. All it needs to have on it is some text that introduces the survey and asks for confirmation to continue, with a button that brings the respondent into the actual interview.
When tested on the previously mentioned cases with millions of dropouts, suspected of being caused by bots starting the interviews, this simple trick lowered the dropout rate from over 99.9% to below 10%. From this, we concluded that most of the bots causing the high dropout rates do not follow up (by ‘clicking the button’) after the initial request.
It’s understandable that you may feel hesitant about adding ‘yet another screen’ to the survey. But if it already has an introduction page, on which the respondent is only required to click a button, why not simply move this page up the order, as described above?
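The improvement reported above is easy to monitor yourself as a simple dropout-rate calculation. The numbers below are hypothetical, chosen only to be in the spirit of the case described.

```python
def dropout_rate(starts, completes):
    """Share of started interviews that were never finished."""
    return (starts - completes) / starts if starts else 0.0

# Hypothetical figures: millions of bot starts with only a handful of
# completes before the landing page, and a normal pattern after it.
before = dropout_rate(2_000_000, 1_500)
after = dropout_rate(1_600, 1_480)
print(f"{before:.2%} -> {after:.2%}")
```

Recomputing this rate after adding the landing page makes it obvious whether the dropouts were bot-driven.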
Why ‘non-legitimate’ interview starts matter
Of course, there are also bots which are smart enough to follow up on some questions and provide pseudo answers. When that is the case, the simple landing page solution will not work.
#2 Bots hitting on personalized link studies
Even if you don’t share your survey link via a social network, but invite all your respondents personally, providing each with a link containing a unique Respondent Key, bots can still find these links and ‘give them a try’. Some smart bots even make up Respondent Keys that they then try.
How to prevent it
As in example #1 (Bots hitting on ‘anonymous link’ studies), a landing page can help. But if the bot is somewhat smart, this alone may not resolve the issue.
If you observe your survey being ‘polluted’ with Respondent Keys you did not hand out, you may want to set the ‘Allow only known respondents’ setting to True. With this option set, Nfield will only allow respondents with Respondent Keys that have been uploaded into the survey sample table.
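Conceptually, the ‘Allow only known respondents’ setting amounts to the check sketched below. This is not Nfield’s implementation, just an illustration of the idea: any Respondent Key that was not uploaded to the survey’s sample table is refused.

```python
# Illustrative sketch of what 'Allow only known respondents' means:
# refuse any Respondent Key that was not uploaded to the sample table.
# (Not Nfield's implementation - just the idea behind the setting.)
uploaded_keys = {"A1B2C3", "D4E5F6", "G7H8I9"}  # keys you handed out

def admit(respondent_key):
    """True only for Respondent Keys present in the uploaded sample."""
    return respondent_key in uploaded_keys

print(admit("A1B2C3"))  # True: an invited respondent gets in
print(admit("ZZZZZZ"))  # False: a made-up key is refused
```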
#3 Email service providers checking links
If you share personalized links (containing unique Respondent Keys) via email to specific respondents, dropouts can also be caused by the respondents’ email service providers checking those links as part of their automated security processes. It is common for email service providers to follow links in emails sent to their customers to check that they don’t trigger malicious actions. We know, for example, that Google, Hotmail and Yahoo do this to protect their users.
Following these links often means opening them, which causes the interview to be started. The email service provider will conclude that the link is safe, but until the email recipient (i.e. the actual respondent) also clicks the link, Nfield will count the interview as a dropout, because it was started but not finished.
How to tell if this is the cause of your dropouts
Take a look at the times these unfinished interviews were started. If you see many dropouts at around the time, or shortly after, you sent out your email invites, this is a good indication.
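This timing check can be sketched as follows. The timestamps and the 30-minute window are made up for illustration; tune the window to your own mailing.

```python
from datetime import datetime, timedelta

# Sketch: do dropouts cluster right after the invite mailing went out?
# A spike there suggests email scanners, not real respondents.
invite_sent = datetime(2022, 6, 1, 9, 0)
window = timedelta(minutes=30)  # what counts as "shortly after"

dropout_starts = [
    datetime(2022, 6, 1, 9, 2),    # right after the mailing
    datetime(2022, 6, 1, 9, 5),    # right after the mailing
    datetime(2022, 6, 1, 14, 30),  # hours later
]
suspect = [t for t in dropout_starts
           if invite_sent <= t <= invite_sent + window]
print(len(suspect), "of", len(dropout_starts),
      "dropouts started within the window")
```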
#4 Failing or slow external API calls
Before respondents are even shown their first question, many online surveys begin by automatically running processes to establish the respondent’s device and browser (to determine how the questions should be rendered) and to verify that the link that started the interview has not been tampered with.
These processes often involve Nfield interviewing making API calls to services external to Nfield. These calls are not always serviced as promptly as they should be, especially when the load is high. When an API request takes too long, Nfield gives up on it and times out. Without the response necessary for the interview to start, the respondent also gives up and closes the browser.
Sometimes these API requests simply fail. If the questionnaire script takes no appropriate measures to deal with a failing request, it can run into an error. When the respondent sees this, they drop out.
#5 An issue in Nfield itself
A less frequent cause of high dropout rates can be an issue in Nfield itself. NIPO mitigates such issues once they have been reported or recognized. Depending on the severity, hotfixes can be put in place.
Consider each of the first four possible causes described above, and see if you observe anything that indicates any of them as the likely cause. Then take the suggested action.
If, after doing this, you still think what you see is the result of a bug in Nfield, create a support ticket. Share as much detail as possible.
© 2022 NIPO