The Science Behind Mobile Engagement: 3 Strategies for Increasing Response Rates

By the Editors

Mobile surveys and programs can see response rates of up to 15%, far better than the 1-3% typical of more standard approaches. A new study offers an in-depth look at why, and at how businesses can leverage that information to get results in their own marketing and research campaigns.

Decipher Inc. is a market research company that took a close look at 50 of its own online survey studies from the first half of 2012 and published the results in a white paper called "Participation of Mobile Users in Traditional Online Studies." It's important to note that all 50 studies involved online surveys that participants could answer either on a PC or on a mobile device; none of the studies used surveys developed strictly for mobile users.

Still, the data set is robust, encompassing information from more than 700,000 study participants. It's also important to note where respondents came from: just over 2,000 came from panel providers, which enlist people to respond to surveys in exchange for some kind of incentive, while just over 699,000 came from customer lists. Panel samples include people who take lots of surveys and probably prefer to do them on a bigger screen for the best experience, which helps explain why only 8% of them came to the survey via a mobile device. In the customer list samples, however, nearly 17% accessed the survey on a mobile device, indicating that they weren't necessarily expecting a survey invitation and were more likely to click on it from their phones as they checked their email.

One interesting finding was that in many cases, mobile users were 1.5 times more likely to drop out of the survey without completing it. The reasons for this are twofold. First, as traditional online surveys, they were not optimized for the range of possible mobile devices and may have rendered poorly. Second, it's safe to assume that mobile users are prone to greater distraction, since they're more likely to be out and about doing other things when they open the invitation.

For users who clicked through to the survey, the panel samples show high completion rates on both a PC (82%) and a mobile device (80%), which makes sense given that these respondents have signed on to a community of survey-takers. With the customer list samples, a different picture emerges: those taking the survey on a PC had an 81% completion rate, but those taking it on a mobile device completed it only 68% of the time. That raises the natural next question: why are mobile users dropping out of surveys at such a high rate?
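For anyone who wants to run the same comparison on their own survey data, the arithmetic is simple: completion rate is completed responses divided by started responses, split by sample source and device. Below is a minimal sketch in Python; the column names and toy data are hypothetical, not taken from the Decipher study.

import pandas as pd

# Each row is one survey start; "completed" records whether it was finished.
# (Illustrative data only.)
responses = pd.DataFrame(
    [
        ("panel", "pc", True),
        ("panel", "mobile", True),
        ("customer_list", "pc", True),
        ("customer_list", "mobile", False),
        ("customer_list", "mobile", True),
    ],
    columns=["sample_source", "device", "completed"],
)

# Completion rate (%) = finished surveys / started surveys, per group.
completion = (
    responses.groupby(["sample_source", "device"])["completed"]
    .mean()
    .mul(100)
    .round(1)
)
print(completion)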

Most of the answers lie in survey design and how it renders on mobile phones. The data showed significant spikes in dropouts when users encountered a question that involved a large grid or matrix of response categories. Those kinds of questions can require horizontal scrolling, which is a definite no-no in mobile surveys, or force smartphone users to zoom in to read tiny labels and tap very small selection areas. There were also lots of mobile dropouts on the first page of surveys, though this differed greatly by device, with BlackBerry and Android users much more likely to drop out than iPhone and iPad users.
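A practical way to find these problem spots in your own surveys, sketched below on the assumption that you log the last page each abandoning respondent saw, is to cross-tabulate dropouts by page and device and look for pages where mobile abandonment is concentrated. The field names and example data here are illustrative, not from the white paper.

import pandas as pd

# Each row is one abandoned survey: the device used and the last page reached.
dropouts = pd.DataFrame(
    {
        "device": ["mobile", "mobile", "mobile", "pc", "pc", "mobile"],
        "last_page": ["q5_grid", "q5_grid", "q1_intro", "q5_grid", "q8_open", "q8_open"],
    }
)

# Pages with a disproportionate share of mobile dropouts (e.g. large grids)
# are the first candidates for redesign.
by_page = pd.crosstab(dropouts["last_page"], dropouts["device"])
by_page["mobile_share"] = by_page.get("mobile", 0) / by_page.sum(axis=1)
print(by_page.sort_values("mobile_share", ascending=False))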

Clearly, it's worth taking the time to test how your survey renders on different types of mobile devices. Mobile dropouts also spiked at open-ended questions that required typing text, which is clearly more difficult and time-consuming on a mobile device than on a PC. As texting continues to blossom, however, this will probably become less of a concern. Mobile users also took longer to complete surveys than PC users, probably a reflection of how long the pages take to render, which means it's a good idea to minimize anything (rich graphics, video, Flash, etc.) that will hog bandwidth. For mobile users, the shorter the overall survey, the better.
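One way to catch bandwidth problems before fielding a survey is to audit the page for heavy assets. The rough sketch below (not part of the study) fetches a survey page, lists the images, scripts, and media it references, and flags anything large; the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages are installed.

from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

SURVEY_URL = "https://example.com/survey/page1"  # placeholder survey page

page = requests.get(SURVEY_URL, timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

# Collect the external assets the page pulls in: images, scripts, styles, media.
raw_refs = [tag.get("src") or tag.get("href")
            for tag in soup.find_all(["img", "script", "link", "video", "source"])]
assets = [urljoin(SURVEY_URL, ref) for ref in raw_refs if ref]

# Ask the server for each asset's size and flag anything over ~200 KB,
# which will be slow to load over a typical mobile connection.
for asset in assets:
    head = requests.head(asset, timeout=10, allow_redirects=True)
    size_kb = int(head.headers.get("Content-Length", 0)) / 1024
    flag = "  <-- heavy" if size_kb > 200 else ""
    print(f"{size_kb:8.1f} KB  {asset}{flag}")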

The takeaway for boosting response rates to mobile surveys is clear: design with mobile in mind and test on as many different devices as possible to make sure the survey is clean, simple, and easy to use. And of course, offer the best incentive you can to make it worth people's while to fill it out.