The Impact of Mobile Devices in Survey Responses

Resources, Thought leadership / January 16, 2018
SimpsonScarborough

Back in August 2015, we started a blog series about our attempts to better monitor mobile device usage for online surveys and its impact on data quality, response time, and respondent experience. Not surprisingly, we found that prospective undergraduate students were the most likely of all the audiences we study to use mobile devices to take surveys. What did shock us was the extent to which this was true: prospective undergraduate students were taking our surveys on mobile devices, such as phones and tablets, more often than on computers. To ring in the new year, we thought it would be a good time to revisit the topic and see how things have changed for the prospective undergraduate student audience over the past two years.

We looked at data from recent surveys of undergraduate prospects and compared it to the data from 2015. It probably won't surprise anyone that mobile device usage for online surveys has increased by 16%.

Why do we care?

When we first started digging into the impact of mobile device usage on online surveys, we worried that taking our surveys on such a small screen would increase the burden on respondents, lengthen the time it takes to complete a survey, or hurt the quality of the data we collect. With constant improvements and advancements in mobile phones, however, as well as efforts by Qualtrics, our survey programming tool, to enhance usability, the burden on respondents has been less of a concern than we had feared.

What's more, again thanks to advancements in technology, mobile has not increased the length of time it takes respondents to complete surveys, either. In fact, in some cases a mobile device makes a survey easier to take, and we've found that mobile respondents often complete surveys faster than those taking them on a computer.

So what about data quality?

Data quality is always top of mind in everything we do. Every waking hour we are thinking about the quality of our data and what could impact it. Sometimes we find data quality concerns popping up in our dreams (not totally joking here). So regardless of how well things are going, we still worry about the impact of mobile respondents on the quality of our data, constantly asking ourselves questions such as:

  • Are mobile respondents more passive survey-takers?
  • Are mobile respondents actively thinking about the questions they are answering at the same level as those taking the survey on a computer?
  • Are there differences in the way mobile respondents answer survey questions? Are they selecting fewer responses? Are they answering open-ended questions with the same level of thought (or at all)?

The answer to each of these questions is yes and no. We do see some differences in how mobile respondents participate in surveys, but they are small. In recent replication studies we have completed, we have found that some of these trends apply to both mobile and computer respondents: both groups are selecting fewer responses and taking less time with the surveys in general. We are getting good-quality data from our surveys, but sometimes it looks a little different from the data we got five years ago, not because of real change but because of changes in the way respondents are participating in the surveys. Because of this, we have started reporting some benchmarking questions a little differently than simply looking at Year 1 vs. Year 2 percentage change. We will explore this, and how we account for changes in survey respondent behavior when analyzing replication data, in a future newsletter.

 
