
Making Surveys Mobile-Friendly, Part 3

Resources, Thought leadership / September 29, 2015
SimpsonScarborough

In parts one and two of our series, we talked about the rise of the unintended mobile participant and how your key audiences are using mobile devices to participate in online surveys. In part three of our series, we will look at a recent case study, an online survey of alumni, and the differences we saw (and did not see) between respondents using a mobile device and those using a computer to complete the survey.

Case Study: Online Survey of Alumni

Let’s look at a recent online survey of approximately 800 alumni respondents. As shown in the chart below, 36% of respondents used a mobile device to complete this particular survey.

[Chart: share of respondents completing the survey on a mobile device vs. a computer]

This was a very straightforward 20-question survey made up of multiple-choice and rating-scale questions. Median survey length was 12 minutes for respondents using a computer and 13 minutes for those using a mobile device. While that is not a huge difference, it did take mobile respondents roughly 8% longer to complete the survey than those using a computer.
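For anyone who wants to reproduce this kind of comparison, here is a minimal sketch in Python (pandas assumed; the column names and timing values are hypothetical, not this survey's data):

```python
# Minimal sketch: comparing median completion time by device.
# Column names ("device", "minutes") and values are invented for illustration.
import pandas as pd

times = pd.DataFrame({
    "device":  ["computer"] * 4 + ["mobile"] * 3,
    "minutes": [11, 12, 12, 14, 12, 13, 15],
})

medians = times.groupby("device")["minutes"].median()
pct_longer = (medians["mobile"] / medians["computer"] - 1) * 100
print(medians)
print(f"Mobile respondents took about {pct_longer:.0f}% longer (by median).")
```

Medians are less sensitive than means to a handful of respondents who leave a survey window open for a long time, which makes them a common choice for summarizing completion time.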

How did mobile respondents differ demographically from respondents using a computer?

We wanted to see if there were any differences in the demographic profile of respondents using a mobile device and those using a computer to complete the survey. The demographic differences were minimal, and in fact, only gender varied significantly—63% of mobile respondents were female, compared to 53% of respondents using a computer. But what is arguably more interesting is where we did not see any differences in respondents.

In addition to gender, we tested to see if there were any differences in employment, income, and age for mobile vs. computer respondents. There were no statistically significant differences for any of these variables. The chart below shows the age breakdown of respondents using a mobile device compared to those using a computer to complete the survey.

[Chart: age breakdown of mobile vs. computer respondents]

While mobile respondents skew slightly younger, the differences by age are not statistically significant. It is important to note that older alumni are also using mobile devices to complete online surveys: although alumni are more likely than prospective undergraduate students to use a computer, we cannot assume that younger alumni are the only ones on a mobile device. In fact, 28% of alumni age 60 or older completed the survey using a mobile device.
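For illustration only, a chi-square test of independence on a device-by-gender contingency table is one common way to check a difference like this. The Python sketch below uses made-up counts chosen to roughly match the percentages reported above (63% vs. 53% female); it is not the study's data or its exact procedure.

```python
from scipy.stats import chi2_contingency

# Rows: device; columns: gender (female, male). Counts are invented, chosen only
# so the shares roughly match the percentages reported above.
contingency = [
    [180, 106],  # mobile respondents: ~63% female
    [272, 242],  # computer respondents: ~53% female
]

chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
print("significant at alpha = 0.05" if p < 0.05 else "no significant difference")
```

The same kind of test can be applied to age bands, income ranges, or employment categories.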

How do responses and response behavior differ between mobile and computer respondents?

One concern with unintentional mobile respondents is that their responses or response behavior might differ from those of respondents using a computer. For example, when presented with a multiple-response "select all that apply" question, mobile respondents might select fewer options than computer respondents because they have to scroll more, or take more time to make selections, on a smaller screen. We tested for differences in mobile vs. computer responses for a sampling of question types, outlined in the table below. These represent some of the questions we anticipated might be more burdensome for a mobile respondent.

Question: Aided awareness of schools with an excellent academic reputation
Question type: Multiple choice, select all that apply from a list of 24 colleges and universities
Additional burden for mobile respondents: Programmed in two columns; a mobile respondent might have to hold the phone horizontally or scroll right to see all response options

Question: Statements that best describe characteristics of University X
Question type: Multiple choice, select all that apply from a list of 16 short statements (6-18 words)
Additional burden for mobile respondents: Programmed in one column; a mobile respondent might have to scroll down more than once to read all statements

Question: Rating agreement with various statements about University X
Question type: 10-point rating scale, where 1 is strongly disagree and 10 is strongly agree; 16 statements rated
Additional burden for mobile respondents: Programmed in a table where respondents might have to hold the phone horizontally or scroll right to see the full 10-point scale; respondents also have to scroll down to rate all statements

For the multiple-choice questions, we found no statistically significant differences in the responses or in the number of options selected for respondents using a mobile device vs. those using a computer. For the 10-point rating scale question, we found no differences in the number of respondents who skipped the question or in mean ratings for mobile vs. computer respondents. This was great news for us. The unintentional mobile respondents did not have a negative impact on the quality of our survey data. That has been the case in almost every survey we have conducted since we started tracking unintentional mobile respondents.
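Again for illustration only, an independent two-sample t-test is one common way to compare the two groups on the number of options selected and on mean ratings. The sketch below uses invented values and is not a description of the exact tests used in this study.

```python
from scipy.stats import ttest_ind

# Invented values, purely for illustration.
mobile_selected   = [5, 7, 4, 6, 8, 5, 6]      # options picked in a select-all question
computer_selected = [6, 5, 7, 6, 4, 8, 7, 5]

mobile_ratings   = [7, 8, 6, 9, 7, 8]           # 10-point agreement ratings
computer_ratings = [8, 7, 7, 9, 6, 8, 7]

for label, a, b in [
    ("options selected", mobile_selected, computer_selected),
    ("agreement rating", mobile_ratings, computer_ratings),
]:
    t, p = ttest_ind(a, b)
    verdict = "significant" if p < 0.05 else "no significant difference"
    print(f"{label}: t = {t:.2f}, p = {p:.3f} ({verdict} at alpha = 0.05)")
```

A nonparametric alternative such as the Mann-Whitney U test would be a reasonable swap if the rating distributions are far from normal.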

So what happens when we do see a difference for mobile respondents?

In this case study, we did not see any differences in how mobile respondents responded to the survey compared to respondents using a computer. But what happens when we do see a difference? In a separate recent study conducted by SimpsonScarborough, we noticed some differences between mobile and computer respondents for the question below, which required respondents to view a video.

Please view the video below: [embedded video]

How does the video influence your impression of University X?

  1. Very negatively
  2. Somewhat negatively
  3. Does not affect my impression
  4. Somewhat positively
  5. Very positively

When we took a closer look at these responses, we found that a number of mobile respondents noted in the follow-up open-end question that they were unable to view the video. We had tested the video capability on both Apple and Android mobile devices, so how could this have happened?

Luckily, we had embedded metadata in our survey that showed us the operating system each respondent was using, and we found that these respondents were on outdated operating systems. That brings up another interesting element of the unintentional mobile respondent issue: we have no control over whether people keep their phones updated. In the rare cases where we use a video or audio file, this can be an issue.
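For readers curious how this kind of device and operating-system metadata can be derived, one common source is the browser's user-agent string, which most survey platforms can capture. The sketch below uses a few rough regex heuristics to guess device type and OS version; it is illustrative only and not the method used in this study.

```python
import re

def describe_device(user_agent: str) -> dict:
    """Return a rough device/OS guess from a user-agent string (heuristic only)."""
    ua = user_agent.lower()
    is_mobile = any(token in ua for token in ("mobile", "iphone", "ipad", "android"))

    os_family, os_version = "unknown", None
    m = re.search(r"android (\d+(?:\.\d+)*)", ua)
    if m:
        os_family, os_version = "Android", m.group(1)
    m = re.search(r"iphone os (\d+(?:_\d+)*)", ua)
    if m:
        os_family, os_version = "iOS", m.group(1).replace("_", ".")

    return {"is_mobile": is_mobile, "os_family": os_family, "os_version": os_version}

# Example: an outdated Android build that might not play an embedded video
print(describe_device(
    "Mozilla/5.0 (Linux; U; Android 2.3.6; en-us) AppleWebKit/533.1 Mobile Safari/533.1"
))
```

In practice, a maintained user-agent parsing library is a better choice than hand-rolled patterns, since user-agent formats change frequently.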

So we know the “why,” but what did we do about it?

We found that some of the respondents who could not view the video answered the rating question anyway, and since they had not seen the video, those responses were not usable. Because we could not assume that everyone who was unable to view the video said so in the open end, we chose to report on non-mobile respondents only for this question. Luckily, we had a large sample size and a large number of respondents who used a computer, so this was not a problem. But we learned an important lesson: from this point forward, when using video in a survey, we will include a response option that allows respondents to indicate that they were unable to view the video.
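As a minimal illustration of that reporting workaround (restricting the video question to computer respondents), a pandas sketch might look like this; the column names and values are hypothetical:

```python
import pandas as pd

# Hypothetical column names and invented values.
responses = pd.DataFrame({
    "device":           ["mobile", "computer", "computer", "mobile", "computer"],
    "video_impression": [4, 5, 3, None, 4],   # 1-5 scale from the question above
})

# Report the video-impression question for computer respondents only
computer_only = responses.loc[responses["device"] == "computer", "video_impression"]
print(computer_only.describe())
```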

Conclusion

More and more respondents are choosing to use mobile devices to take surveys we intended to be taken on a computer. We started to track mobile usage on our online surveys so that we could have a better idea of how our respondents are using mobile devices and how this usage affects the quality of our survey data. Fortunately, we have found very little, if any, impact. But it’s not something we can ignore. We will continue to monitor mobile survey takers and strive to design mobile-friendly surveys that ease the burden on these respondents.
