Fieldwork Tips: The long and short of it – Avoiding the negative effects of lengthy surveys

Posted on 11th November 2015 by YouthSight


By Charlotte Woods, Account Manager, Panel Services team

Recently we blogged about how to avoid the pitfalls of annoying your survey respondents – top 6 pet peeves. We noted that overly long surveys were the single biggest bugbear for our members.

We thought it would be worthwhile to explore the issue of over-long surveys in greater detail, to look at the impact and implications of stretching respondents’ attention spans to snapping point, and to offer some advice based on the feedback we receive from our panellists.

We took a look at two recent surveys of varying lengths that had been scripted and hosted on our mobile-friendly platform. We compared the respondent open feedback and star ratings (we ask for both at the end of every survey we host), speed of fieldwork, straight-lining and mid-survey drop-out levels. Here’s what we found:

[Table: long vs short survey comparison]

Our analysis is of course not scientific (it’s based on a sample of two!) but rather an illustrative view of how survey length can affect the outcome of your research project. It’s worth noting that the topic and question structure of the long and short survey are not the same, which can also have an impact. Nonetheless, we see similar effects on most surveys over 20 minutes.

Feedback

The truth always comes through in feedback, which is why we collect and analyse it. From the beginning to the end of fieldwork, we read panellist feedback to check for any issues respondents might be experiencing, from questionnaire design through to technical problems.

 Long Survey:

“[it] went on way too long …I lost interest” (Sophie, The Student Panel)

“quite long to give detailed and decent answers” (Jake, The Student Panel)

Short Survey:

“It was not too long which made it easier to answer. Questions were clear…” (Will, The Student Panel)

“Clear and concise, easy to use and complete” (Tara, The Student Panel)

Respondent comments from the long vs short survey highlight the frustrations respondents feel when completing lengthier questionnaires (and, conversely, their satisfaction with short ones). These sentiments were repeated throughout the open feedback for both surveys. When we coded the open feedback in both surveys (as in the table above), we found three times as much positive feedback in the shorter survey as in the longer one: 20% for the 30-minute survey versus 60% for the ten-minute survey.

Alongside open-ended feedback, we also ask respondents to rate their in-survey experience on every survey, and we’re now starting to share these ratings with clients. Unsurprisingly, the ten-minute survey was rated significantly higher than the 30-minute survey: the shorter one achieved a star rating of 4.3 and the longer one 2.8 (where 1 = very poor and 5 = very good).
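If you want to run the same comparison on your own studies, here is a minimal sketch of how coded feedback and star ratings might be tabulated. The survey names, codes and numbers below are invented purely for illustration; they are not the data behind the figures above.

```python
# Illustrative sketch only: the responses below are made up for the example.
from collections import Counter
from statistics import mean

coded_feedback = {
    "30-minute survey": ["negative", "negative", "neutral", "positive", "negative"],
    "ten-minute survey": ["positive", "positive", "neutral", "positive", "negative"],
}
star_ratings = {
    "30-minute survey": [3, 2, 3, 3, 2],
    "ten-minute survey": [5, 4, 4, 5, 4],
}

for survey, codes in coded_feedback.items():
    positive_share = Counter(codes)["positive"] / len(codes)
    print(f"{survey}: {positive_share:.0%} positive feedback, "
          f"mean star rating {mean(star_ratings[survey]):.1f}")
```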

Respondent diligence

Straight-lining (where respondents give the same answer to every item in a question grid, typically while racing through the survey), especially towards the end of a questionnaire, is a common problem with lengthier surveys. It doesn’t imply bad faith on the respondents’ part; it just means they are getting bored. It makes your data noisy and more equivocal, and it’s unlikely that automated checks can spot all straight-lining. So if you want your findings to be robust, incisive and insightful, it’s imperative to think about respondent engagement. Predictably, our analysis showed a higher level of straight-lining in the longer survey, with four times as many straight-liners in the 30-minute survey as in the ten-minute survey.
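To make that concrete, the simplest automated check flags respondents who give an identical answer to every item in a rating grid. The sketch below uses hypothetical column names (q10_1 to q10_4) and made-up data, not anything from our platform.

```python
# Hypothetical example: one rating grid (q10_1..q10_4), answers on a 1-5 scale.
import pandas as pd

responses = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "q10_1": [4, 5, 2, 3],
    "q10_2": [4, 3, 2, 5],
    "q10_3": [4, 4, 2, 1],
    "q10_4": [4, 2, 2, 4],
})

grid_cols = ["q10_1", "q10_2", "q10_3", "q10_4"]

# Flag respondents whose answers never vary across the grid.
responses["straight_lined"] = responses[grid_cols].nunique(axis=1) == 1

print(responses[["respondent_id", "straight_lined"]])
# Respondents 101 and 103 answer every item identically, so they are flagged.
```

A check like this only catches the most blatant cases; respondents who alternate between two adjacent answers, or who straight-line only some grids, slip through, which is why engagement matters more than post-hoc cleaning.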

Fieldwork times

We understand the time pressures many of our clients face. However, it’s easy to forget how much impact survey design has on fieldwork times. Our analysis shows this clearly: the shorter survey achieved approximately three times as many completes per day as the 30-minute survey. A longer survey invariably produces a higher mid-survey dropout rate, which means more respondents have to be invited to reach the required number of completes. Can you afford that when you’re looking for niche sample groups? And, even more importantly, given the pressure research executives are under to add value and spend time on analysis, extended fieldwork timelines simply cut into good analysis time. Who can afford that?
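To show how quickly drop-out translates into extra invites, here is an illustrative back-of-the-envelope calculation. The response and screen-in rates are assumptions chosen for the example, not measured figures from these two surveys; only the shape of the arithmetic matters.

```python
import math

def invites_needed(target_completes, response_rate, screen_in_rate, dropout_rate):
    """Work backwards from required completes to the invites you must send:
    invites -> people who start -> people who qualify -> people who finish."""
    completion_factor = response_rate * screen_in_rate * (1 - dropout_rate)
    return math.ceil(target_completes / completion_factor)

# Same target and assumed rates; only the mid-survey drop-out differs.
print(invites_needed(500, response_rate=0.25, screen_in_rate=0.6, dropout_rate=0.12))  # 3788
print(invites_needed(500, response_rate=0.25, screen_in_rate=0.6, dropout_rate=0.40))  # 5556
```

In this illustration the 40% drop-out survey needs roughly 1,800 more invites than the 12% one to hit the same 500 completes, before you even account for slower daily completion rates.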

Mid-survey drop-out levels

We pride ourselves on having low dropout rates across all our surveys. This is in large part because we are a youth specialist research house (as well as a Panel Services provider), so we can frequently run highly relevant and well-crafted surveys aimed at our 16-30 audience. It’s also attributable to our award-winning mobile-optimised survey template, which ensures a better respondent experience. That said, in our analysis we still found a higher than average mid-survey dropout rate on the longer survey: 40% of those who started did not complete, compared with 12% for the shorter survey. Do high dropout rates matter if you’re still able to complete fieldwork? The short answer is yes: high dropout reduces the survey’s validity and, together with our points on respondent diligence, means the resultant findings are ‘noisier’ and less incisive.

So what can you do if you’ve got a long survey? Here are three golden rules:


1. Keep it under 20 minutes

  • Are all the questions in the survey necessary?
  • Remove demographic questions that are already held on the panel; these can always be provided after fieldwork.
  • Are you running bad questions just to keep a time series? It might be time to reconsider and break with the past. (A rough timing sketch follows this list.)
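One way to sanity-check the 20-minute rule before scripting is a rough timing estimate based on the question mix. The per-item timings below are illustrative assumptions, not benchmarks from this post; adjust them to your own platform’s data.

```python
# Rough, illustrative estimate of questionnaire length from its question mix.
SECONDS_PER_ITEM = {
    "single_choice": 10,
    "multi_choice": 15,
    "grid_row": 6,      # per row of a rating grid
    "open_ended": 45,
}

# Hypothetical questionnaire outline.
question_counts = {"single_choice": 20, "multi_choice": 8, "grid_row": 30, "open_ended": 3}

total_minutes = sum(SECONDS_PER_ITEM[q] * n for q, n in question_counts.items()) / 60
print(f"Estimated length: {total_minutes:.0f} minutes")
if total_minutes > 20:
    print("Over 20 minutes - look for questions to cut.")
```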


2. Make it engaging

  • Make it as easy and as enjoyable to complete as possible.
  • You don’t have to gamify. Simple routes to a more engaging survey could include images, interactive icons and even improving questionnaire wording.


3. Be absolutely honest in communication

  • Say how long the survey is going to take, and don’t be tempted to cut this number when you communicate it.
  • Prepare respondents for a likely screen-out rate.



Contact the Panel Services Team

Reach a 135k+ strong Millennial & Gen Z sample, with 400+ data points for profiling. Call us directly, or click below to find out more about what we do.

Tatenda Musesengwa, Client Services Director, 020 7374 0997, tatenda@youthsight.com

Charlotte Woods, Project Executive, 020 7374 0997, Charlotte.Woods@youthsight.com
