by Stella Jones-Devitt
There is considerable evidence – and probably much more that goes under-reported – of high levels of student non-response to surveys within higher education. Several reasons have been identified as pivotal: students with a record of lower performance are less likely to respond; students who do not see the relevance or ‘salience’ of a survey are less likely to respond; and students experiencing what is known as ‘survey fatigue’, i.e. being over-surveyed, are far less likely to engage. Put those three aspects together and a perfect storm of non-response is created. And yet surveys are still seen as a robust way of gathering useful evidence, and they often dominate as a research methodology involving students.
There are still things you can do to improve engagement if – a big ‘if’ that should be considered carefully – you decide to do a survey despite the reported shortcomings. Within STEER, we have constructed an evidence-informed process, known as the Survey Research Design Checklist (SRDC), drawn initially from a systematic review of factors affecting survey response rates by Fan and Yan (2010). There are eight factors to consider carefully when designing any kind of survey:
- Access to survey
- Level of student support
- Use of incentives
- Optimal length of survey
- Timing and possible information fatigue
- Question wording and ordering
- Survey question formats
- Reporting and debriefing
Access to survey
Evidence indicates that the process through which participants are introduced to any survey is extremely important for subsequent engagement. Response rates for students are usually much higher if surveys are embedded within their own programmes of study, rather than presented as additional or institutional in any way. It is also crucial to minimise the number of tasks, or steps, that respondents must complete before they reach the survey itself.
Level of student support
Making the survey’s subject matter visible, and offering access to further support and/or discussion with programme tutors and students before the survey runs, can be crucial for engagement, especially in grounding the process within an integrated subject and curriculum context. Feeling part of a course of study is the foremost starting point for engagement in many surveys. Setting out the possible benefits and disbenefits in advance should also feature, to enhance engagement.
Use of incentives
Due to perceived survey saturation, considerable attention has been given to the use and effectiveness of incentives in lifting response rates. Evidence shows that response rates improve when there is a clear explanation at the outset of how incentives will be administered; usually this combines an immediate, guaranteed reward with longer-term ‘lottery’ opportunities. Using incentives to re-engage non-respondents can also increase uptake, although there are ethical and methodological challenges to consider in rewarding non-respondents more favourably than original survey participants.
Optimal length of survey
Response rates are linked closely to how long surveys take to complete. Studies suggest that the optimal completion time for students is approximately 13 minutes. This is extremely challenging for most survey purposes, but if you really want to maximise engagement, your survey will need to be much shorter than conventional norms.
Timing and possible information fatigue
Targeting first-year undergraduate students during the early weeks of their studies can be problematic, especially at points of induction-related information overload and where mechanisms such as welcome surveys are already in use. Response rates benefit greatly when, at the time of delivery, the survey topic has a high degree of salience for potential participants.
Question wording and ordering
Evidence indicates that simplicity of wording encourages the greatest participation, whilst ordering effects are often given insufficient attention in survey design.
Survey question formats
There is considerable literature about the implications of questionnaire layouts and how ease of survey navigation for respondents can be pivotal to engagement.
Reporting and debriefing
Considerable attention needs to be given to motivational strategies that sustain student engagement after a survey has been completed; this can be monitored both formatively and summatively. It may also be useful to consider wider theories of social exchange and their possible impact upon engagement within the survey process.
Help is at hand!
We have produced a version of the SRDC which can be used by anyone designing surveys – just click on this hyperlink: SRDC. We are happy to share it, and to hear whether it has been useful. If you would like further help in deciding whether a survey is the right evidence-gathering process to use, or would like to hear more about applying the SRDC in varied contexts, please get in touch with us at firstname.lastname@example.org.