The National Student Survey (NSS) has proved to be a remarkably robust and stable tool since it was introduced in 2005. With over two thirds of eligible students responding — almost 325,000 responses last year — it provides information that is hugely important in informing student choice, driving institutional enhancement and providing regulatory assurance.
There have been several reviews of the Survey over the last 17 years, but in practice its questions have remained remarkably consistent over that period. It is therefore right that we take a more wide-ranging look at the Survey now, but it is important that any reforms build on its strengths rather than undermining them.
The current consultation on the NSS, led by the OfS on behalf of the four funding and regulatory bodies across the UK, proposes some very comprehensive changes. These include changes to the survey methodology (looking at response scales, moving to direct questions and reducing the length of time the survey is open), the creation of a fixed core of questions alongside others that change more regularly, and, perhaps most significantly, a divergence between the questions asked in England and those asked in the other nations.
This last point is perhaps the most significant, both in the principled sense of creating divergence between the four nations of the UK and in terms of the specific proposal to remove the summative question 27 on students' overall experiences. Students applying to universities do so as part of a UK-wide system — in most cases through UCAS — with significant cross-border flows of students. Removing question 27 would strip out a key piece of information about students' overall experiences and reduce the comparability of experiences at different institutions. The overall summative question also gives students an opportunity to reflect on their whole experience rather than particular elements of it; its loss would be felt significantly by prospective students and by institutions that use it for enhancement purposes.
Secondly, the proposal to shorten the length of time that the survey is open would have a significant impact across the diverse range of providers in the higher education sector. Many institutions with smaller cohort sizes struggle to reach the publication threshold in the current window, and shortening it would make this harder still, reducing the information available to students. It would also affect institutions with different enrolment or term dates, and students going on placement in term 2. Any changes to the National Student Survey need to be seen through the lens of their impact on student information, and this proposal would fail that test.
The consultation proposes a number of additional standalone questions around freedom of expression and mental wellbeing. We believe that all questions in the Survey should align with the overarching criterion of relating to the student academic experience, but also with the principle, embedded in the current survey, that you cannot explore specific areas of that experience — whether feedback and assessment, learning opportunities or teaching on my course — through a single question. A single additional question on students' experiences of freedom of expression or mental wellbeing will never get to the heart of the issue with sufficient nuance to provide meaningful data. These issues are already explored through alternative approaches, such as the HEPI Student Academic Experience Survey, which looks in detail at freedom of expression. More thought needs to be given to why and how we collect this data, and to whether doing so meets the principle of focusing on the student academic experience. Otherwise the NSS could continually drift into other aspects of the student experience which, whilst interesting, have the potential to make the survey unwieldy.
Finally, as implied above, the proposals to reform the NSS are very radical, and also include moving to a more direct form of questioning. Changes on this scale need careful cognitive testing with students — of the individual questions but also of the final survey as a whole — to ensure that it explores the issues we want it to and is understood by the students filling it in. We do not believe the case has been made for rushing the analysis and survey development in order to introduce changes from this January, and we would strongly propose that the finalised survey is tested properly as a whole. The strength of the National Student Survey over the last 17 years has been its remarkable consistency, and we would not want to introduce significant changes now that had not been properly tested and then required significant revision in a couple of years' time.