It’s time to measure student engagement rather than just satisfaction.
Since its introduction in 2005, the National Student Survey (NSS) has left an indelible mark on the higher education sector. It is a survey in which final-year undergraduates give their views on 23 matters, from teaching quality and feedback to organisation and management, learning resources and assessment.
But the NSS is far from a perfect tool. I’ve always felt it is less a measure of student satisfaction – despite the title of the survey – and more a measure of whether expectations have been met or exceeded. And whilst we have seen only modest improvement in satisfaction scores since the survey was introduced seven years ago, it has undoubtedly helped to cast a greater focus on learning and teaching issues, both within institutions and across the sector more generally. Although never intended to be published as a league table, whenever the performance of an institution is boiled down to a single figure, newspapers will race to publish it as a rank order of institutions. Helpful or not, it has undoubtedly shaped student choices about what and where to study, and has become an increasingly significant component of the major league tables published by many of the leading newspapers.
The NSS has proved a helpful tool for students’ unions in their quest to argue for greater resources to be spent on the learning and teaching experience. It has also forced the hand of some Vice-Chancellors to spend more money on supporting teaching in their institutions. But whilst there have undoubtedly been plenty of positives from the NSS, I feel its benefits are beginning to wane.
Once a survey has been run for a number of years, and we start to see only the slightest of improvements in scores, it’s worth considering whether there are new and more useful ways we can ask students to reflect on their higher education experience without resorting to simple metrics and narrow measures. I argued at the start of this piece that, in my eyes, the NSS has never really been a measure of student satisfaction, but rather of whether expectations have been met or exceeded – not least because students will never really be able to say how their course compares to what it might have been like at another institution.
So rather than an exercise in testing student satisfaction, can we look at the extent to which students have genuinely been engaged in their course? In the United States, the National Survey of Student Engagement (NSSE) is run annually. Rather than asking students to rate aspects of their experience, it invites them to identify the extent to which they have been engaged in their course. How much time have they spent preparing for classes, working independently or participating in co-curricular activity? What factors helped shape their choice of course and institution? And what has been the quality of their relationships with institutional staff and fellow students?
Getting institutions and students to think about the ways in which students are engaged and benefiting from their higher education experience will surely be more useful than a narrow snapshot of satisfaction or expectation at a given time. If students are to truly be at the heart of the system, we shouldn’t just be asking whether they are satisfied – we should be finding out how involved and engaged they are in their programme.