4th October 2012

Should we value the National Student Survey?

How far should we value a system that consistently ranks Russell Group institutions below less traditionally academic ones?

The results of the National Student Survey (NSS) have been published, and they show that the University of Manchester has continued its tradition of scoring relatively poorly.

We have, however, improved on last year’s overall satisfaction rating by four percentage points (from 79% to 83%) and are now on a par with the University of Edinburgh, as well as no longer being the worst institution in the Russell Group, having passed that accursed baton on to King’s College London.

Let us also not forget what the question universities are falling down on actually measures: it is not a test of whether students are overwhelmingly happy with their course and institution, but simply of whether they are satisfied, the academic equivalent of being described as merely “nice”.

The NSS could well be perceived as a ranking system that gives unfair advantages to certain types of institution, such as campus universities. Loughborough University, for example, consistently performs very well on student satisfaction, but it is a far smaller institution than our own, with only 16,000 students, most of them based on a single campus in suburban Loughborough.

With its Students’ Union running massively popular regular club nights, and many students’ social lives structured around the University and the Union, it is not surprising that Loughborough students are more satisfied with their university in broader terms, and that this carries over into greater satisfaction with their courses as well. I find it hard to believe that students at Loughborough never have issues with feedback, for example, but when that is just one aspect of their university experience and everything else is so positive, it might seem like less of an issue.

Whilst we might not be entirely distressed at being beaten by a university like Loughborough, when the University of Teesside can consistently outperform us in these statistics, most would intuitively feel that there could well be a problem with the system, or at least with the weight placed upon it.

During clearing, NSS scores are often used by less traditionally valued institutions as a way to appeal to desperate would-be students at a stressful time. Hearing that your peers are 100% satisfied with an English Studies BA at the University of Teesside, compared with a meagre 53% for English Language at our own dear university, could well be very persuasive at such a moment.

Placing such value on a ranking system compiled on a relatively informal basis, from incomplete data sets, by people who are neither impartial nor experts, seems fairly problematic.

Within some departments, and within some institutions, vastly differing numbers of people actually complete the surveys. In the figures quoted above, just 35 students completed the form at Teesside and 60 at Manchester. Such incomplete data makes the statistics less reliable, so it seems peculiar that they are nonetheless still used to rank institutions.

Students are not neutral actors either. The motives behind a student’s completion of the survey, whether positive or negative, might not be just to give the most accurate reflection of their time at university.

Indeed, even if they are giving an accurate account of their time, it might not be entirely objective. It seems plausible that an individual who feels happier in their non-academic university life will be generally happier, and that this could influence how they fill out the survey.

Whilst people’s overall experience of Manchester might well be very positive, the University itself often isn’t the best part of it. Because people’s social lives here aren’t particularly connected to the University directly, unlike at other institutions, this won’t lead them to think any more positively about the University. Geographically, the University is at a disadvantage through no particular fault of its own. It doesn’t seem awfully fair to apply the same measure to something as subjective as a person’s level of satisfaction.

Dr Leif Jerram’s attempt to influence the official student survey in 2011 shows how much of an effect it can potentially have on a department’s funding, and perhaps also on employers’ perceptions of the value of a course or degree.

Two years ago he sent out an email to students who were due to complete the survey in which he said: “If our own students keep saying that this History department is failing to provide a satisfactory education, eventually employers will listen to that official verdict. You will suffer, and the value of your degree will collapse.”

When so much value is placed on non-expert opinions from non-neutral actors, surely the survey should not carry such national weight. Can we imagine any other survey with so many uncontrolled variables being so widely or highly regarded?

Of course, some complaints are legitimate, such as those regarding appalling feedback in numerous departments; these are things universities should act upon. Feedback problems are incredibly widespread, and it is reasonable for people to be unhappy about them; students have every right to feel dissatisfied.

But it does not seem right that these very real and pervasive issues are being judged in a way that cannot be applied fairly and objectively to all institutions.

Expert-led research into the problems behind student satisfaction, which often stem from legitimate complaints, would be a very good thing to publish alongside a university’s scores for research or teaching. But measuring it in the present manner opens it up to numerous avenues of inaccuracy and distortion, and ultimately renders very important data far less persuasive.

Emma Bean

Emma Bean studies Middle Eastern Studies at the University and is originally from North Yorkshire.
