Posted by AaronPorter in Higher Education.
Tags: AimHigher, cambridge university, Department for Education, IAG, league tables, Oxford University, Russell Group
The world of information, advice and guidance for prospective students entering higher education is a complex one. Some students are born into environments where access to information and, crucially, to advice and guidance is abundant. Others are fortunate to have schools or supporters to guide them through. However, many are still left to navigate an often unfamiliar environment alone, without the crucial context to help inform and shape their decision making. Many of those students may well end up making decisions they later regret, with an accompanying price tag that won’t exactly cushion the blow. The axing of AimHigher and the hands-off approach from the Department for Education towards careers guidance in schools is only likely to make the problem worse.
Central to the information landscape are league tables. Yet, as I see it, very little is done to challenge their obvious, gaping flaws. Practically all of the broadsheet newspapers turn their hand to league tables at some point in the calendar year, with universities quick to pounce on the table which places them highest as the “most authoritative”. Yet I’d argue that for many students, perhaps even the majority, all newspaper league tables are largely redundant, and frankly give no indication of many of the crucial elements that students are really interested in, such as the quality of teaching, access to work experience and curriculum content.
Of course the simplicity of league tables, the fact that universities can be boiled down to a single number and placed in a rank order, is on the surface at least quite appealing. Prospective students can be duped into thinking that the university ranked 43rd is somehow better than the university ranked 51st, when in reality there is a very good chance that the ‘lower’ ranked university may well be more suitable for huge swathes of students.
So my major problem with university league tables boils down to two central arguments. The first is that the methodology which underpins most league tables is horribly out of sync with what undergraduate students in particular care about. The second is that whilst university league tables continue to be published as a simple rank order, the ability for students to decide which factors matter most to them (e.g. employability scores, staff-student ratios and so on) is usually overlooked, which means that students are forced to judge universities on the factors which the Guardian or the Sunday Times considers important, and not on what students themselves care about.
At the heart of the concerns about methodology, I am of the opinion that most league table compilers feel constrained to pull together a methodology which ensures the same universities finish in roughly the same positions every year. The prospect that Oxford or Cambridge wouldn’t finish in the top two positions is too horrific a thought for league table compilers to contemplate, so the metrics end up heavily weighted toward ensuring that this happens year after year. Convention suggests that they are the top two universities, and that the Russell Group (24 of the most research-intensive universities) are somehow the best; so rather than risk having their own methodology questioned, it seems to me that newspapers retreat to a convenient set of metrics which mean that Oxbridge occupy the top two spots, and most Russell Group universities sit somewhere in the top 35.

Employers also fuel the vicious cycle by largely focussing their recruitment efforts on the same narrow group of universities, whilst simultaneously complaining that many graduates don’t arrive with the skills they want. Whilst many of the big employers confine themselves to a narrow group of universities, those will continue to be the universities which benefit from inflated employment scores. Employers might actually find that there are graduates from other universities who are just as adept, perhaps even more so given the more business-focussed curriculum that often exists in those institutions. But whilst many employers continue to screen out graduates from outside certain universities, they won’t ever know whether those graduates are better or worse than what they are getting at present.
But in my criticism of the methodology of league tables I want to question why so much weight is placed on the research output of universities. The role of research in universities is crucial, but frankly it doesn’t have the disproportionate bearing on the undergraduate experience that most league tables lend it in their weightings. In fact you could argue that the more research-intensive a university, the less emphasis is placed on the undergraduate experience and teaching. However, could it simply be that newspapers know that by playing the research funding game, the 24 Russell Group universities, which scoop around 75% of total research income, will comfortably take slots in the top 30? Our newspapers can then breathe a sigh of relief, knowing their rank order ‘looks about right’. It surely can’t be because these institutions provide the best teaching, the most work experience opportunities, the widest range of assessment methods or the most educational value added for their students – because in the main these are not the universities that do that.
So do most undergraduate students really agree that Oxbridge, or indeed the Russell Group more generally, are the best universities? On many existing measures, the majority of students actually have concerns about our so-called ‘top-ranking’ universities. From student satisfaction (many Russell Group universities rank in the bottom quartile on this measure) to value added (the extent to which a university improves your educational performance during your years of study), these universities actually perform very poorly.
So rather than newspapers seeking to dictate what they consider to be the most important facets of a university, I’d like to see more effort put into providing personalised advice and guidance to individual applicants, to work out which university is best for them. There is nothing wrong with saying that a post-92 university is better for some students, and a research-intensive environment better for others – but let’s get comfortable with that, and stop the pretence that we should be judging all universities along the same lines. For those actively wishing to benefit from a research environment it may well be Oxbridge or the Russell Group, but for a more employer-focussed curriculum or educational value added it is likely to be somewhere else, and I don’t think that summing up a university as a single number does anyone any favours!
Posted by AaronPorter in First or Fail, Higher Education.
Tags: applications, consumer guide, Independent Student Finance Task Force, league tables, National Student Money Day, Observer, UCAS, which?
Which? university guide and UCAS application figures: first or fail?
Heading for a First… Which?
At the start of this week the Observer reported that the well-respected consumer rights magazine Which? will now publish a guide to British universities. Not that much more evidence was required, but it was further proof that our university system is moving toward a more market-based model. And in the same way that Which? has helped generations of consumers purchase the right car or kitchen appliance, universities are now next in the queue for the Which? treatment.
On the surface, the prospect of an independent, well-researched assessment of what our universities provide should be welcomed, particularly by prospective students and their parents, but there are some considerations that need to be thought through. When assessing a car or a washing machine, there are some pretty indisputable factors where transparent information is helpful: reliability, price, dimensions, terms and conditions. But judging an education isn’t quite so easy, and it’s not just a matter of deciding which measures are most important. Is academic progress more important than final degree outcomes? How important is the research environment for an undergraduate? How do you account for the distinctiveness of a small and specialist institution that may not benefit from economies of scale, but more than compensates with a stronger sense of community? Talking of which, how do you begin to measure ‘a sense of community’?
As I see it, if Which? is to move beyond simply compiling the greatest hits from the plethora of league tables that already exist, its real challenge will be to try to capture the essence of different institutions: their missions, strengths and weaknesses. Simple metrics don’t do justice to the broader importance and value of a higher education experience, so here’s hoping for something more sophisticated than that.
Heading for a Fail… University applications
It always looked likely, given the confusion and anger over the government’s reforms to higher education funding, but this week we got the first evidence that in 2012, the first year of higher tuition fees, university applications were indeed heading for decline.
Figures from the Universities and Colleges Admissions Service (UCAS) showed that applications received by 15 October were 9% lower than at the same point last year. It’s important to note that the final deadline for applications is not until the middle of January 2012, so this is far from a definitive picture, but the early signs don’t look good.
Personally I don’t consider a small decline in overall applications to be a huge problem; there will still be more people applying for a place than there are places to go round. In fact, even a 9% decline will mean that thousands of qualified applicants still miss out on a university place. The crucial analysis is to determine which groups of students are applying in greater (or fewer) numbers than before. A fall in applications of 10% or more is politically damaging for the coalition. But if the decline is particularly concentrated amongst poorer applicants or certain ethnic groups, then it will be more damning.
At first glance, though, a 9% drop is fairly troubling. But there are two important factors to bear in mind. The first is demographics: the number of 18-year-olds eligible to apply in 2012 is actually lower than in 2011, as a consequence of a slowing birth rate in the early 1990s. This may account for as much as 5% of the fall. Countering that, early analysis of the UCAS figures shows that applications to Oxbridge, dentistry, medicine and veterinary science (which have an early deadline) fell by only 0.8%. This would indicate that the fall everywhere else is closer to 20% – though it could also mean that applicants are simply taking more time to consider their options, but will still ultimately decide to apply.
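How you get from a 9% overall fall to “closer to 20%” elsewhere depends on what share of the 15 October total came from the early-deadline courses, and the UCAS release doesn’t spell that share out. As a rough sketch only (the 50–60% range below is my assumption, not a published figure), the implied fall outside the early-deadline courses works out like this:

```python
# Back-of-envelope check on the UCAS figures quoted above.
# Assumption (not in the UCAS release): early-deadline applicants made up
# roughly 50-60% of the 15 October total.

overall_fall = 0.09   # all applications down 9% year on year
early_fall = 0.008    # Oxbridge/dentistry/medicine/vet science down 0.8%

for early_share in (0.50, 0.55, 0.60):
    other_share = 1 - early_share
    # overall_fall = early_share * early_fall + other_share * other_fall
    other_fall = (overall_fall - early_share * early_fall) / other_share
    print(f"early share {early_share:.0%}: implied fall elsewhere {other_fall:.1%}")
```

On those assumed shares the implied fall elsewhere lands between roughly 17% and 21%, so the “closer to 20%” reading holds if early-deadline courses made up a little over half of the October applications.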
Whilst it is important to monitor these figures closely, it’s also important not to lose sight of two crucial issues. Despite the rising cost to the individual, the broader benefits of higher education need to continue to be articulated, probably louder than ever before. And when it comes to financial information, government – through initiatives like the Independent Student Finance Task Force, which has organised a National Student Money Day on Monday 14 November – needs to maintain a visible presence to ensure that prospective students understand the deal, whether they agree with it or not.