
Ranking the rankings

The list of lists

International rankings have got universities in a stranglehold, even though they are completely pointless.
Text by Traci White and Christien Boomsma / Translation by Sarah van Steenderen

The RUG likes to profile itself as a top 100 university. Yet neither RUG president Poppema nor rankings expert Jules van Rooij take the rankings very seriously. ‘No overall ranking is really very useful.’

However, international students do place importance on these lists: a school’s position in the rankings plays an important role in their choice of university.

This is why people pay attention to the rankings, and why Van Rooij works hard to understand their methodologies.

The best, oldest, and most stable ranking is the Shanghai ranking. Poppema thinks the RUG might just make the top 50, due among other things to the expansion to Yantai.

The other rankings – the Quacquarelli Symonds (QS) and Times Higher Education (THE) – are mainly concerned with reputation and therefore do not provide very much objective information.

Furthermore, QS and THE are commercial businesses that earn money by ‘helping’ universities with their data analyses. It is also said that they often change their methodologies to bring about shifts in the rankings and thereby generate attention.

Reading time: 14 minutes (2,185 words)

This article is an updated version of a story that the UK originally published in September 2015. The positions in the rankings and, where necessary, the methodologies have been brought up to date.

It is bizarre. Every single one of the rankings that pops up in newspapers, in policy meetings, or in reports on the RUG website – rankings the university proudly uses to declare itself a ‘top 100’ institution – is complete nonsense.

It is hardly surprising that members of the University Council hold this opinion. That the pre-eminent ranking expert at the RUG, Jules van Rooij, shares it is a little more remarkable. But that even RUG president Sibrand Poppema, Mister Ranking Incarnate, supports the statement above is downright curious.

A high score

And yet, it is true. ‘The only reason rankings are important is that people think they are’, Poppema emphasises. Some are a bit better than others, he says, and it’s not as though they are completely out of touch with reality. But is Peking University, in 71st place, really any better than the RUG in 72nd? And is Boston University, in 75th, any worse? Not really.

A high score simply means a strong research university with academics who publish a lot and are cited often. But that’s about it. ‘Not a single overall ranking holds any meaning’, says Van Rooij. ‘It’s much more interesting to look at the subfields. Not even the Ivy League is number one in all fields.’

International students

But international students do look at those rankings. Ask anyone in an international student house, such as the one at the Winschoterdiep. Romanian student Elisabeth Efraim picked Groningen out of a list of possible universities because it had the highest score. Molly Qian, from China, says, ‘My choice was based more on the ranking than on the city.’

When the RUG has a booth at an educational fair in Indonesia, the first question for the employees is always: ‘Are you in the top 200? Top 100?’

If the answer is ‘no’, the students immediately move on, and the university is left with nothing. But a university – and especially the RUG – needs those international students. Poppema: ‘This year marks the first time that the RUG has attracted fewer Dutch students, and that trend is only going to continue in the coming years. We are a regional university in a shrinking region. The number of young Dutch people is declining, especially in the northern provinces. We need international students for stability. It’s that simple.’

Times Higher Education

Founded in 2010 after breaking away from QS, the THE World University Rankings (THE-WUR) lean heavily on reputation. The RUG’s position in 2015 was 74th, a serious comeback from 117th place the previous year, but its spot in the most recent ranking – released on September 22nd – was 80th.

It is, Van Rooij confirms, a prisoner’s dilemma. The rankings do not deserve all the attention they are getting, but not participating is not an option. ‘And if you drop and then claim the methodology is wrong, that’s seen as an admission of weakness. No matter how you look at it, those lists have a great influence on a university’s reputation. And they’re here to stay.’

And that is why he spends a large part of his time trying to understand and analyse how those lists work. Take the Shanghai ranking, for instance. It is the oldest, most objective, and most stable of the three ‘greats’ and also the only one done by a university. ‘They collect their own data and they are therefore the only ranking you cannot influence by playing a clever game’, says Van Rooij.

Zernike

If there is one list that Poppema attaches value to, it’s the Shanghai ranking, precisely because it relies heavily on research citations, frequently cited authors, and publications in Nature and Science. And that is good for the RUG, because in those areas, it is doing well. Having a Nobel prize or Fields medal (for mathematics) winner among your alumni or staff also counts heavily. Although it has been 100 years since Frits Zernike came to work at the RUG and 60 years since his Nobel prize, that prize still counts towards the Shanghai ranking. ‘Although it decreases by 10 per cent every ten years’, Van Rooij calculates.
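
Van Rooij’s remark lends itself to a quick back-of-the-envelope illustration. The sketch below is a rough reading of his comment, assuming a flat drop of 10 percentage points per decade and a hundred-year cut-off; the prize_weight helper is purely illustrative and not the official ARWU formula.

```python
# Rough illustration of the decay Van Rooij describes: a Nobel prize or Fields
# medal keeps counting towards the Shanghai ranking, but with less weight as
# the decades pass. The flat 10-points-per-decade step and the 100-year cut-off
# are assumptions based on his remark, not the published ARWU weighting scheme.

def prize_weight(years_since_award: int) -> float:
    """Assumed weight of a prize: 1.0 when recent, 0.0 once a century has passed."""
    decades = years_since_award // 10
    return max(0, 100 - 10 * decades) / 100

# Zernike's prize, roughly 60 years old at the time of writing, would still
# count for about 40 per cent of its original weight under this assumption.
print(prize_weight(60))  # 0.4
```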

US News & World Report

Although U.S. News and World Report has published an American university ranking list since 1983, the Best Global Universities Ranking is a new player in the ranking game. It was launched in 2014, and the RUG’s position in this inaugural edition was 98. The RUG currently holds 93rd place.

Poppema thinks it is realistic that the RUG will climb in this ranking: the effect of a genuine increase in citations, strong publications, and international students. But the RUG is also running up against its own limits, because it is not so simple to beat schools like MIT, Harvard, or Cambridge.

Yantai

However, growth is still possible, according to Poppema. He even believes we could make it to the top 50 if the RUG delivers either a Nobel prize winner – a feat he does not consider impossible – or substantially increases its publications. This can be achieved through the expansion of the RUG to Yantai. Because in Groningen, the RUG has reached its maximum potential. ‘The people here already work so hard. They really can’t work any harder.’

QS World University Rankings

The Quacquarelli Symonds (QS) ranking keeps its criteria broad and makes no secret of its reliance on polling and emphasis on reputation. QS relaunched independent of THE in 2010. The RUG was hanging on in 100th place in their 2015 list, but in 2016, they moved down to 113th position.

Poppema is quick to add that the RUG is not expanding to Yantai because of the rankings. The RUG wants to expand to bring about necessary growth. Any upward movement in the rankings is a welcome side benefit.

Surveys

But then there are those other two international rankings: the Times Higher Education (THE) and Quacquarelli Symonds (QS). Poppema wants nothing to do with those.

Both THE and QS rely heavily on reputation, and those data are collected through surveys. ‘QS questions random people’, says Van Rooij. ‘[They approach] basically anybody who has a university e-mail address.’ In many cases, it is not even clear whether the person is an active academic. Van Rooij himself, for example, is always sent the survey. In previous years, THE used the database provided by publisher Thomson Reuters, which meant that the people surveyed were at least active peer reviewers or publishing academics. But it raises the question of whether a biologist in Alaska has anything useful to say about the level of education at a university in Japan.

‘And even when it does concern their own expertise, the question remains how reliable the information is’, Van Rooij says. ‘They ask you to name the 30 best universities in your field. You can come up with the first five from your network. But after that? There’s a good chance you’ll just consult the top 50 or tick the universities you’ve heard of at some point.’ This means that a top 100 university has a better chance of staying in the top 100 based on brand awareness alone.

Multimillion-dollar companies

What may be even worse than the shaky methodology is the world behind these two rankers. Both QS and THE are multimillion-dollar companies that are mainly concerned with making money. QS can be sure that its ranking will be clicked on approximately 100 million times by prospective students, which means that universities will eagerly advertise on its website.

Shanghai Ranking

Sometimes, simple is better. Founded in 2003, the ARWU/Shanghai ranking focuses exclusively on the scientific performance of a university and is considered to be the most straightforward and therefore the most trustworthy of the heavy-hitting rankings. The RUG fell three spots to 72nd place in this reliable ranking in 2016.

‘They have no problem admitting that they are not interested in stability’

In addition, QS generates income by ‘helping’ universities with their data, for instance by analysing and calculating the data or by sending out an advisor. ‘And that is a service you pay for’, Poppema says. How much? He does not know exactly, because the RUG has never retained this service. But it is in the tens of thousands of euros.

Changes in methodology

Finally, QS and THE will do anything to draw attention to their rankings. This includes frequent changes to their methodologies, which cause considerable shifts in the rankings. After all, that gets media attention, which in turn ensures more traffic on their websites. For instance, the RUG was in 134th place in the THE ranking in 2012, in 89th place a year later, and back down to 117th in the 2014-2015 edition. ‘They have no problem admitting that they are not interested in stability. They don’t want the universities to specifically adapt their policies to this’, says Poppema, who sat on THE’s advisory board for several years.

Phil Baty, who works at THE, has a different explanation for the many changes: ‘THE Rankings are always striving to improve and we will make methodological improvements where we can. This can lead to some instability, but I believe that as long as we are very transparent about the changes we make, it is in everyone’s interests to get the clearest, most balanced picture possible.’

Most reliable

The fact remains that the Shanghai ranking has not needed any such changes in its more than a decade of existence, and that students, too, see this ranking as the most reliable.

Perhaps that is the reason why Poppema is not losing any sleep over whether the RUG goes up or down in the Times list. And it is also why, even though the RUG is in the top 100, the university press release is not exactly glowing.

The RUG in the rankings

Below is further information about the four biggest rankings, followed by an overview of other, less trend-setting rankings. The RUG’s position in each ranking is given in parentheses.

Shanghai ranking (72)

For the Academic Ranking of World Universities – better known as the Shanghai ranking – it’s all about the science. For alumni or staff of a university to win a Nobel prize or a Fields medal is cause for celebration, and the Shanghai ranking dedicates ten per cent of its methodology to that, too – as long as it was won within the past hundred years, that is. It also measures Web of Science citations from the previous five years, but the downside to the Shanghai ranking is that it doesn’t measure teaching quality and pays little attention to the humanities.

Of the three largest comprehensive rankings, the Shanghai ranking seems to be the most widely respected by experts and students alike. It’s also the oldest of the three, but that isn’t saying much: it was tabulated for the first time in 2003.

The Shanghai ranking was launched with Chinese government backing and designed ‘to provide a global benchmark against which Chinese universities – enjoying billions in state and private investment – could assess their progress.’

The Shanghai ranking gets part of its data directly from Thomson Reuters, as well as from yet another ranking: CWTS Leiden, a system that measures the impact of scientific publications from 500 universities worldwide. On the Leiden list, Groningen is in 120th position.

 

Times Higher Education (80)


Nowadays, Times Higher Education (THE) arrives at its scores based on a whole lot of small factors, ranging from research reputation to how internationally mixed a university is. Some aspects count for as little as 2.25 per cent of the total. Last year, THE switched from the Thomson Reuters database to Scopus. THE also surveys academics at other schools – by invitation – to rate ‘teaching and research quality’.

 

U.S. News ‘Best Global Universities’ (93 in 2015)


In 2014, the American magazine U.S. News realised there’s a whole world out there beyond the States and published its first ‘Best Global Universities Rankings’. The publisher makes painstaking efforts to explain exactly how its methodology works, and like most of the big boys, it too relies on Thomson Reuters for citation info.

 

Quacquarelli Symonds (113)

Quacquarelli Symonds (QS) has more of a thing for international students than most of the other rankings, which is one way that THE’s former partner sets itself apart. It also really loves surveys, deriving more than half of its score from them, and uses student-to-faculty ratio as a proxy for teaching quality. QS itself jumped ship from Thomson Reuters to Scopus back in 2010.

Relying on surveys to assess reputation is far from an exact science, though. QS (and THE) seem to launch a slightly tweaked methodology each year, which can cause a school’s position to vary dramatically from one year to the next.

 

Other rankings:

Centre for World University Rankings (105)

National Taiwan University Ranking (71 in 2015)

Global Employability Survey (114 in 2015)

Webometrics (118)

UI GreenMetric World University Ranking (12)
