This afternoon sees the start of the “Rankings and the Visibility of Quality Outcomes in the European Higher Education Area” conference in Dublin Castle, part of the events associated with Ireland’s Presidency of the EU.  A good chunk of today’s proceedings focuses on the adoption and roll-out of the EU’s new university ranking exercise, called U-Multirank, which aims to be live by 2014.

Since the first global university ranking in 2003, a plethora of ranking systems has been developed, the big three being the ARWU (Shanghai) ranking, the QS ranking, and the Times Higher Education ranking.  These rankings have become key benchmarks for comparing universities within regions and across the globe, seized upon by some universities for marketing, and by the media and governments to either promote or denigrate institutions.  They are undoubtedly being used to shape education policy and the allocation of resources, and yet they are routinely criticised for being highly flawed in their methodology.

Somewhat ironically, a sector devoted to measurement and science has been evaluated to date by weak science. There are several noted problems with existing rankings.

The rankings use surrogate, proxy measures to assess the things they purport to be measuring, and involve no site visits or peer assessment of outputs (but rather judgements of reputation, alongside indicators such as citation rates).  Examples of such proxies include using the number of staff with PhDs as a measure of teaching quality, or the citation rate to judge the quality of scholarship.  In both cases the relationship is tangential, not synonymous.

The rankings are highly volatile, especially outside the top 20, with universities sliding up and down by dozens of places on an annual basis.  If the measures were valid and reliable we would expect them to have some stability – after all, universities are generally stable entities, and the performance and quality of programmes and research do not alter dramatically from year to year.  On close examination some of the results are just plain nonsense – for example, several of the universities listed in the top 20 institutions for geography programmes in the QS rankings in 2011 do not have a geography department or programme (e.g. Harvard, MIT, Stanford, Chicago, Yale, Princeton, Columbia; note the link automatically redirects to the 2012 results for some reason), and other rankings barely correspond to much more thorough assessments, such as the UK research assessment exercise for UK departments (very few geographers would rank Oxford University as having the best department in the UK, let alone the world).  Such nonsense casts doubt on all the results.

The measures do not simply capture performance but also reputation as judged by academics.  The latter is highly subjective, based on opinion (often little informed by experience or on-the-ground knowledge of the relative performance of universities in other national systems), skewed by a range of factors such as the size of the alumni base, resources and heritage (past reputation as opposed to present, or simply name recognition), and inflected by the wider reputation of the country.  The sample of academics who return scores is also skewed towards certain countries.

Because the measures give weight to data such as citations and research income, they favour universities that are technical and scientific in focus, and work against those with large social science or humanities faculties (whose outputs, such as books, are not captured by citation counts, and which require less research funding to do high-quality research).  They also favour universities that have large endowments and are well resourced.  The citation scores are highly skewed towards English-language institutions.

The rankings take no account of the varying national roles or systems of universities, but look at more global measures.  Universities in these systems are working towards different ends and are in no way failing by not having the same kind of profile as a large, research-orientated university.

None of the rankings standardise by resourcing, so there is no attempt to see who is performing the best with respect to inputs; they simply look at the scale and reputation of outputs and equate these to quality and performance.  This conflation raises some serious questions concerning the ecological fallacy of the studies.
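To make the point concrete, here is a minimal sketch in Python of the difference between ranking on raw outputs and ranking on outputs standardised by inputs.  The institutions and figures are entirely hypothetical:

```python
# A minimal, entirely hypothetical illustration of why standardising
# outputs by inputs matters. The institutions and figures are invented.
universities = {
    # name: (research income in EUR millions, weighted publication output)
    "University A": (500, 1000),  # large, well-resourced institution
    "University B": (100, 400),   # mid-sized institution
    "University C": (25, 150),    # small institution
}

# Ranking on raw output alone simply rewards scale ...
by_output = sorted(universities, key=lambda u: universities[u][1], reverse=True)

# ... whereas standardising output by input (output per EUR million of income)
# asks who is performing best with respect to the resources they receive.
by_efficiency = sorted(
    universities,
    key=lambda u: universities[u][1] / universities[u][0],
    reverse=True,
)

print("Raw output ranking:", by_output)      # A, B, C
print("Input-standardised:", by_efficiency)  # C, B, A
```

In this invented example the two orderings are exact reversals of one another, yet the existing rankings only ever report the first.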

These failings favour certain kinds of institutions in certain places, with the top 100 universities in the three main rankings all dominated by US/UK institutions, particularly those that are science and technology orientated.  There is clearly an Anglo-Saxon, English-language bias at work, hence the new EU ranking.  Very few people who work in academia believe that the UK has so many more good universities than Germany, France, the Netherlands, Sweden, Japan, Australia, and so on.  Yet only a handful of universities from these countries appear in the top 100, and hardly any at all in the top 50.

Time will tell whether the U-Multirank system provides a more valid and robust ranking of universities.  The full final report on its feasibility suggests a wider vision and methodology, and some concerted attempts to address the issues associated with the other rankings.  One thing is certain: rankings will not disappear, flawed as each of them is, because they serve a useful media and political function.  However, they should be viewed with very healthy scepticism, mindful of the criticisms noted above.

Rob Kitchin

For an interesting set of blog posts and links to media stories re. university rankings see these collections at Global Higher Ed and Ninth Level Ireland.


Part of my radio interview on Morning Ireland earlier today about being awarded an ERC Advanced Investigator award (research grants of up to €2.5m in value) focused on the perceived under-performance of Irish academics and higher education institutions in receiving such awards.  It came up again when I was filmed for a slot on the Six One news.  So, is it the case that Irish universities are under-performing when it comes to EU research funding?

It is certainly true that, compared to other countries, Ireland does not do well in securing ERC funding.  The ERC has been funding projects since 2007 and to date Irish academics have received 22 Starting awards and 8 Advanced awards (of the latter, 2 went to UCD, 1 to NUIM and 5 to TCD [1 of which was poached to the UK]; our population is 4.6m). I’m going to concentrate on the Advanced awards data here.  By far the largest number of awards have gone to the UK (334), followed by Germany (201) and France (178).  In fourth place is Switzerland with 126 awards, a country with a population of 7.9m people.  Other countries that do relatively well based on their population size are Israel (pop. of 7.8m, 66 awards), Sweden (pop. of 9.4m, 56 awards) and Norway (pop. of 5m, 21 awards).  Countries that don’t do so well in relative terms include Spain (pop. of 47m, 69 awards) and Poland (pop. of 38m, 3 awards).
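Standardising the awards by population makes the relative differences plain.  Here is a back-of-the-envelope sketch in Python using only the figures cited above (an illustration, not official ERC statistics):

```python
# Advanced awards per million people, using only the figures cited above.
countries = {
    # name: (Advanced awards, population in millions)
    "Switzerland": (126, 7.9),
    "Israel": (66, 7.8),
    "Sweden": (56, 9.4),
    "Norway": (21, 5.0),
    "Ireland": (8, 4.6),
    "Spain": (69, 47.0),
    "Poland": (3, 38.0),
}

# Sort by awards per capita, highest first.
ranked = sorted(countries.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)

for name, (awards, pop) in ranked:
    print(f"{name:12s} {awards / pop:5.1f} awards per million people")
```

On this crude measure Ireland, at roughly 1.7 awards per million people, sits far closer to Spain (c. 1.5) than to Switzerland (c. 16).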

So why doesn’t Ireland achieve its fair share of these awards?  I think it’s a result of perception and structural issues.

In terms of perception, these awards are seen as involving a massive amount of effort, being very difficult to obtain, with assessment criteria set at a very high level.  It is the case that completing the application is highly time-intensive – the word count of the documents I submitted comes to c.18,000 words (about a quarter of a book).  The scientific argument developed within the application has to be of a standard that would be accepted for publication in the highest-ranked academic journals, and it is reviewed by 8-10 international peers.  To put together a compelling application takes 3-6 months of concentrated effort.  It is also the case that the awards are difficult to obtain.  With respect to the social sciences and humanities there are only six panels to apply to for academics in 30+ disciplines.  In my panel – SH3 Environment and Population – there were 6 awards from applicants across the 39 eligible countries in this funding round (in previous years it was either 3 or 4 awards), and whilst the success rate as a whole across all panels is 13%, it is 13% of those who thought they stood a chance and applied.  It is also the case that the bar is very high.  The application seeks evidence of esteem indicators such as honorary doctorates, major awards/prizes, keynote talks at international conferences, the number of books translated, and a high citation rate.  The process seems to declare: ‘if you’re not in the top 5% in your discipline, don’t bother.’  And with such a low success rate, it seems difficult to justify the time and resources needed to apply, especially when trying to fit the application in around existing duties in a system under resource pressure due to austerity measures.

Nonetheless, it is undoubtedly the case that Irish universities are home to a large number of talented scholars who fit the eligibility criteria and have the potential to secure these awards.  Indeed, despite rhetoric in the media about the weaknesses of Irish universities internationally, we actually have seven very good institutions with high levels of talent.  Sometimes I think we look down the wrong end of the telescope with respect to HE in Ireland.  Yes, we do not have any universities ranked in the top 100 in the world.  However, there are over 9,000 higher education institutions globally and all 7 Irish universities are in the top 450, meaning that all of them are in the top 5%.  If we were to standardise by resourcing, staff-student ratios and so on, we would be placed even higher.

Beyond perception, what is holding some of our most talented academics back from applying for ERC and other awards are structural issues – finding the time and space to put together applications.  Irish academics, by and large, have high teaching loads and staff-student ratios by international standards.  Academic departments also tend to be small, meaning that senior, more experienced staff carry significant administrative responsibilities.  Moreover, it is only over the past 15 years, since the PRTLI programme and SFI funding, that research institutes have developed, and capacity is still being built at a time when domestic resources are being cut.  In this context, finding 3-6 clear months to put together an application is incredibly difficult.

It seems to me that if we want to increase our success rate we need to do four things.

1) Find out what other countries are doing and learn from them.

2) Proactively go through a process of identifying which academics have the profile and track record required to obtain the awards.

3) Encourage and facilitate those identified academics to apply by creating the time and space needed to put together compelling applications.  My initial thought is a sabbatical scheme that buys a candidate out of certain duties on the condition that an application is submitted.  A six-month buyout would cost about €20-25k, yet if the candidate is successful in securing €2.5m this is leveraged back a hundred-fold.  This seems like a decent investment to me, even on a success rate of 13% (and the rate would be higher than that because of the targeting of suitable candidates and strong, polished applications); see the rough expected-value sketch after this list.

4) The Irish state has to invest in basic research across the sciences, social sciences and humanities, and not just applied research (which is where nearly all funding is now targeted).  A crucial element for applications is a strong track record in basic research.  If we do not enable individuals to build such a track record across their careers then we are ensuring that there will not be any eligible candidates in the future.
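As a rough check on the arithmetic in point 3, here is a minimal expected-value sketch in Python using the figures above (the buyout cost is the upper estimate given; this is an illustration, not a costing):

```python
# Back-of-the-envelope expected return on a six-month buyout.
buyout_cost = 25_000      # upper estimate of the buyout cost, in euro
award_value = 2_500_000   # maximum value of an ERC Advanced award, in euro
success_rate = 0.13       # overall ERC success rate; targeted candidates
                          # with polished applications should do better

expected_return = success_rate * award_value
print(f"Expected return:   EUR {expected_return:,.0f}")             # 325,000
print(f"Expected leverage: {expected_return / buyout_cost:.0f}x")   # 13x
print(f"Leverage if won:   {award_value / buyout_cost:.0f}x")       # 100x
```

Even at the baseline 13% success rate the expected return is about thirteen times the cost of the buyout; a successful award repays it a hundred-fold.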

Ireland does have the talent to secure these awards, and we have strong institutions in which they can be hosted.  However, we do need to change the perceptions of some potential candidates, and we need to remove the structural barriers to application.  If we do not do this then we will continue to under-perform in securing our relative share of awards.

Rob Kitchin

This morning AIRO released a new interactive mapping module that maps the catchments of all the universities and six of the IoTs using Irish Times school feeder data for 2009-2011.  The module is available here and the talk from the HEA conference in the Aviva Stadium is here – HEA Talk 2012 2.pptx.  The maps below show the 7 universities and 6 selected IoTs.  What the maps show is that no one institution has a truly national catchment, with the majority of students coming from the immediate regional area.  The dots are schools.  On the interactive version, clicking on a school will provide information on how it feeds into the HE system.  The powerpoint also gives some basic information on future demographic demand – it is clear that the HEIs are going to come under huge pressure as the present 0-14 age group works its way through the education system (demand is set to increase by c.30% over the next two decades).

[Maps: catchments of the 7 universities and 6 selected IoTs]

Rob Kitchin, Eoghan McCarthy and Justin Gleeson

On Tuesday the Irish Times carried an opinion piece by Paul Mooney entitled ‘Inside Third Level‘.  I sent the IT the following response, but it has not been carried, so I’m putting it on the record here.

For someone who has worked in the university sector and has been President of the National College of Ireland, Paul Mooney’s level of ignorance as to what lecturers and professors do, and as to the purpose of higher education, is quite remarkable.  What is even more striking is that his opinion piece in the Irish Times (Inside Third Level) lacks the rigour of analysis that one would expect from an academic.  Assertion, anecdote and the partial cherry-picking of data do not constitute evidence-informed analysis.

Where is the data and its systematic analysis to underpin the conclusions drawn?  Where are the international comparisons that would set Ireland in the context of other higher education systems?  Where is the standardisation against staff/student ratios and funding that should be part of such comparisons?  Where is the wider contextualisation and reference to the myriad reports on the higher education sector over the past decade?

Put simply, this is not good science or analysis; it is jingoistic playing to the gallery and would make a very good example for students of how to produce a selective story.

Let’s get one thing absolutely straight – academic working hours are not contact teaching hours, nor should they be.  The job of lecturers is divided into three main tasks: 1) teaching, 2) administration, 3) research.

Contact hours are only one element of teaching.  The other elements are lecture preparation, marking, and meeting students to discuss their work and assignments.  All three are time-intensive.  For example, meeting students one-on-one to give advice and critically appraise their work takes time, especially if you are teaching very large introductory classes.  Direct classroom hours are deliberately not heavy – it is not called ‘reading for a degree’ for nothing.  Students are meant to be reading the ancillary material and undertaking their assignments.  Lecturers aid them in this reading and in their self-learning.  The pedagogy of higher education is philosophical, not sophistic, and nor should it ever be.

Administration, related to both teaching and research, is not a trivial task and takes time to do professionally.  The fact that students or the public do not see this work is neither here nor there.

Research takes time and resources.  It takes a lot of reading, a lot of experimentation, a lot of analysis, a lot of thinking, a lot of debate and a lot of writing if it is to be rigorous, systematic and valid.  It can fail.  It is not something that can be done well in a few hours.  If it could be done quickly and easily, companies wouldn’t spend vast sums of money on it, and states wouldn’t give huge contracts to consultancy companies for it.  As a supposedly seasoned researcher, Paul Mooney should know this.

The fact that Ruairí Quinn, the Department of Education and the Higher Education Authority do not have a clue whether lecturers are doing their job says far more about the Minister and those bodies than it does about the higher education sector.

There are four good pieces of evidence that the sector is in fact doing its job.  First, it is producing courses that are very highly scored by external examiners, the vast majority of whom come from outside Ireland.  Second, it produces graduates whom the country holds up when it seeks to entice FDI to Ireland and who compete very well in the international labour market.  Third, we perform very well in attracting EU research monies and in publishing research in international refereed outlets.  Fourth, there is no huge groundswell of complaints from our main customers – students.  In fact, Paul Mooney produces not one bit of data that even hints that the sector is not doing its job, beyond saying that contact hours align with international norms (Ireland is no exception in this regard), and a lot of bluster and conjecture.

Neither does he provide one jot of data to support the statement: “the percentage of third-level lecturers that have the ability to produce economic or socially useful research is limited.”  Furthermore, such a statement betrays an assumption that third-level research should be instrumental in nature.  Here, worth and value get hopelessly conflated.  Third-level research serves diverse constituencies and purposes, and so it should.  Newman’s ‘The Idea of a University’ has lost none of its relevance.

He also seems to be under the illusion that there are no key performance and management indicators operating in the sector.  In fact, all the third-level institutions compile such data on all their activities.  All teaching is externally validated.  All publications and research funding applications are judged by peers.  Promotion is tied to academic performance.  Departments are subject to external quality reviews.  Any project with research funding is subject to a range of audits, including daily timesheets for many HEA and EU projects.  All staff undertake PMDS.  Regular reports are prepared for the HEA.

Academics working long, productive hours is not the exception, it is the norm.  And the three-month summer break in the university sector is a myth; holiday entitlements in Maynooth are 20 days plus 9 (Christmas and Easter).

The first rule of publishing in academia is to get your facts straight and to produce an evidence-informed analysis.  If his opinion piece were a student essay, I’d give it an ‘F’.

Rob Kitchin