Given the endless debate about university prestige, reputation, exclusivity and so on, you might assume we actually know how capable students are when they graduate. In fact, we do not. Universities set and mark their own exams, and there is no firm national standard, let alone an international one, for deciding who deserves a first or a particular grade point average.
We do know that some universities have higher entry grades than others. We also know that graduates from certain types of universities are favoured by employers. But there is no objective way to know how talented final-year students really are, and therefore no precise method to measure what, if anything, they have learned at university.
This could all be about to change, however. In the first half of 2012, the Organisation for Economic Co-operation and Development piloted internationally comparable skills tests on 23,000 students in 17 countries, from the US to Egypt, and is now considering rolling them out properly.
The Assessment of Higher Education Learning Outcomes (Ahelo) project involved students taking computerised exams in “generic skills” and in either economics or engineering, for those who had studied these subjects. They were scored using a mixture of qualitative and quantitative marking.
At present, the global league tables that have become hugely influential are based mainly on research power. The OECD claims that current ranking criteria are “narrow” and create a “distorted vision of educational success and fail to capture the essential elements of an education: teaching and learning”.
The OECD already runs global standardised tests in reading, mathematics and science for 15-year-olds – the Programme for International Student Assessment (Pisa). Sixty-four countries participated in the 2012 round, the results of which have been used to create an international league table comparing school systems.
If it were possible to run Pisa-like tests for higher education, who knows what shocks might be in store for supposedly elite institutions and lauded national university systems.
This kind of project has been attempted before at a national level, with unsettling results. The Collegiate Learning Assessment (CLA) tests students’ abilities to “think critically, reason analytically, solve problems and communicate clearly and cogently” across institutions in the US.
The data collected were used as the basis for a book, Academically Adrift: Limited Learning on College Campuses (2011), by Richard Arum, a sociology professor at New York University, and Josipa Roksa, an assistant professor of sociology at the University of Virginia. Their conclusions were damning: 36 per cent of students showed no improvement in the CLA skills over four years of higher education.
Even if culturally neutral questions can be devised, there is still the danger that Ahelo will pressure universities to teach to the test, reducing the diversity of the global curriculum.
A worldwide Ahelo exam would be deeply “homogenising”, said Alison Wolf, director of the International Centre for University Policy Research at King’s College London.
Partly for this reason, Professor Wolf predicted, universities would refuse to sign up to the Ahelo system because they would be on a “hiding to nothing”.
At best, the results would confirm that an institution teaches complex skills – something already assumed by academics, students and the government. At worst, Ahelo could show that a university or an entire higher education system was actually teaching students far less than anyone thought.
Most OECD members do not want a ranking, either by country or institution. Universities might be given some sort of comparative information without making the full results public.