Vox pop: the problem with university rankings

Princeton University’s Nassau Hall. Three Australian architecture schools fared better than Princeton in the 2015 QS World University Rankings. Image: Wikipedia Commons

Survey-driven university rankings are unreliable indicators of education quality and should be treated with caution, says Sandra Kaji-O’Grady.

The news that 11 Australian universities made it into the top 100 in the QS World University Rankings for 2015 for architecture, including my own, was quickly seized upon by our marketing teams and the sorts of websites visited by prospective international students, such as www.hotcourses.abroad.com. Three Australian architecture programs ranked more highly than prestigious institutions such as Cornell University, Stanford University and Princeton University. I am sure I was not alone in finding this implausible. Princeton has a staff-student ratio of 1:6, a professoriate that includes Beatriz Colomina, Jean-Louis Cohen, Elizabeth Diller and Hal Foster working alongside rising stars such as Liam Young (a UQ grad), and wads of philanthropic income. It is hard to believe its architecture program is inferior to those of Australian universities that, frankly, depend on the unpaid overtime of dedicated academics and practitioners to make up for compressed and crowded classes, and whose professors, unlike those at Princeton, have not shaped the discipline. It may seem churlish to quibble about any news that puts Australian architectural education in a positive light, but global rankings such as these must be treated with caution. Here is why.

For the architecture subject area, the opinion of polled academics accounts for 70% of the score. Opinion polls are, by definition, subjective, and the fact that the QS index relies on reputational indicators to this degree has attracted considerable scepticism and accounts for its wild variability over the years. The QS survey asks academics to list up to 10 domestic and 30 international institutions they consider excellent for research in each of five faculty areas they are familiar with. It is most unlikely that any individual has a thorough and up-to-date understanding of the inner workings, recent achievements and current curriculum of up to 200 programs (by comparison, Thomson Reuters undertakes a similar survey but asks questions only about the one discipline area in which respondents have expertise). This is the first time the QS World University Rankings have reported on architecture as a separate subject, which suggests that the number of respondents answering for architecture has only just reached a statistical threshold. It is a low one, though. Those who list architecture as one of their five areas of expertise number just over 800, of whom we might assume a fifth to a quarter (perhaps 160 to 200 people worldwide) have architecture as their main discipline. Any academic who understands statistics should steer well clear of gloating or bemoaning their international standing based on the opinions of so few.

Furthermore, within that small pool of respondents, our presence in these rankings is boosted by the disproportionate representation of Australian academics among those solicited for their opinions. Oceania as a whole accounts for just 2.1% of researchers according to UNESCO data from 2007 (when last measured), yet Australians make up 4% of the total respondents to QS’s survey of “reputation” at the university level. There is no data on the proportion of Australians surveyed on architecture specifically, but the results suggest it may be considerable. Moreover, as the world’s third largest provider of export education for the past two decades, Australia’s older universities, which have graduated large numbers of international students, benefit from a considerable and growing cohort of alumni. By contrast, a newer architecture program like Monash University’s, with its extraordinarily successful track record of one-to-one building projects and a faultless accreditation report from representatives of the AIA and ARB, sits at 41, well below others in Australia where studio contact hours have been so severely pruned and student numbers so engorged that even field trips are out of the question.

But the component of the QS World University Rankings that mines hard data is also problematic. The remaining 30% of the architecture score comes from citation metrics drawn from the Scopus database, including the h-index, a measure that combines how many papers a school’s academics publish with how often those papers are cited by other scholars. Very few architecture journals are included in the Scopus database on which these metrics are calculated, and certainly no creative works. The effect of this measure is that the University of Sydney, with its cadre of architectural scientists, scored 93.2 out of 100, whereas RMIT, with its strengths in design research, scored 76.9. In Excellence in Research for Australia (ERA), the national peer assessment of research quality and a careful process that recognises research by design, RMIT is ranked equal first in architecture alongside the University of Melbourne, while the University of Sydney lags well behind.

Reputation and research in architectural science do not mirror the student experience. Not surprisingly, the majority of responses to the announcement on the ArchitectureAU Facebook page were scornful and disbelieving. It is not, of course, that Australian architecture schools are inferior; given the circumstances, every one of them punches well above its weight. It is the case, though, that inflated claims based on spurious reputational surveys rightly draw the ire of students who experience the real effects of Federal Government defunding in a national context where philanthropy does not make up the gap. It is also the case that different programs suit the learning styles and ambitions of different students. Countries with mature programs in architectural education are marked not so much by their standing in the various ranking systems as by the distinctiveness of their programs. Go to the University of Sheffield or Auburn University (which runs Rural Studio) and you will get an education steeped in social responsibility. At RMIT, MIT, the University of Michigan and ETH Zurich, students learn the most advanced skills in digital fabrication. At Cal Poly and Monash University, the experience of making one-to-one structures is a central part of the curriculum. The profession needs this diversity and should be wary of ranking systems that assess on criteria as vague as “reputation.”

