Oxford University’s research performance
– publication and citation metrics 1975-2004
Bruce G Charlton and Peter Andras*
Schools of Biology and Psychology and *Computer Sciences
Newcastle University, NE1 7RU, UK
bruce.charlton (at) ncl.ac.uk
What follows is a compilation of four articles published in Oxford Magazine during 2006.
***
Oxford Magazine 2006: 252: 5-6
Replacing the RAE with outcome metrics:
Quantity of published research is predictive of quality
Bruce G Charlton and Peter Andras.
We have recently analyzed thirty years of UK university publications (excluding Wales) recorded in the ISI Web of Science (WoS) database, using computerized name-searching.
One striking finding has been that simply counting the number of papers published per year by each university correlates very closely with other measures of research performance, such as the number of citations, and also with position in the Shanghai Jiao Tong University Academic Ranking of World Universities (http://ed.sjtu.edu.cn/ranking.htm).
This finding suggests that simply measuring the annual number of publications may be a suitable metric to monitor university research performance, and to replace the Research Assessment Exercise (RAE).
Rank by number of WoS publications / rank by number of citations for the 5 years 2000-2004 (SJT UK rank in parentheses)
1/ 1 (1) Cambridge
2/ 2 (2) Oxford
3/ 4 (3) Imperial College, London
4/ 3 (4) UCL
5/ 6 (6) Manchester
6/ 5 (5) Edinburgh
7/ 7 (7) Bristol
8/ 8 (11) Birmingham
9/ 9 (12-15) Glasgow
10/ 10 (9) Kings College, London
Annual publications can be seen to correlate closely with the number of citations these publications have received, and somewhat less closely with the SJT position in the lower ranks. But the simple quantity of publications is strikingly predictive of quality – especially among the best universities. Consistent with these UK data, Harvard University (SJT number 1 in the world) is also the most productive and most cited university we have measured – generating about the same annual number of publications as Cambridge and Oxford combined.
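For readers who wish to test this kind of correspondence for themselves, the short sketch below computes a Spearman rank correlation between publication counts and citation counts for a handful of universities. The counts are illustrative placeholders rather than our WoS figures, and the code is merely one way of performing the calculation.

```python
# Minimal sketch: rank-correlate publication counts with citation counts
# for a set of universities. The numbers below are illustrative placeholders,
# not the WoS figures reported in the text.
from scipy.stats import spearmanr

publications = {   # hypothetical 5-year publication counts
    "Cambridge": 30000, "Oxford": 29400, "Imperial": 20400,
    "UCL": 20600, "Manchester": 18000, "Edinburgh": 17000,
}
citations = {      # hypothetical citation counts for the same period
    "Cambridge": 400000, "Oxford": 380000, "Imperial": 230000,
    "UCL": 250000, "Manchester": 180000, "Edinburgh": 200000,
}

names = sorted(publications)
rho, p = spearmanr([publications[n] for n in names],
                   [citations[n] for n in names])
print(f"Spearman rank correlation: {rho:.2f} (p = {p:.3f})")
```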
Positions of the top seven universities are particularly consistent, but below this level the SJT exhibits increasing levels of statistical ‘noise’ – probably due to small-number effects. The SJT ranking sets out to measure excellence in (mainly) scientific research; it depends on a weighted scale including Nobel Prizes/Fields Medals (10% for alumni and 20% for staff), number of ISI highly-cited researchers (20%), number of articles published in Nature and Science (20%), articles in Web of Science (20% – this is probably the same measure as our number of publications), and an adjustment for the size of institution. Only a few elite universities have numerous Nobel/Fields/highly-cited researchers; and when there are few individuals in these categories, just one more or less can make a significant difference to the SJT rank.
Research metrics are dominated by the natural sciences. For instance, Oxford publishes about five times as many papers in the natural sciences as in either the social sciences or the arts and humanities. The citation differential is even greater: science publications attract more than thirteen times as many citations as social science publications, and more than two hundred times as many as arts and humanities publications. Indeed, the top ten rank order for UK universities based on publications in the Science Citation Index (natural sciences only) is identical to that of WoS as a whole.
But comparisons of numbers of publications within the specific categories of natural science, social sciences, and arts and humanities seem to have at least face validity.
Ranking of universities by number of publications in Social Science Index 2000-2004
- Cambridge
- Oxford
- UCL
- Manchester
- Birmingham
- Kings, London
- Nottingham
- Sheffield
- LSE
- Bristol
Ranking of universities by number of publications in Arts and Humanities Citation Index 2000-2004
- Oxford
- Cambridge
- Durham
- Edinburgh
- Leeds
- UCL
- Manchester
- Exeter
- Glasgow
- Birmingham
The idea of a relationship between number of publications and research excellence has been subjected to much criticism and ridicule over the past few decades, probably because there are many individual and departmental exceptions to the equation of ‘quantity equals quality’. Nonetheless, and however surprising or unpalatable this may seem, our 30-year WoS analysis strongly suggests that when data are aggregated to the level of whole universities, the number of publications is highly predictive of research excellence.
Measuring annual publication rates may therefore be exactly the simple, cheap, sensitive, objective and transparent research performance measure which would be most useful for comparative evaluation of university research. Such a system is, like any simple measure, inevitably incomplete and imperfect. But, unlike the current UK Research Assessment Exercise, publication-counting has the great advantage that its limitations are obvious to anyone; while the method is much less prone to subjective interest, personal bias, and gamesmanship.
***
Oxford Magazine 2006; 254: 19-20
Oxford University's Research Output in the UK context
– Thirty-year analysis of publications and citations
Bruce G Charlton and Peter Andras
Oxford and Cambridge have long had the reputation of being the best research universities in the UK. Our analysis of 30 years of UK university publications and citations in the ISI Web of Science (WoS) confirms that this reputation is justified.
Methods
The ISI Web of Science database 1975-2004 (comprising the combined databases of the Science Citation Index (Sci Cit – natural sciences), the Social Science Citation Index (Soc Sci), and the Arts and Humanities Citation Index (A&H)) was used to calculate the number of papers published by members of each specific university per calendar year, and the number of citations that each year of papers from a specific university had accumulated since publication. The data are presented in cumulative 5-year segments in order to reduce the effects of short-term fluctuations.
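As an illustration of the aggregation step, the sketch below accumulates annual publication counts into the 5-year segments used throughout this analysis. It assumes the annual counts have already been extracted from WoS by name-searching, and the numbers shown are placeholders.

```python
# Sketch of the binning step: accumulate annual publication counts into
# the 5-year segments (1975-9, 1980-4, ..., 2000-4) used in the tables.
# The annual counts here are hypothetical; in practice they would come
# from the WoS name searches described above.
from collections import defaultdict

annual_counts = {1975: 2100, 1976: 2200, 1977: 2150, 1978: 2300, 1979: 2400,
                 1980: 2500, 1981: 2550}   # ... and so on to 2004

five_year_totals = defaultdict(int)
for year, count in annual_counts.items():
    start = 1975 + 5 * ((year - 1975) // 5)      # start year of the segment
    five_year_totals[f"{start}-{start + 4}"] += count

print(dict(five_year_totals))   # e.g. {'1975-1979': 11150, '1980-1984': 5050}
```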
47 universities were included in the analysis because they had existed under their current names over the whole period. These included all of the pre-1992 universities except for the University of Wales constituent colleges, where serial name changes (especially in Cardiff) rendered the computer searching excessively inaccurate. St George’s Medical School, London (which is not attached to a university) was also excluded; the other London medical schools appeared indirectly in the analysis for those parts of the thirty years when they were merged with the various London University colleges.
Three variables were generated for each university:
1. Publications – the number of publications recorded in WoS
2. Citations – the number of citations recorded in WoS
3. Quality – the average number of citations per publication, i.e. Citations/Publications
Publications and Citations were also measured for the individual Sci Cit, Soc Sci and A&H indices (the Quality measure was omitted because the small number of average citations for A&H publications introduced too much statistical noise into the data).
Market Share
The number of publications per year increased substantially throughout the thirty year period. In order to control for this expansion and to compare universities, publications and citations were also expressed as each university's 'Market Share' (MS) of the total publications and citations for the 47 universities included.
We suggest that a university’s market share for publications, and especially for citations, is a measure of the impact each university has on the system of UK scientific research.
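The sketch below illustrates how these derived quantities – Quality and Market Share – follow from the raw publication and citation counts. The three universities and their counts are hypothetical placeholders standing in for the full set of 47.

```python
# Sketch of the derived variables plus Market Share (MS) for one 5-year
# period. All counts are hypothetical placeholders.
pubs = {"Oxford": 29000, "Cambridge": 30000, "Manchester": 18000}    # ... 47 universities in total
cites = {"Oxford": 380000, "Cambridge": 400000, "Manchester": 180000}

total_pubs = sum(pubs.values())
total_cites = sum(cites.values())

for u in pubs:
    quality = cites[u] / pubs[u]                 # average citations per publication
    ms_pubs = 100 * pubs[u] / total_pubs         # % market share of publications
    ms_cites = 100 * cites[u] / total_cites      # % market share of citations
    print(f"{u}: quality {quality:.1f}, MS pubs {ms_pubs:.1f}%, MS cites {ms_cites:.1f}%")
```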
Results
Oxford University Rankings in Web of Science – 1975-2004, 5 year periods
1975-9 1980-4 1985-9 1990-4 1995-9 2000-4
Publications 2 2 1 2 1 2
Citations 2 2 2 2 2 2
Quality 3 5 5 4 4 3
Interpretation: Oxford and Cambridge always occupy first and second place for numbers of publications and citations, but not for the 'Quality' measure of average citations per publication. For example, in the last two 5-year periods (1995-2004) Dundee University ranks first for Quality; and in 1985-9 the University of Leicester was in fourth place, above Oxford in fifth.
Dundee and Leicester are both medium-sized universities, with citation ranks only in the teens, which have hosted large and highly influential groups in dominant sub-specialties of the medical sciences (e.g. Philip Cohen and David Lane at Dundee in biochemistry/molecular biology, Alec Jeffreys at Leicester in DNA fingerprinting). Leicester's great success apparently lasted less than a decade, and it has now declined to 10th place.
However, Dundee might represent a 'new model' for medium-sized universities: producing a moderate volume of very high quality research by focusing research activity in large specialized units. It remains to be seen whether this model is sustainable in the longer term.
Publications expressed as a percentage of the highest ranking number
1975-9 1980-4 1985-9 1990-4 1995-9 2000-4
1. Cam-100% Cam-100% Ox-100% Cam-100% Ox-100% Cam-100%
2. Ox-88% Ox-98% Cam-99% Ox-95% Cam-99% Ox-98%
3. Man-78% Man-77% Man-71% Man-61% Man-60% Imp-68%
KEY – Ox = Oxford, Cam = Cambridge, Man = Manchester, Imp = Imperial College, London
Interpretation: Oxford and Cambridge are very similar in terms of numbers of publications, and the large gap in production between Oxbridge and the third-placed university has tended to widen.
Citations expressed as a percentage of the highest ranking number
1975-9 1980-4 1985-9 1990-4 1995-9 2000-4
1. Cam-100% Cam-100% Cam-100% Cam-100% Cam-100% Cam-100%
2. Ox-79% Ox-79% Ox-86% Ox-85% Ox-92% Ox-95%
3. Ed-50% Ed-37% Ed-39% Ed-43% Imp-47% UCL-62%
KEY – Ox = Oxford, Cam = Cambridge, Ed = Edinburgh, Imp = Imperial College, London, UCL = University College, London.
Interpretation: Cambridge is the most cited university, with Oxford second, but the gap is closing. There is a very large gap between Oxbridge and the third-placed university – this may be narrowing.
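The normalisation used in the two tables above is simply each university's count expressed as a percentage of the highest count in that 5-year period; the sketch below shows the calculation with illustrative figures.

```python
# Sketch of the normalisation used in the two tables above: each university's
# count is expressed as a percentage of the highest count in that 5-year
# period. The figures are illustrative, not the actual WoS counts.
counts = {"Cambridge": 30000, "Oxford": 29400, "Imperial": 20400}

top = max(counts.values())
as_pct_of_top = {u: round(100 * c / top) for u, c in counts.items()}
print(as_pct_of_top)    # e.g. {'Cambridge': 100, 'Oxford': 98, 'Imperial': 68}
```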
Legend to Graph 1.
Science Citation Index - Thirty Year trends in Market Share (MS) of Publications and Citations (vertical axis) against Date (horizontal axis). Lighter line is % MS citations, darker line is % MS of publications.
Interpretation: Oxford has held its large market share of both publications and citations, with a slight increase in publication MS.
Legend to Graph 2.
Social Science Citation Index - Thirty Year trends in Market Share (MS) of Publications and Citations (vertical axis) against Date (horizontal axis). Lighter line is % MS citations, darker line is % MS of publications.
Interpretation: Oxford's market share of Social Science publications has declined slightly, and its MS of citations has declined substantially.
Legend to Graph 3.
Arts and Humanities Citation Index - Thirty Year trends in Market Share (MS) of Publications and Citations (vertical axis) against Date (horizontal axis). Lighter line is % MS citations, darker line is % MS of publications.
Interpretation: Oxford has the largest market share of UK universities in both publications and citations in the A&H, confirming Oxford's long term domination of this type of research in the UK. A&H market share rose to a peak in the period around 1987-92, but has since declined sharply.
Relative output of publications in the Sci Cit, Soc Sci and A&H indices – Sci Cit is set to 100%, and Soc Sci and A&H are expressed as percentages of the number of science publications in each 5-year period.
1975-9 1980-4 1985-9 1990-4 1995-9 2000-4
Soc Sci 35% 33% 26% 23% 21% 20%
A&H 24% 38% 39% 34% 29% 23%
Interpretation: Over this thirty-year period, the number of science publications listed in the Science Citation Index rose c. 3.4-fold – and therefore absolute numbers of publications in all categories increased significantly.
However, the percentage of publications listed in Soc Sci declined progressively; while the proportion of A&H publications rose until about 1990 but has since declined sharply (confirming the trend seen in the market share analysis).
Over thirty years the proportion of Sci Cit listed publications rose from about 41% to 57%, a shift towards science of around 5% per decade.
Conclusion
Overall, this analysis suggests that the pre-eminence of Oxford (and Cambridge) within the UK is solidly-based upon a massive, long-term superiority in research output and influence.
Counting the number of publications and citations has significant limitations as a method for measuring research outputs. Random errors are reduced by the larger numbers attained by accumulating data by institution and into yearly or five-yearly units; however, this does not reduce systematic bias. Nonetheless, the very simplicity of these metrics means that their limitations are obvious; and our results are relatively cheap and easy to check. Furthermore, comparison of 47 universities over 30 years requires this kind of robust and quantitative measurement.
In terms of production of papers and accumulation of citations, Oxford and Cambridge are more similar to each other than to any other university, and there is a significant gap between Oxbridge and the next best. Interestingly, our 'quality' measure (average number of citations per publication) reveals a possible 'new model' of research at Dundee and Leicester Universities that has – at times – challenged and (modestly) surpassed Oxbridge in specialized fields.
The changing nature of Oxford University is apparent. Research publications have expanded across the board, but this has been most rapid in science where Oxford's market share of output has slightly increased. The market share of Arts and Humanities output reached a peak in the period around 1987-92 but has declined since, while the Social Sciences contribution has progressively declined over thirty years.
The trend of Oxford research is therefore towards a greater concentration in Science at the expense of the Social Sciences and Arts & Humanities. In this sense, Oxford is getting more like Cambridge. Cambridge remains the top UK science university, but the gap is small and narrowing.
***
Oxford Magazine 2006: 255: 16-17
Oxbridge versus the ‘Ivy League’: 30 year citation trends
Bruce G Charlton and Peter Andras
It is frequently argued that Oxford and Cambridge have for some years been declining in research performance relative to the American ‘Ivy League’. However, from a study of citations over 30 years we made the surprising discovery that this common belief is probably false and, on the contrary, Oxbridge scientific research is improving relative to the best US universities.
Methods and Results
We studied citations in Web of Science for Oxford, Cambridge, the eight true Ivy League universities (Harvard, Yale, Princeton, Columbia, Cornell, University of Pennsylvania, Dartmouth and Brown), plus three other major private US research universities (Stanford, Duke and Chicago). The major technical universities and top public research universities were omitted from this presentation for the sake of clarity, but they do not contradict these findings.
Citations are presented for five-year periods from 1975-2004 inclusive. Harvard was always the most-cited university in this sample (and almost certainly in the whole world), and each university's citations are expressed as a percentage of Harvard's. The number of Harvard citations almost doubled between 1975-9 and 1995-9.
Table 1
Ranking of Oxbridge relative to 11 'Ivy League' universities
75-9 80-4 85-9 90-4 95-9 2000-4
Cambridge Ranking: 7 9 9 8 5 4
Oxford Ranking: 11 11 11 9 9 6
Interpretation: The past twenty years show both Cambridge and Oxford climbing the citation rankings relative to the US ‘Ivy League’.
Table 2
Citations accumulated for each university over 5 year periods from 1975-2004 – expressed as percentage of Harvard citations.
Key: OX=Oxford, CAM=Cambridge, Harv=Harvard, Prin=Princeton, Col=Columbia, Corn=Cornell, Penn=University of Pennsylvania, Dart=Dartmouth, Bro=Brown, Stan=Stanford, Chic=Chicago; Yale and Duke appear in full.
1975-9 1980-4 1985-9 1990-4 1995-9 2000-4
Harv 100 Harv 100 Harv 100 Harv 100 Harv 100 Harv 100
Stan 53 Stan 48 Stan 48 Stan 41 Stan 40 Stan 48
Corn 43 Yale 40 Yale 38 Yale 33 Yale 33 Penn 39
Yale 40 Corn 39 Corn 38 Corn 33 Penn 31 CAM 36
Penn 36 Penn 30 Penn 31 Penn 29 CAM 29 Yale 35
Col 32 Col 28 Col 29 Col 29 Col 28 OX 33
CAM 30 Chic 27 Duke 25 Duke 27 Corn 27 Col 33
Chic 29 CAM 26 Chic 25 CAM 26 Duke 27 Corn 30
Duke 23 Duke 24 CAM 24 OX 21 OX 25 Duke 29
OX 22 OX 19 OX 19 Chic 20 Chic 20 Chic 23
Prin 19 Prin 14 Prin 16 Prin 13 Prin 13 Prin 18
Bro 8 Bro 9 Bro 9 Bro 8 Bro 8 Bro 9
Dart 5 Dart 5 Dart 5 Dart 5 Dart 5 Dart 5
Interpretation: Harvard is by far the outstanding research university in this group in terms of citations, and Stanford is second – the gap between them has remained fairly constant. Over the past twenty years Cambridge and Oxford have both improved in rank relative to the ‘Ivy League’ and as a percentage of Harvard’s citations; and Oxford has narrowed the gap with Cambridge.
Papers in science attract a much higher average number of citations than papers in the social sciences – and the arts and humanities attract very few citations. Furthermore, total citations are dominated by specific scientific fields – currently the biomedical sciences contribute the bulk of citations. Nonetheless, this differential citation impact reflects the pre-eminent status, funding and competitive nature of biomedical science, so it does not invalidate the use of citations as a measure of performance and prestige.
We have previously shown (Oxford Magazine 2006; 252: 5-6) that UK university rankings based on annual number of citations correlate closely with the Shanghai Jiao Tong University (SJTU) rankings of the top 500 world universities (http://ed.sjtu.edu.cn/rank/2006/ranking2006.htm). The SJTU ranking sets out to measure research excellence, mainly in scientific fields (including economics); and it depends on a weighted scale including Nobel Prizes/Fields Medals (10% for alumni and 20% for staff), number of ISI highly-cited researchers (20%), number of articles published in Nature and Science (20%), number of published articles listed in Web of Science (20%), and a 10% adjustment for the size of institution.
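Written out as a formula, the composite score is a weighted sum of the indicators just listed. The indicator names below are our own shorthand, and the convention of scaling each indicator so that the top institution scores 100 is our understanding of the SJTU method, not something we have verified independently.

```latex
% Weighted SJTU composite, using the weights quoted above. Each indicator
% is assumed to be scaled so that the top-scoring institution obtains 100.
\[
\mathrm{Score} = 0.10\,\mathrm{Alumni} + 0.20\,\mathrm{Award}
               + 0.20\,\mathrm{HiCi} + 0.20\,\mathrm{N\&S}
               + 0.20\,\mathrm{PUB} + 0.10\,\mathrm{Size}
\]
```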
Table 3
SJTU Top 500 World Universities – 2006
Rank order for this Oxbridge/ ‘Ivy League’ sample
Institution – SJTU percentage score (Harvard = 100)
- Harvard 100
- =Cambridge 73
- =Stanford 73
- Columbia 62
- Princeton 59
- Chicago 59
- Oxford 58
- Yale 56
- Cornell 54
- Penn 50
- Duke 38
- Brown 25
- Dartmouth <25
Interpretation: Despite being derived from different information sources and including a size adjustment, the SJTU rankings are broadly similar to our citation rankings, indicating a reasonable correlation between quantity of citations and the SJTU indices of high scientific quality and impact. Cambridge and Oxford are high in both rankings; however, there are some significant divergences in the ‘Ivy League’ data.
Note: The first SJTU table was published in 2003. Since 2003 Cambridge has improved its SJTU rank from 5th place to 2nd place, with the same proportion of Harvard’s score (73 percent); Oxford improved from 9th to 7th place with a slightly reduced proportion of Harvard’s score (from 60 to 58 percent). This relative Oxbridge improvement in the SJTU over just 3 years is consistent with the implications of the citation study.
Conclusions
We were surprised, and pleased, to find that Oxbridge has actually improved its relative research performance, as measured by citations, over the past twenty years – a period during which UK universities have generally been assumed (by ourselves among others) to be declining relative to the best US universities.
Of course, we have presented only a partial sample of the best US scientific research universities, leaving out some of the top research performers such as CalTech, MIT and Washington University (St Louis) among the private universities, and public universities such as Berkeley, the University of Wisconsin at Madison and the University of Washington, Seattle. However, while inclusion of these institutions changes the absolute rank order, it does not affect our main finding of Oxbridge improvement.
It may be that the false perception of Oxbridge’s declining performance derives not from comparison with the ‘Ivy League’ as such (which is, after all, an extremely diverse group of institutions) but from comparison with Harvard specifically, which remains by far the most successful research university in the world.
At any rate, and contrary to expectations, Oxbridge seems to be doing very well in scientific research relative to the USA. If this is correct, over the next few decades we would expect to see improvements in Oxbridge’s share of top scientific honours such as Nobel Prizes, and also a progressive equalization of scientific accolades (including Fellowships of the Royal Society) between Cambridge and Oxford, where Cambridge has traditionally garnered the lion’s share.
***
Oxford Magazine 2006; 256: 25-6
Best in the arts, catching-up in science – what is the best future for Oxford?
Bruce G Charlton and Peter Andras
We are engaged in an analysis of thirty years’ worth of UK and US publications and citations in the ISI Web of Science (WoS) database. One striking finding has been the improving performance of Oxford University in league tables. But these findings create a potential dilemma.
Oxford is currently probably the best university in either the UK or the US in Arts and Humanities (A&H) research, and about 16th in the sciences. But A&H research production overall is declining while science is thriving. Should Oxford expand in the area of its greatest relative strength, or in the area of greatest potential growth?
***
We have searched the WoS database from 1975-2004, looking at pre-1992 UK universities (excluding the University of Wales, for technical reasons) and the major US research universities and colleges as listed by US News (www.usnews.com/usnews/edu/college/rankings/rankindex_brief.php). The annual number of publications per university was counted, together with the citations accumulated by these publications up to February 2006. The Science Citation Index (Sci Cit), Social Sciences Citation Index (Soc Sci) and Arts and Humanities Citation Index (A&H) were counted separately. To reduce statistical noise from small numbers, and for clarity, we accumulated annual numbers into five-year periods.
We have previously demonstrated (Oxford Magazine 2006; 252: 5-6) that rankings for publications are closely correlated with the leading international measure of scientific research quality, the Shanghai Jiao Tong University academic ranking of world universities 2005 (http://ed.sjtu.edu.cn/rank/2005/ARWU2005TOP500list.htm).
Table 1: Oxford University rank position for number of publications among combined UK US universities 1975-2004. Rankings in Natural Sciences (Sci Cit), Social Sciences (Soc Sci) and Arts and Humanities (A&H).
75-9 80-4 85-9 90-4 95-9 2000-4
Sci Cit: 36 35 33 25 16 16
Soc Sci 29 24 30 22 16 12
A&H 7 2 2 2 1 1
Interpretation: Over the past three decades Oxford has improved its rank relative to the US universities across the board. In Arts and Humanities, Oxford has for a decade been the top publishing university in either the UK or the USA.
Table 2: 2000-2004 UK-US A&H ranking for number of publications
- Oxford
- Cambridge
- Chicago
- Harvard
- UCLA
- Yale
- Columbia
- Durham (UK)
- Berkeley
- Pennsylvania
Interpretation: The relative strength of the UK in A&H is evident, with Oxford, Cambridge and Durham all in the UK-US top ten for publications.
However, when absolute values are considered, it can be seen that A&H publication numbers are generally in progressive decline in both the UK and the US.
Table 3: A&H publications in 2000-4 as a percentage of A&H publications in 1995-9, for the top 10 UK-US universities.
1. Oxford 94%
2. Cambridge 96%
3. Chicago 83%
4. Harvard 87%
5. UCLA 97%
6. Yale 95%
7. Columbia 94%
8. Durham (UK) 95%
9. Berkeley 80%
10. Pennsylvania 88%
Interpretation: The number of A&H publications in the top ten UK-US universities declined by 3-20 percent between 1995-9 and 2000-4.
It seems that Oxford and Cambridge’s current superiority in the rankings for A&H publications is probably generated mainly by a less rapid decline than in their immediate US rivals (Chicago and Harvard), rather than by improving performance at Oxbridge.
This decline of A&H is in striking contrast to the general continued expansion of absolute numbers of publications in both the natural and social sciences. For example, Oxford’s 2000-4 output was 117% of its 1995-9 output in Sci Cit and 110% in Soc Sci; and such expansion is the general pattern across universities in these indices.
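The comparison in Table 3 and in the figures just quoted is a simple period-over-period ratio; the sketch below reproduces the Oxford percentages using hypothetical absolute counts chosen only to give those ratios.

```python
# Sketch of the period-over-period comparison used in Table 3 and the
# paragraph above: output in 2000-4 expressed as a percentage of output
# in 1995-9. The counts are hypothetical placeholders chosen to reproduce
# the Oxford percentages quoted in the text (94%, 110%, 117%).
pubs_1995_9 = {"A&H": 2000, "Soc Sci": 3000, "Sci Cit": 15000}
pubs_2000_4 = {"A&H": 1880, "Soc Sci": 3300, "Sci Cit": 17550}

for index in pubs_1995_9:
    pct = 100 * pubs_2000_4[index] / pubs_1995_9[index]
    print(f"{index}: {pct:.0f}% of the previous 5-year period")
```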
Table 4: 2000-2004 Sci Cit ranking by number of publications, with each university’s publications presented as a percentage of Harvard’s (rounded to the nearest integer)
- Harvard – 100%
- UCLA – 61%
- U of Washington, Seattle – 60%
- U of Michigan, Ann Arbor – 56%
- U of Pennsylvania – 54%
- Johns Hopkins – 54%
- Stanford – 52%
- U of Minnesota – 48%
- Cambridge – 48%
- Berkeley – 46%
- U of Cal, San Diego – 46%
- U of Pittsburgh – 45%
- U of Wisconsin, Madison – 44%
- Columbia – 44%
- Cornell – 44%
- Oxford – 41%
- U of Florida – 41%
- Duke – 40%
- Yale – 40%
- U of Cal, Davis – 40%
Interpretation: Harvard is the pre-eminent science-publishing UK-US university. Leaving aside Harvard, the first tier (above 50% of Harvard’s production) consists of large US universities (including many public universities), especially those which specialize in biomedical science. Oxford, although improving, is in the second tier (40-50% of Harvard’s production) and still has some way to go to match the best non-Harvard US universities in terms of science production.
***
In general, Oxford has demonstrated steadily improving performance relative to other UK and US universities over the past 30 years as measured by WoS publications. So Oxford is successful and trends are positive and encouraging.
When it comes to future strategy, it seems that the rankings are somewhat deceptive unless also viewed in the context of absolute numbers. Although Oxford can take legitimate pride in being quite probably the world’s best university for research in the Arts and Humanities, more productive even than the top US universities, this top ranking may in fact also be indicative of a weakness.
The A&H domain appears to be in long-term decline, with dwindling production at almost all the best universities. However, A&H production has not declined at Oxford and Cambridge as rapidly as at their US rivals Harvard and Chicago. This could indicate that Oxbridge is not adapting to new circumstances as rapidly as the US universities, and is failing to redirect research effort in line with research trends.
Our interpretation of these data is that the best long-term strategy for Oxford would probably be to aim at continuing its improved performance as a science research university, rather than building upon its already achieved prominence as the best arts university. If accepted, a science-orientated strategy could influence priorities for internal organization and restructuring. For example, postgraduate science teaching might more easily be integrated with research priorities than undergraduate or arts teaching.
Put bluntly, we suggest that Oxford may thrive better as an improving second-tier university in the expanding field of science than it would as the pre-eminent university in the declining field of arts and humanities.
bruce.charlton@ncl.ac.uk