APPENDIX
Academic Ranking of World Universities (ARWU), also known as the Shanghai Ranking, is an annual publication of university rankings by Shanghai Ranking Consultancy.[1] The league table was originally compiled and issued by Shanghai Jiaotong University in 2003, making it the first global ranking with multifarious indicators,[2] after which a board of international advisers was established to provide suggestions.[3][4] The publication currently includes the world's overall and subject league tables, alongside the independent regional Greater China Ranking and Macedonian HEIs Ranking. ARWU is regarded as one of the most influential and widely observed university measures, alongside the QS World University Rankings and the Times Higher Education World University Rankings.[5][6][7][8] It is praised for its objective methodology but draws some condemnation for focusing narrowly on raw research power, undermining the humanities and the quality of instruction.[5][7][9]
Global rankings
Overall
Methodology
Criterion | Indicator | Code | Weighting | Source |
---|---|---|---|---|
Quality of education | Alumni of an institution winning Nobel Prizes and Fields Medals | Alumni | 10% | Nobel Foundation, International Mathematical Union |
Quality of faculty | Staff of an institution winning Nobel Prizes and Fields Medals | Award | 20% | Nobel Foundation, International Mathematical Union |
Quality of faculty | Highly cited researchers in 21 broad subject categories | HiCi | 20% | Thomson Reuters |
Research output | Papers published in Nature and Science* | N&S | 20% | Web of Science |
Research output | Papers indexed in Science Citation Index-Expanded and Social Science Citation Index | PUB | 20% | Web of Science |
Per capita performance | Per capita academic performance of an institution | PCP | 10% | The above indicators and the number of academic staff |
* For institutions specialized in humanities and social sciences, N&S is not considered and its weight is relocated to the other indicators.
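In outline, each indicator is scaled so that the top-performing institution receives a score of 100, and the overall score is the weighted sum of the scaled indicators. A minimal sketch of that arithmetic, using hypothetical indicator scores for one institution and the weights from the table above:

```python
# Hypothetical indicator scores for one institution, each already scaled
# so that the best-performing university on that indicator scores 100.
indicator_scores = {
    "Alumni": 45.0,  # alumni winning Nobel Prizes / Fields Medals
    "Award": 30.0,   # staff winning Nobel Prizes / Fields Medals
    "HiCi": 55.0,    # highly cited researchers
    "N&S": 50.0,     # papers published in Nature and Science
    "PUB": 70.0,     # papers indexed in SCIE and SSCI
    "PCP": 40.0,     # per capita academic performance
}

# Weights from the methodology table above.
weights = {
    "Alumni": 0.10,
    "Award": 0.20,
    "HiCi": 0.20,
    "N&S": 0.20,
    "PUB": 0.20,
    "PCP": 0.10,
}

# Composite score: weighted sum of the scaled indicator scores.
total = sum(indicator_scores[code] * weights[code] for code in weights)
print(f"Overall ARWU-style score: {total:.1f}")
```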
Reception
ARWU is praised by several media outlets and institutions for its methodology and influence. A survey on higher education published by The Economist in 2005 described ARWU as "the most widely used annual ranking of the world's research universities."[11] In 2010, The Chronicle of Higher Education called ARWU "the best-known and most influential global ranking of universities".[12] EU Research Headlines reported on ARWU's work on 31 December 2003: "The universities were carefully evaluated using several indicators of research performance."[13] Chris Patten, Chancellor of the University of Oxford, said "the methodology looks fairly solid ... it looks like a pretty good stab at a fair comparison", while Ian Chubb, former Vice-Chancellor of the Australian National University, said "The SJTU rankings were reported quickly and widely around the world… (and they) offer an important comparative view of research performance and reputation."[14] Philip G. Altbach named ARWU's 'consistency, clarity of purpose, and transparency' as significant strengths.[15] Although ARWU originated in China, the ranking has been praised for not being biased towards Asian institutions.[16]
Criticism
Like other rankings, ARWU has drawn criticism. It is condemned for "relying too much on award factors", thus undermining the importance of the quality of instruction and the humanities.[5][7][17][18] A 2007 paper published in the journal Scientometrics found that the results of the Shanghai rankings could not be reproduced from raw data using the method described by Liu and Cheng.[19] A 2013 paper in the same journal finally showed how the Shanghai ranking results could be reproduced.[20] In a report from April 2009, J-C. Billaut, D. Bouyssou and Ph. Vincke analysed how the ARWU works, using their insights as specialists in Multiple Criteria Decision Making (MCDM). Their main conclusions are that the criteria used are not relevant, that the aggregation methodology has a number of major problems, and that insufficient attention has been paid to fundamental choices of criteria.[21] The ARWU researchers themselves, N. C. Liu and Y. Cheng, think that the quality of universities cannot be precisely measured by mere numbers and that any ranking must be controversial. They suggest that university and college rankings should be used with caution and that their methodologies must be understood clearly before reporting or using the results. ARWU has also been criticised by the European Commission as well as some EU member states for "favour[ing] Anglo-Saxon higher education institutions". For instance, ARWU is repeatedly criticised in France, where it triggers an annual controversy focused on how poorly it fits the French academic system.[22][23]
Results
Alternative
As it may take a long time for rising universities to produce Nobel laureates and Fields Medalists in numbers comparable to those of older institutions, the Institute created alternative rankings that exclude such award factors, providing another way to compare academic performance. The weighting of all the other factors remains unchanged, giving a grand total of 70%, as sketched below.
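A minimal sketch of that arithmetic, assuming the same weights as in the overall methodology table:

```python
# Weights from the overall methodology table (award-based indicators marked).
weights = {
    "Alumni": 0.10,  # award factor (excluded in the alternative ranking)
    "Award": 0.20,   # award factor (excluded in the alternative ranking)
    "HiCi": 0.20,
    "N&S": 0.20,
    "PUB": 0.20,
    "PCP": 0.10,
}

# Drop the award factors; all remaining weights stay unchanged.
alt_weights = {code: w for code, w in weights.items() if code not in ("Alumni", "Award")}

# 0.20 + 0.20 + 0.20 + 0.10 = 0.70, i.e. the grand total of 70%.
print(round(sum(alt_weights.values()), 2))  # 0.7
```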
Subject
There are two categories in ARWU's disciplinary rankings: broad subject fields and specific subjects. The methodology is similar to that adopted for the overall table, including award factors, paper citations, and the number of highly cited scholars.
Broad fields[27] | Specific subjects[28] |
---|---|
Natural sciences and mathematics | Mathematics |
Computer science and engineering | Physics |
Life and agricultural sciences | Chemistry |
Clinical medicine and pharmacy | Computer science |
Social sciences | Economics and business |
Regional rankings
In view of the development of higher education in specific regions, two independent regional league tables with different methodologies were launched.
Greater China
Methodology
The Greater China ranking scores institutions on weighted indicators grouped under four criteria: Education, Research, Faculty, and Resources.
Results
Institution | 2011 | 2012 | 2013 | 2014 |
---|---|---|---|---|
Tsinghua University | 1 | 1 | 1 | 1 |
National Tsing Hua University | 4 | 3 | 3 | 2 |
National Taiwan University | 1 | 2 | 2 | 3 |
The Hong Kong University of Science & Technology | 5 | 4 | 7 | 4 |
Peking University | 7 | 7 | 5 | 5 |
The University of Hong Kong | 3 | 6 | 4 | 6 |
The Chinese University of Hong Kong | 6 | 5 | 6 | 7 |
University of Science & Technology of China | 9 | 11 | 10 | 8 |
National Chiao Tung University | 8 | 8 | 8 | 9 |
Zhejiang University | 10 | 9 | 9 | 10 |
Former Yugoslav Republic of Macedonia
Methodology
The Macedonian HEIs ranking scores institutions on weighted indicators grouped under three criteria: Teaching and learning, Research, and Social service.