Highly Cited Researchers 2014 and 2015:
An investigation of some of the world’s most influential scientific minds on the
institutional and country level
Lutz Bornmann*, Johann Bauer**, Elisabeth Maria Schlagberger**
*Corresponding author
Division for Science and Innovation Studies
Administrative Headquarters of the Max Planck Society
Hofgartenstr. 8,
80539 Munich, Germany.
Email: [email protected]
**Max Planck Institute of Biochemistry
Am Klopferspitz 18,
82152 Martinsried, Germany.
Email: [email protected]
Email: [email protected]
Abstract
For several years, Clarivate Analytics, formerly the IP & Science business of Thomson
Reuters, has published a list of researchers who have published the largest number of highly
cited papers in a particular discipline (see www.highlycited.com). These researchers are
described by Clarivate Analytics as highly-cited researchers (HCRs) or as some of the world's
most influential scientific minds. Whereas the HCR 2014 list refers to the publication years
2002 to 2012, the HCR 2015 list which was published a few months ago refers to the years
2003 to 2013. The new list contains a total of 3126 researchers worldwide. Compared with the
HCR 2014 list, which includes 3215 researchers, the new list thus contains 89 fewer
researchers. This study undertakes a comparison of HCR 2014 and 2015 on the institution and
country levels. As the results on the country level show, around half of the HCR 2015
researchers work in the USA and a further ~10% in Great Britain; about 6% work in Germany
and about 5% in China. In comparison with HCR 2014, no country has lost as large a share of
HCRs as the USA (-3.4%). Especially large increases in HCRs occurred in Australia (1.3%),
Saudi Arabia (0.7%), Germany (0.5%) and Great Britain (0.5%). The evaluation of the
institutional addresses shows that, based on the primary addresses, the largest number of
HCRs is employed at the University of California, System (n=161), followed by Harvard
University (n=95). However, a comparison of HCR 2015 with HCR 2014 shows that both
institutions are also affected by a large decrease in HCRs (both in absolute numbers and in
percentage shares).
Keywords
Thomson Reuters; Clarivate Analytics; Highly-cited researchers; Scientific excellence; Web
of Science
1 Introduction
There are basically two different possibilities for the assessment of research (which
can also be combined with one another) (Wilsdon et al., 2015): the oldest form is the peer
review procedure, which has been used since about the 17th century. Since about the 1980s,
scientometric indicators - especially bibliometric indicators - have been applied to research
evaluation (Roemer & Borchardt, 2015). The use of these indicators can be attributed above
all to the development of research from an academic to a post-academic science (Ziman,
2000). In post-academic science, research assessment is "an integral part of any scientific
activity. It is an ongoing process aimed at improving the quality of scientific/scholarly
research. It includes evaluation of research quality and measurements of research inputs,
outputs, and impacts, and embraces both qualitative and quantitative methodologies, including
the application of bibliometric indicators and peer review” (Moed & Halevi, 2015). As an
indicator in research evaluation, it is citations, above all, which play an important role, since
they are regarded as a proxy for research quality: they may serve "as a record of research
influence and performance, as indicated by Merton’s description of their function as devices
of peer recognition” (Panchal, 2012, p. 1144).
For several years, Clarivate Analytics, formerly the IP & Science business of Thomson
Reuters, has published a list of researchers who have published the largest number of highly
cited papers in a particular discipline (see www.highlycited.com). These researchers are
described by Clarivate Analytics as highly-cited researchers (HCRs) or as some of the world's
most influential scientific minds (Thomson Reuters, 2016). The list of HCRs is especially
interesting for research evaluation, since research evaluation focuses chiefly on excellence
and research is performed by individual people (Parker, Allesina, & Lortie,
2013). Since Clarivate Analytics also publishes the addresses of the HCRs, the HCRs can be
evaluated on the institution and country level.
Bornmann and Bauer (2015b) have presented a first evaluation in which the HCR
2014 are evaluated on the basis of their affiliations. This initial study was followed by an
institutional analysis of HCR 2014 in Germany (Bornmann & Bauer, 2015a) and a gender
analysis of HCR 2014 (Bornmann, Bauer, & Haunschild, 2015). Diko (2015) analysed the
HCR database with the focus on South Africa. The most comprehensive analysis so far is that
of Li (2016), who not only performed frequency counts for institutions and countries, but also
presents the results of efficiency analyses. This included relating the number of HCRs for a
particular institution with the total number of its researchers. With regard to this analysis by
Li (2016) we would like to remark critically that - as far as we know - there is no database
which provides reliable and comparable data on the number of researchers in individual
institutions across countries.
Clarivate Analytics recently published a new list with the HCR 2015. In the current
paper we would like to present the results of our evaluation of this dataset on the institution
and country level, where we also undertake a comparison between HCR 2015 and HCR 2014.
2 Methods
2.1 Method for determining the HCRs
A few years ago, Clarivate Analytics (provider of the Web of Science) began to
analyse publications (articles and reviews) in the natural and social sciences, in order to
identify the HCRs in a publication period of 11 years. For this, Clarivate Analytics
determined those publications ranking in the top 1% by citations for their subject area and
publication year (using the 22 subject categories of the Essential Science Indicators, ESI). The authors of these
highly cited publications were classified under the discipline (e.g. materials science; see
http://in-cites.com/thresholds-citation.html) associated with the journals where their highly
cited publications appeared. Within a discipline a sequence of names was produced: the more
highly cited publications there were for a researcher, the higher his or her rank in the
discipline. The list of highly cited researchers published at www.highlycited.com
includes the top-ranked researchers of each discipline; the number of researchers selected
corresponds to the square root of the population (of all researchers in a discipline with at
least one highly cited publication). In
order to eliminate those individuals who may have a significant (qualifying) number of highly
cited papers in the last year or two of the time series, but whose total citations to these papers
are very low, Clarivate Analytics used a second indicator for selection in the HCR list: The
citations to the researcher’s highly cited papers must be sufficient to rank the author in the top
1% of an ESI field (see http://hcr.stateofinnovation.thomsonreuters.com/page/methodology).
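The two-step selection described above can be sketched in Python. The researcher names, paper counts, and the citation threshold below are hypothetical stand-ins for illustration, not Clarivate Analytics' actual data or implementation:

```python
import math

def select_hcrs(researchers, citation_threshold):
    """Illustrative sketch of the two-step HCR selection.

    `researchers` maps a name to (number of highly cited papers,
    total citations to those papers); `citation_threshold` stands in
    for the top-1% citation threshold of the ESI field.
    """
    # Step 1: rank by the number of highly cited papers and keep the
    # top sqrt(population) researchers of the discipline.
    ranked = sorted(researchers.items(), key=lambda kv: kv[1][0], reverse=True)
    n_selected = round(math.sqrt(len(ranked)))
    candidates = ranked[:n_selected]
    # Step 2: the citations to the highly cited papers must also be
    # high enough (here: at least the field's citation threshold).
    return [name for name, (papers, cites) in candidates
            if cites >= citation_threshold]

demo = {"A": (12, 4800), "B": (9, 900), "C": (7, 3100),
        "D": (5, 2500), "E": (3, 2900), "F": (2, 700),
        "G": (2, 400), "H": (1, 300), "I": (1, 200)}
print(select_hcrs(demo, citation_threshold=1000))  # -> ['A', 'C']
```

In this toy population of nine researchers, the square-root rule keeps the top three by number of highly cited papers, and the citation threshold then removes researcher B, whose papers are too weakly cited.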
The great advantage of the data from Clarivate Analytics is that it is normalized with
regard to time (a publication period of 11 years), the document type (only articles and
reviews) and the discipline (the ranking order of the researchers is produced within one
discipline). The data was thus collected with a procedure which one can describe as "metric-
wise" (Rousseau & Rousseau, 2015). The data (the HCRs) can therefore be aggregated
without a problem. Thus, for example, no discipline-specific differences (such as subject-
specific citation rates) need be taken into account.
2.2 Description of the dataset for this study
Whereas the HCR 2014 list refers to the publication years 2002 to 2012, the HCR
2015 list which was published a few months ago refers to the years 2003 to 2013. The new list
contains a total of 3126 researchers worldwide. Compared with the HCR 2014 list which
contains 3215 researchers, the new list thus includes fewer researchers: it has shrunk by
89 HCRs (2.8%).
The HCRs have provided up to four (HCR 2015) or five (HCR 2014) addresses with
which they can be associated. Many evaluations in this study refer to the most important
address, the primary address; but with some evaluations all the addresses are taken into
account - counted fractionally. For example, if an HCR has provided four addresses, the
fractionated counting method counts a quarter of the HCR for each address. We selected the
fractionated counting method (and not the multiplicative counting method) to take into
account all addresses, in order to arrive at a total of 100% in the use of all addresses for all
HCRs.
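The fractionated counting method can be illustrated with a small Python sketch; the researchers and addresses are hypothetical:

```python
from collections import defaultdict

def fractional_counts(hcr_addresses):
    """Fractionated counting: an HCR with k addresses contributes 1/k
    to each address, so the counts over all addresses of all HCRs sum
    to the number of HCRs (i.e. 100%)."""
    counts = defaultdict(float)
    for addresses in hcr_addresses.values():
        weight = 1.0 / len(addresses)
        for address in addresses:
            counts[address] += weight
    return dict(counts)

demo = {
    "Researcher 1": ["Univ A"],
    "Researcher 2": ["Univ A", "Univ B", "Univ C", "Univ D"],
}
print(fractional_counts(demo))
# Univ A receives 1 + 1/4 = 1.25; the counts over all addresses sum
# to 2, the number of HCRs in this toy example.
```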
For the analyses in this study an elaborate cleaning process was necessary, since many
institutions were named by the authors not with a single name, but with several variants of
their name. In this cleaning process, we combined all naming variants of an institution - if
recognizable - into one variant. Unfortunately, a few of the institutions named by the authors
could not be used since entries were too unspecific (when, e.g., only "USA" was entered in
HCR 2014) or ambiguous (an abbreviation can stand for many institutions, even within one
country).
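The cleaning step can be illustrated with a minimal Python sketch; the mapping entries below are invented examples, not the actual cleaning table used in this study:

```python
# Naming variants are mapped to one canonical form; entries that are
# too unspecific (e.g. only a country name) are discarded.
CANONICAL = {
    "harvard univ": "Harvard University",
    "harvard university": "Harvard University",
    "univ of california berkeley": "University of California, System",
}
TOO_UNSPECIFIC = {"usa", "germany"}

def clean_institution(raw):
    """Return the canonical institution name, or None if the entry is
    too unspecific to be used."""
    key = raw.strip().lower().rstrip(".")
    if key in TOO_UNSPECIFIC:
        return None  # unusable entry, excluded from the analysis
    # Unknown but specific names are kept as given.
    return CANONICAL.get(key, raw.strip())

print(clean_institution("Harvard Univ."))  # -> Harvard University
print(clean_institution("USA"))            # -> None
```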
3 Results
The following section 3.1 indicates how the HCRs are distributed across the various
disciplines. Then the results of the evaluations on the country and institution level are
presented.
3.1 Distribution of the HCRs across subject categories
Table 1 shows the results of the analysis with respect to the disciplines associated with
the researchers (or their publications). The table shows the results for HCR 2015 and HCR
2014 (for comparison). The number of HCRs and the percentage distribution are given. The
results show that the HCR 2015 are especially often associated with the disciplines
"Clinical Medicine" (12%), "Biology & Biochemistry" (~ 7%) and "Chemistry" (~ 7%). The
last column of Table 1 shows the difference between the percent values for HCR 2014 and
2015. The difference is not calculated on the basis of the absolute numbers, but the percent
values, since the total number of HCRs changed between HCR 2014 and HCR 2015. As the
differences in the percent values make clear, the share increased the most in "Biology &
Biochemistry" (1.1%) and fell the most in “Engineering” (-0.8%).
Table 1. Distribution of HCR 2014 and 2015 across disciplines (evaluation of the primary
address, sorted in descending order by number of HCR 2015)
Discipline | Absolute number of HCR 2014 | Relative number of HCR 2014 | Absolute number of HCR 2015 | Relative number of HCR 2015 | Difference percent
Clinical Medicine 402 12.5 375 12.0 -0.5
Biology & Biochemistry 195 6.1 225 7.2 1.1
Chemistry 198 6.2 205 6.6 0.4
Molecular Biology & Genetics 201 6.3 198 6.3 0.1
Plant & Animal Science 176 5.5 172 5.5 0.0
Social Sciences, general 177 5.5 165 5.3 -0.2
Engineering 187 5.8 157 5.0 -0.8
Geosciences 159 5.0 148 4.7 -0.2
Neuroscience & Behaviour 129 4.0 148 4.7 0.7
Environment/Ecology 137 4.3 132 4.2 0.0
Materials Science 147 4.6 130 4.2 -0.4
Agricultural Sciences 112 3.5 128 4.1 0.6
Pharmacology & Toxicology 133 4.1 128 4.1 0.0
Physics 144 4.5 119 3.8 -0.7
Immunology 87 2.7 110 3.5 0.8
Psychiatry/Psychology 100 3.1 110 3.5 0.4
Computer Science 117 3.6 108 3.5 -0.2
Mathematics 99 3.1 99 3.2 0.1
Microbiology 114 3.6 99 3.2 -0.4
Space Science 106 3.3 99 3.2 -0.1
Economics & Business 95 3.0 71 2.3 -0.7
Total 3215 100 3126 100
Note. Since the percentages have been rounded to the first decimal place (in this table and the
other tables), there are small rounding differences.
3.2 Distribution of HCRs across countries
Table 2 shows the distribution of the HCRs across countries using the primary address
of the HCRs. A total of 56 countries had at least one HCR with a primary address in that
country in HCR 2014, HCR 2015, or both lists. Addresses of HCRs which contained
the entry "Taiwan" were attributed to China. We decided on this step because all the
addresses of the Taiwanese HCRs which are named in the HCR 2014 list also contain
"China". Table 2 provides the number of HCRs per country for each of HCR 2014 and HCR
2015. Since the total number of HCRs has changed between the two lists, the percent share of
HCRs of a country in the total number of HCRs in a list is also given. The percent share of the
two lists can be directly compared with one another (see above). In order to indicate a rise or
fall in the number of HCRs, Table 2 shows the difference of the percent share in the two lists.
The column "rank difference" gives the rank of the country if the countries are sorted in
descending order of the difference in percent share between the two lists. The country with
the greatest increase receives the first rank.
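The percent-share and rank-difference computation described above can be sketched in Python; the counts below are hypothetical, not taken from the tables:

```python
def share_differences(hcr_2014, hcr_2015):
    """For each country: the difference between its percent shares of
    HCRs in the two lists (shares, not raw counts, since the list
    totals differ), and its rank when countries are sorted by that
    difference (rank 1 = greatest increase)."""
    total_14 = sum(hcr_2014.values())
    total_15 = sum(hcr_2015.values())
    diffs = {c: 100 * hcr_2015.get(c, 0) / total_15
                - 100 * hcr_2014.get(c, 0) / total_14
             for c in set(hcr_2014) | set(hcr_2015)}
    # Sort countries by difference, descending, and assign ranks.
    ranking = sorted(diffs, key=diffs.get, reverse=True)
    return {c: (round(diffs[c], 1), ranking.index(c) + 1) for c in diffs}

demo_2014 = {"USA": 170, "UK": 30}  # list totals differ, so percent
demo_2015 = {"USA": 150, "UK": 31}  # shares are compared, not counts
print(share_differences(demo_2014, demo_2015))
```

In this toy example the UK's share rises even though its count barely changes, because the overall list shrank; this is exactly why the comparison is made on percent shares.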
Table 2. Distribution of HCRs across countries (evaluation of the primary address, sorted in
descending order by number of HCR in 2015)
Country | Absolute number of HCR 2014 | Relative number of HCR 2014 | Rank 2014 | Absolute number of HCR 2015 | Relative number of HCR 2015 | Rank 2015 | Difference percent | Rank difference
USA 1701 52.9 1 1548 49.5 1 -3.4 56
UK 303 9.4 2 310 9.9 2 0.5 4
Germany 164 5.1 3 176 5.6 3 0.5 3
China (Taiwan) 160 (11) 5.0 4 163 (18) 5.2 4 0.2 8
Australia 65 2.0 10 103 3.3 5 1.3 1
Canada 89 2.8 6 86 2.8 6 0.0 35
Netherlands 76 2.4 8 83 2.7 7 0.3 7
Japan 98 3.0 5 80 2.6 8 -0.5 55
France 84 2.6 7 72 2.3 9 -0.3 54
Switzerland 67 2.1 9 71 2.3 10 0.2 9
Saudi Arabia 33 1.0 13 54 1.7 11 0.7 2
Spain 43 1.3 12 52 1.7 12 0.3 5
Italy 52 1.6 11 44 1.4 13 -0.2 53
Belgium 32 1.0 14 30 1.0 14 0.0 45
Denmark 27 0.8 16 26 0.8 15 0.0 33
Sweden 28 0.9 15 25 0.8 16 -0.1 47
South Korea 21 0.7 17 24 0.8 17 0.1 11
Singapore 14 0.4 19 23 0.7 18 0.3 6
Austria 19 0.6 18 18 0.6 19 0.0 34
Finland 14 0.4 19 16 0.5 20 0.1 14
Israel 10 0.3 24 11 0.4 21 0.0 19
Ireland 12 0.4 21 10 0.3 22 -0.1 46
Iceland 11 0.3 22 10 0.3 22 0.0 36
Norway 7 0.2 27 9 0.3 24 0.1 15
Turkey 10 0.3 24 9 0.3 24 0.0 37
New Zealand 4 0.1 32 9 0.3 24 0.2 10
Greece 5 0.2 29 7 0.2 25 0.1 16
South Africa 6 0.2 24 7 0.2 25 0.0 20
Iran 11 0.3 22 6 0.2 29 -0.2 52
Portugal 2 0.1 37 5 0.2 28 0.1 12
Brazil 5 0.2 29 5 0.2 28 0.0 25
Czech 2 0.1 37 4 0.1 32 0.1 17
Poland 4 0.1 32 4 0.1 32 0.0 24
India 8 0.2 26 4 0.1 32 -0.1 50
Malaysia 3 0.1 35 3 0.1 35 0.0 29
Thailand 0 0.0 50 3 0.1 35 0.1 13
Russia 5 0.2 29 2 0.1 37 -0.1 48
Mexico 1 0.0 41 2 0.1 37 0.0 21
Romania 0 0.0 50 2 0.1 37 0.1 18
Slovakia 1 0.0 41 1 0.0 40 0.0 28
Serbia 2 0.1 37 1 0.0 40 0.0 38
Argentina 1 0.0 41 1 0.0 40 0.0 28
Uganda 0 0.0 50 1 0.0 40 0.0 22
Croatia 0 0.0 50 1 0.0 40 0.0 22
Estonia 0 0.0 50 1 0.0 40 0.0 22
Chile 1 0.0 41 1 0.0 40 0.0 28
Pakistan 0 0.0 50 1 0.0 40 0.0 22
Jordan 2 0.1 37 1 0.0 40 0.0 38
Qatar 0 0.0 50 1 0.0 40 0.0 22
Slovakia 1 0.0 41 0 0.0 50 0.0 40
Indonesia 4 0.1 32 0 0.0 50 -0.1 51
Lithuania 1 0.0 41 0 0.0 50 0.0 40
Hungary 3 0.1 35 0 0.0 50 -0.1 49
Vietnam 1 0.0 41 0 0.0 50 0.0 40
Colombia 1 0.0 41 0 0.0 50 0.0 40
UAE 1 0.0 41 0 0.0 50 0.0 40
Total 3215 100.0 3126 100.0
As the results in Table 2 for HCR 2015 show, about half of the researchers work in the
USA and a further ~10% in Great Britain; about 6% work in Germany and about 5% in China.
In order to identify the countries with the greatest increase or decrease in HCRs between
HCR 2014 and HCR 2015, the differences between the percent shares were calculated for
both lists and a rank was determined for each country from these differences. This shows
that the USA has lost the most HCRs: at -3.4%, no other country lost as large a share of
HCRs as the USA. It is followed by Japan (-0.5%), France (-0.3%), Italy (-0.2%) and
Iran (-0.2%). The winners are Australia (1.3%), Saudi Arabia (0.7%), Germany (0.5%)
and Great Britain (0.5%).
In order to check the equality or inequality of the distribution of HCRs across
countries for HCR 2014 and 2015, we calculated Gini coefficients. The Gini coefficient
takes a value between 0 (a uniform distribution: each country has the same number of
HCRs) and 1 (a maximally unequal distribution: one country has all HCRs). The
coefficients of 0.84 for HCR 2014 and 0.83 for HCR 2015 make clear that the HCRs are
distributed very unequally across countries and that this inequality of distribution has
hardly changed between the two lists.
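The Gini coefficient used here can be computed, for instance, via the mean-absolute-difference formulation; the following Python sketch uses hypothetical counts, not the actual country data:

```python
def gini(counts):
    """Gini coefficient of a distribution of counts: 0 for a perfectly
    equal distribution, values approaching 1 for a highly concentrated
    one. Mean-absolute-difference formulation."""
    n = len(counts)
    total = sum(counts)
    if n == 0 or total == 0:
        return 0.0
    mean = total / n
    # Sum of |x_i - x_j| over all ordered pairs (i, j).
    abs_diffs = sum(abs(x - y) for x in counts for y in counts)
    return abs_diffs / (2 * n * n * mean)

print(gini([10, 10, 10, 10]))  # uniform distribution -> 0.0
print(gini([40, 0, 0, 0]))     # one holder among four -> 0.75
```

Note that with a finite number of units, the maximum attainable value is (n-1)/n rather than exactly 1, which is why even complete concentration among four countries yields 0.75.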
Table 3. Distribution of HCRs across countries (evaluation of all addresses, counted
fractionally; sorted in descending order by number of HCR 2015)
Country | Absolute number of HCR 2014 | Relative number of HCR 2014 | Rank 2014 | Absolute number of HCR 2015 | Relative number of HCR 2015 | Rank 2015 | Difference percent | Rank difference
USA 1673.2 52.0 1 1535.0 49.1 1 -2.9 57
UK 300.2 9.3 2 312.8 10.0 2 0.7 3
Germany 158.7 4.9 3 173.2 5.5 3 0.6 4
China 142.6 4.4 4 167.3 5.4 4 0.9 2
Australia 68.3 2.1 10 103.0 3.3 5 1.2 1
Canada 86.2 2.7 8 85.0 2.7 6 0.0 23
Netherlands 74.0 2.3 9 80.0 2.6 7 0.3 6
Japan 96.6 3.0 5 79.7 2.5 8 -0.5 55
France 86.3 2.7 7 72.0 2.3 9 -0.4 54
Switzerland 60.8 1.9 11 66.6 2.1 10 0.2 7
Saudi Arabia 95.8 3.0 6 61.2 2.0 11 -1.0 56
Spain 40.5 1.3 13 46.7 1.5 12 0.2 8
Italy 50.2 1.6 12 44.5 1.4 13 -0.1 52
Belgium 30.2 0.9 14 29.5 0.9 14 0.0 33
Denmark 29.5 0.9 15 26.3 0.8 15 -0.1 48
Sweden 27.7 0.9 16 25.8 0.8 16 0.0 45
South Korea 21.0 0.7 17 25.5 0.8 17 0.2 10
Singapore 15.2 0.5 19 23.8 0.8 18 0.3 5
Austria 17.3 0.5 18 18.3 0.6 19 0.0 22
Finland 14.0 0.4 20 15.4 0.5 20 0.1 17
Israel 10.0 0.3 24 11.5 0.4 21 0.1 18
Turkey 10.0 0.3 24 10.5 0.3 22 0.0 24
Norway 7.0 0.2 24 10.3 0.3 23 0.1 11
Ireland 11.5 0.4 21 10.0 0.3 24 0.0 46
Iceland 11.0 0.3 22 10.0 0.3 24 0.0 41
New Zealand 4.0 0.1 32 9.5 0.3 26 0.2 9
South Africa 8.2 0.3 26 7.0 0.2 25 0.0 42
Greece 5.2 0.2 28 6.8 0.2 24 0.1 16
India 8.0 0.2 25 6.5 0.2 29 0.0 47
Iran 11.0 0.3 22 6.0 0.2 28 -0.2 53
Portugal 2.8 0.1 37 5.5 0.2 31 0.1 12
Brazil 4.5 0.1 31 5.0 0.2 32 0.0 29
Czech 2.0 0.1 38 4.0 0.1 33 0.1 15
Poland 3.5 0.1 33 4.0 0.1 33 0.0 28
Malaysia 3.0 0.1 35 3.0 0.1 35 0.0 34
Thailand 0.5 0.0 48 3.0 0.1 35 0.1 13
Romania 0.0 0.0 53 2.5 0.1 37 0.1 14
Russia 6.3 0.2 29 2.3 0.1 38 -0.1 51
Mexico 0.5 0.0 48 2.0 0.1 39 0.0 19
Pakistan 1.0 0.0 41 2.0 0.1 39 0.0 24
Jordan 1.0 0.0 41 2.0 0.1 39 0.0 24
Serbia 1.5 0.0 39 1.5 0.0 42 0.0 35
Uganda 0.0 0.0 53 1.5 0.0 42 0.0 20
Estonia 0.0 0.0 53 1.5 0.0 42 0.0 20
Chile 1.0 0.0 41 1.5 0.0 42 0.0 31
Slovakia 1.0 0.0 41 1.0 0.0 46 0.0 36
Argentina 1.5 0.0 39 1.0 0.0 46 0.0 37
Croatia 0.0 0.0 53 1.0 0.0 46 0.0 26
Qatar 0.0 0.0 53 1.0 0.0 46 0.0 26
Vietnam 1.0 0.0 41 0.5 0.0 50 0.0 38
UAE 0.3 0.0 52 0.5 0.0 50 0.0 32
Slovakia 0.5 0.0 48 0.0 0.0 52 0.0 39
Panama 0.5 0.0 48 0.0 0.0 52 0.0 39
Indonesia 3.5 0.1 33 0.0 0.0 52 -0.1 50
Lithuania 1.0 0.0 41 0.0 0.0 52 0.0 43
Hungary 3.0 0.1 35 0.0 0.0 52 -0.1 49
Colombia 1.0 0.0 41 0.0 0.0 52 0.0 43
Total 3215.0 100.0 3126.0 100.0
Table 3 shows the distribution of the HCRs across countries, if all the addresses
named by the HCR are used fractionally for the analysis. This evaluation covers a total of 58
countries. As Table 3 shows, the figures hardly change when all addresses are
taken into account instead of the primary address.
In an additional evaluation of the country data, we have related the number of HCRs to
the total number of researchers in a country. The World Bank provides data on the number of
scientists in Research and Development (R&D) per million inhabitants in recent years (see
http://data.worldbank.org). We used this to calculate the share of HCRs in the total of
scientists in a country (per million inhabitants). For HCR 2015 we took the share of scientists
for the year 2013, since the HCRs were determined from publications up to 2013. For HCR
2014, we used the corresponding data from the World Bank for the year 2012. However, the
World Bank data is not complete for all countries. For example, there is data for the USA for
the year 2012, but not for the year 2013. We therefore included in the evaluation only the 22
countries for which the complete data was available to us.
The interpretation of this evaluation should always take into account that the World
Bank data and the Clarivate Analytics data arise from different sources and are therefore
difficult to compare with one another. The World Bank data counts the people who were at a
particular location at a particular time, whereas the Clarivate Analytics data refers to
publications from a time window of 11 years. Thus, for example, it is possible that a
researcher is listed in HCR 2015 although he or she was no longer scientifically active after
2013. A researcher can have lived and researched for ten out of 11 years in one country; but
if he or she lives in another country at the time when Clarivate Analytics collects the data,
the researcher will be assigned only to this latter country as an HCR.
Table 4 provides the results of the evaluation of the HCR data which takes into
account the number of scientists in a country. The data in the table is sorted by the share of
HCRs 2015 in the total number of scientists (per million inhabitants) (see column "percent
2015”). As the results in the column clarify, Great Britain has the largest share with 7.6%,
followed by Germany (3.9%) and Italy (2.2%). The extent to which the ranking places
change when the total number of scientists is taken into account can be seen in the column
"rank change". For example, the value of -4 for Italy in HCR 2015 means that Italy loses
four places when the total number of scientists is taken into account, compared with
considering just the number of HCRs; Italy thus has fewer HCRs than one could expect from
its total number of scientists. With a value of -6, Turkey loses even more ranking places
than Italy.
Table 4 also compares the ranking places which the countries achieved with their
percent share of HCRs in the total number of researchers in HCR 2014 and 2015. A positive
value in the column "Rank comparison 2014/2015" means that the country improved its
ranking position between HCR 2014 and 2015; a negative value indicates a worsening. There
is an improvement of 3 places for Spain and of 2 places for Norway and the Czech Republic.
With a value of -4, Russia exhibits the greatest deterioration.
Since the comparison of the results on the distribution of HCRs across countries from
Table 2 (evaluation of the primary addresses) and Table 3 (evaluation of all addresses)
indicated no great difference, we did not undertake the additional evaluation for Table 4 with
all addresses.
Table 4. Distribution of HCRs across countries taking into account the number of researchers in Research and Development per million inhabitants (evaluation
of primary address), sorted in descending order by the share of HCRs 2015 in the total number of researchers in a country.
Country | Absolute number of researchers 2012 | Absolute number of HCR 2014 | Rank number 2014 | Relative number of HCR 2014 | Rank percent 2014 | Rank change 2014 | Absolute number of researchers 2013 | Absolute number of HCR 2015 | Rank number 2015 | Relative number of HCR 2015 | Rank percent 2015 | Rank change 2015 | Rank comparison 2014/2015
UK 4029.3 303 1 7.5 1 0 4055.1 310 1 7.6 1 0 0
Germany 4379.1 164 2 3.7 2 0 4472.2 176 2 3.9 2 0 0
Italy 1853.0 52 6 2.8 3 -3 1973.8 44 7 2.2 3 -4 0
Spain 2718.4 43 7 1.6 7 0 2652.6 52 6 2.0 4 -2 3
Netherlands 4246.9 76 5 1.8 6 1 4302.7 83 3 1.9 5 2 1
France 4075.8 84 4 2.1 4 0 4153.5 72 5 1.7 6 1 -2
Japan 5083.7 98 3 1.9 5 2 5201.3 80 4 1.5 7 3 -2
Turkey 1097.2 10 14 0.9 8 -6 1168.6 9 14 0.8 8 -6 0
Belgium 3954.4 32 8 0.8 9 1 4003.3 28 8 0.7 9 1 0
Sweden 5163.7 24 9 0.5 10 1 6472.6 25 10 0.4 10 0 0
Austria 4655.2 19 12 0.4 11 -1 4704.0 18 12 0.4 11 -1 0
South Korea 6361.6 21 11 0.3 13 2 6456.6 24 11 0.4 12 1 1
Denmark 7310.5 25 10 0.4 12 2 7264.6 26 9 0.4 13 4 -1
Greece 2232.3 5 16 0.2 15 -1 2628.2 7 16 0.3 14 -2 1
Finland 7460.1 14 13 0.2 16 3 7187.9 16 13 0.2 15 2 1
Poland 1735.3 4 18 0.2 14 -4 1850.7 4 18 0.2 16 -2 -2
Norway 5547.8 7 15 0.1 19 4 5575.5 9 14 0.2 17 3 2
Czech 3150.0 2 19 0.1 20 1 3249.9 4 18 0.1 18 0 2
Portugal 4041.7 2 19 0.0 21 2 4141.7 5 17 0.1 19 2 2
Serbia 1313.9 2 19 0.2 18 -1 1380.9 1 21 0.1 20 -1 -2
Russia 3093.6 5 16 0.2 17 1 3073.1 2 20 0.1 21 1 -4
Slovakia 2819.9 1 22 0.0 22 0 2717.6 1 21 0.0 22 1 0
Total 86323.3 999
88686.3 998
3.3 Distribution of HCRs across institutions
In an additional step we have analysed how many HCRs work in research institutions
worldwide. For this evaluation we have not only combined all address variants for an
institution, but also all individual institutes of an organization - as far as can be recognized.
Thus, for example, we have combined all Max Planck Institutes to the Max Planck Society,
all institutes of the Helmholtz Society to the Helmholtz Society, and all state
universities in California to the University of California, System. In the following, the results of
HCR 2014 and 2015 are compared with each other.
In the two lists, many HCRs have provided not just one, but up to four (HCR 2015) or
five (HCR 2014) different institute addresses (see above). For this reason, we have produced
two ranking lists. The first ranking list of the institutions (see Table 5) is based on the first
address assigned to an HCR (the primary address). The second ranking of the institutions (see
Table 6) is based on all addresses mentioned by a researcher - counted fractionally (see
above).
According to the primary addresses, the most HCRs work at the University of
California, System (n=161). This is followed by Harvard University (n=95). Since the total
number of HCRs changed between HCR 2014 and HCR 2015, the institutional comparison in
Table 5 between the two lists is performed with the difference of the percent values. Thus,
for example, the University of California, System, is marked by a decrease of 18 HCRs
between the two lists. This corresponds to a difference of 0.4% in the percent values.
Among the 20 institutions in Table 5 which employ the most HCRs, 11 have a lower
number in HCR 2015 than in HCR 2014. The greatest decrease can be seen at the University
of California, System (-0.4%), the Chinese Academy of Sciences (-0.3%) and Harvard
University (-0.3%). The greatest increase is at King Abdulaziz University (0.4%), Stanford
University (0.2%) and Washington University (0.2%). But for King Abdulaziz University one
must take into account that – according to a report by Bhattacharjee (2011) in Science –
universities in Saudi Arabia deliberately conclude contracts with HCRs in order to improve
their performance in international rankings.
Table 5. Distribution of HCRs across institutions (evaluation of the primary address, sorted in
descending order by number of HCR 2015)
Institution | Absolute number of HCR 2014 | Relative number of HCR 2014 | Absolute number of HCR 2015 | Relative number of HCR 2015 | Difference percent
University of California, System 179 5.6 161 5.2 -0.4
Harvard University 107 3.3 95 3.0 -0.3
National Institutes of Health (NIH) 91 2.8 88 2.8 0.0
Stanford University 56 1.7 61 2.0 0.2
Max Planck Society 52 1.6 43 1.4 -0.2
The University of Texas 43 1.3 40 1.3 -0.1
Chinese Academy of Science 46 1.4 35 1.1 -0.3
Duke University 32 1.0 32 1.0 0.0
University of Oxford 33 1.0 31 1.0 0.0
Brigham & Women’s Hospital 26 0.8 24 0.9 0.1
Massachusetts Institute of Technology (MIT) 32 1.0 24 0.9 -0.1
Northwestern University 24 0.9 24 0.9 0.0
King Abdulaziz University 15 0.5 26 0.8 0.4
Broad Institute 24 0.9 25 0.8 -0.1
Princeton University 25 0.8 25 0.8 0.0
University of Michigan 31 1.0 25 0.8 -0.2
Washington University (in St. Louis) 20 0.6 25 0.8 0.2
European Molecular Biology Laboratory (EMBL, UK, Germany) 24 0.9 24 0.8 -0.1
University of Cambridge, UK 23 0.7 24 0.8 0.1
University of Washington 25 0.8 23 0.7 -0.1
The second ranking list of the institutions (see Table 6) is based on the fractionated
counting of all addresses of the HCRs. Compared with Table 5, the ranking order of the
institutions hardly changes in the higher positions; but now King Abdulaziz University
appears in fifth place. Apparently, many HCRs from the USA, Australia, China and Europe
give King Abdulaziz University as an additional institution alongside their primary
institution. However, the comparison with the figures from HCR 2014 also
shows a great reduction in the number of HCRs who mentioned the King Abdulaziz
University alongside their primary institution. Among the 20 institutions in Table 6, 13 have a
lower number in HCR 2015 than in HCR 2014.
Table 6. Distribution of HCRs across institutions (evaluation of all addresses, counted
fractionally; sorted in descending order by number of HCR 2015)
Institution | Absolute number of HCR 2014 | Relative number of HCR 2014 | Absolute number of HCR 2015 | Relative number of HCR 2015 | Difference percent
University of California, System 178.0 5.5 159.0 5.1 -0.4
Harvard University 110.5 3.4 92.5 3.0 -0.5
National Institutes of Health (NIH) 93.0 2.9 87.5 2.8 -0.1
Stanford University 55.5 1.7 60.7 1.9 0.2
King Abdulaziz University 80.3 2.5 41.8 1.3 -1.2
Max Planck Society 49.5 1.5 40.3 1.3 -0.2
The University of Texas 39.5 1.2 39.0 1.2 0.0
Chinese Academy of Science 41.3 1.3 35.3 1.1 -0.2
University of Oxford 32.1 1.0 31.0 1.0 0.0
Massachusetts Institute of Technology (MIT) 33.1 1.0 30.1 1.0 -0.1
Duke University 29.5 0.9 29.5 0.9 0.0
Northwestern University 26.5 0.8 26.7 0.9 0.0
University of Michigan 30.8 1.0 25.0 0.8 -0.2
University of Washington 29.0 0.9 24.5 0.8 -0.1
Washington University (in St. Louis) 19.5 0.6 24.0 0.8 0.2
Princeton University 27.3 0.9 23.8 0.8 -0.1
University of Cambridge, UK 23.7 0.7 23.8 0.8 0.0
Wellcome Trust Sanger Institute 31.3 1.0 22.8 0.7 -0.2
Brigham & Women’s Hospital 20.5 0.6 22.3 0.7 0.1
Mayo Clinic 21.5 0.7 22.0 0.7 0.0
The evaluations in Table 5 and Table 6 have shown that a large proportion of the institutions
exhibit a decrease in HCRs between the two lists. Since the study by Gazni and Thelwall
(2016) also showed that the 100 best institutions worldwide – measured on the basis of the
Leiden Ranking (http://www.leidenranking.com) – cooperate less and less with each other,
but increasingly with institutions positioned lower in the Leiden Ranking, one could suppose
a trend in science towards a more equal distribution of high performers.
Table 7. Gini coefficients for the distribution of HCR 2014 and 2015 across the institutions.
The evaluations are based on the primary addresses and refer in each case to a different
number of institutions with the most HCRs.
Number of institutions with the most HCRs HCR 2014 HCR 2015
<=200 0.485 0.461
<=100 0.418 0.394
<=50 0.36 0.394
<=20 0.324 0.329
We have tested this hypothesis on the HCRs with the help of the Gini coefficient. The
Gini coefficient takes a value between 0 (i.e. a uniform distribution: each institution has
the same number of HCRs) and 1 (i.e. a maximally unequal distribution: one institution
employs all HCRs). Table 7 lists the Gini coefficients for HCR 2014 and 2015 for varying
numbers of institutions with the most HCRs (<=200 … <=20). A different number of
institutions was used for the analyses in order to check whether the differences between the
years are independent of the number of institutions covered.
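The Gini coefficient described above can be sketched with the standard rank-based formula. The `gini` helper below is a hypothetical illustration for per-institution HCR counts, not the software used in this study.

```python
def gini(counts):
    """Gini coefficient for a list of per-institution HCR counts.

    Returns 0 for a uniform distribution (every institution has the
    same number of HCRs) and approaches 1 when a single institution
    employs all HCRs.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    # Rank-weighted cumulative sum over the sorted counts.
    rank_sum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * rank_sum) / (n * total) - (n + 1) / n
```

For example, four institutions with ten HCRs each yield a coefficient of 0, while concentrating all forty HCRs in one institution yields 0.75 (the maximum for n institutions is (n - 1)/n).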
As the results in Table 7 reveal, the distribution of the HCRs is, as expected, more equal
the more strongly the focus is restricted to the best institutions: in both lists, the Gini
coefficients become smaller the fewer institutions are taken into account. When we compare
the coefficients between HCR 2014 and 2015, however, no definite trend is visible. If up to
100 or 200 institutions are taken into account, the coefficient for HCR 2015 is lower than for
HCR 2014, which would confirm our hypothesis of a more equal distribution. However, the
evaluation of up to 20 or 50 institutions shows the opposite result: here the coefficient for
HCR 2015 is higher than that for HCR 2014.
Thus the comparison of the two lists cannot definitively confirm that high performers are
becoming less concentrated in particular institutions. A convincing statement on the equal
or unequal distribution of the HCRs would require not only comparing more than two lists,
but also using lists whose underlying publication years are further apart in time.
4 Discussion
A few months ago, Clarivate Analytics published a new list of the HCRs – the world’s
most influential scientific minds. This list can be assigned a special position in research
evaluation since today's science operates in an "economy of reputation" (Whitley, 1984) in
which the reputational status or "symbolic capital" of researchers is mainly determined by
publication performance. As the publication of lists similar to the HCR by Ioannidis,
Klavans, and Boyack (2016) and Boyack, Klavans, Sorensen, and Ioannidis (2013) reveals,
the identification of outstanding researchers (using bibliometrics) is a complex task, but one
that attracts wide interest.
Whereas the HCR 2014 list refers to the publication years 2002 to 2012, the new list
refers to the years 2003 to 2013. This paper presents the results of analyses which have
evaluated this data mainly on country and institution level. In most of the evaluations, a
comparison is undertaken between HCR 2015 and HCR 2014. The new Clarivate Analytics
list contains a total of 3126 researchers worldwide. Compared with the old list, which
contains 3215 researchers, the new list thus includes 89 fewer researchers as HCRs.
As shown by the results for the countries in which the HCR 2015 work, around half
the researchers work in the USA; a further ~10% in Great Britain. About 6% and 5% of the
researchers are in Germany and China, respectively. In comparison with HCR 2014, no
country has lost as many HCRs as the USA (3.4%). Especially large increases in HCRs
occurred in Australia (1.3%), Saudi Arabia (0.7%), Germany (0.5%) and Great Britain
(0.5%). The evaluation of the institutional addresses shows that – based on the primary
addresses – the largest number of HCRs 2015 is employed at the University of California
System (n=161), followed by Harvard University (n=95). However, both institutions also
show a large decrease in HCRs compared with HCR 2014.
Seen as a whole, the results on the HCRs at institute and country level should be
interpreted with caution. The numbers of cases are generally very low and therefore subject
to wide variations across the HCR lists. Since there are, so far, only two lists (HCR 2014 and
2015), one cannot say anything about trends. Whether the number of highly cited papers of a
researcher lies above the threshold for being counted as an HCR in a discipline also depends
on random variation (such as random variation in the number of citations; see Williams &
Bornmann, in press).
In the following, we discuss in more detail three problems with the identification of
HCRs by Clarivate Analytics, each of which may be resolved or reduced by a change in
methodology:
One problem with the method for identifying HCRs is that it disadvantages
researchers who work in interdisciplinary areas. Since the HCRs are determined within a
discipline, researchers who publish very many papers in one discipline, rather than
distributing their papers across different disciplines, have an advantage. As the literature
review by Rijcke, Wouters, Rushforth, Franssen, and Hammarfelt (2015) points out,
disciplinary assessments in the UK Research Assessment Exercise were found to have a
negative influence on the undertaking of interdisciplinary research. Clarivate Analytics
might therefore consider also publishing a cross-discipline evaluation in the new list of the
HCRs. This could list the researchers who have published the most highly cited papers
across all disciplines. Since publication and citation cultures differ between disciplines, the
number of papers must be normalized. This normalization could perhaps be performed by
counting, within each discipline, a researcher's share of the highly cited papers (among all
researchers with at least one highly cited paper) instead of the absolute number of highly
cited papers. One would then calculate the sum of these percentage values for a researcher
across all disciplines and rank the researchers by the sums.
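The proposed normalization could be sketched as follows. The function name and data layout are our own assumptions, used purely for illustration of the summed-shares idea.

```python
def cross_discipline_score(papers, discipline_totals):
    """Sum of a researcher's per-discipline shares of highly cited papers.

    papers: the researcher's highly cited papers per discipline,
            e.g. {"Chemistry": 4, "Materials Science": 1}.
    discipline_totals: the total number of highly cited papers in each
            discipline (across all researchers with at least one).
    """
    # Each count becomes a share of its discipline's highly cited
    # papers; the shares are then summed across disciplines.
    return sum(n / discipline_totals[d] for d, n in papers.items())
```

Researchers could then be ranked by these sums regardless of how their papers are spread across disciplines.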
An additional problem in the determination of the HCRs is that it favours
researchers who have already been carrying out research in a discipline for a long time
(Pepe & Kurtz, 2012). Young researchers hardly have a chance to accumulate enough
highly cited papers for a high position in the ranking. In order to measure the number of
highly cited papers independently of scientific age, Clarivate Analytics could divide the
number of papers by the years that have passed since the publication of a researcher's first
paper. In this way, the number of highly cited papers would be normalized by the academic
age of the HCR. However, one should not use the years that have passed since the
publication of the first highly cited paper: a researcher may have been publishing for many
years before he or she publishes the first highly cited paper.
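This normalization is simple arithmetic; a minimal sketch follows, assuming an inclusive counting convention for academic age (that convention is our assumption, not part of the Clarivate methodology).

```python
def age_normalized_count(n_highly_cited, first_pub_year, current_year):
    """Highly cited papers per year of academic age.

    Academic age is counted from the researcher's first publication of
    any kind (NOT from the first highly cited paper), inclusively.
    """
    academic_age = current_year - first_pub_year + 1
    return n_highly_cited / academic_age
```

A researcher who first published in 2004 and has 15 highly cited papers by 2013 would thus score 1.5 highly cited papers per year.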
A final difficulty relates to the selection of the HCRs among the researchers publishing
highly cited papers. HCR 2014 and 2015 list, for each discipline, the top-ranked researchers,
with the number selected equal to the square root of the population (i.e. all researchers in the
discipline with at least one highly cited publication). Although the use of the square root to
identify the elite of a population goes back to Price (1971), it is more or less arbitrary; the
selection of the HCRs could also be performed differently. For example, the h index could be
used: all researchers in a discipline would be sorted by the number of their highly cited
papers, the h index would be determined for this list, and all researchers with more than h
highly cited papers would be presented on www.highlycited.com. In order to avoid an
arbitrary selection of the HCRs by specifying a particular formula, Clarivate Analytics could
also refrain from making a selection altogether and present all researchers with at least one
highly cited paper.
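The h-index-based alternative could be sketched as below; the function and data layout are hypothetical and serve only to make the selection rule concrete.

```python
def select_by_h_index(counts):
    """Select researchers via the h index of a discipline's list.

    counts: dict mapping researcher -> number of highly cited papers
    in the discipline. h is the largest number such that h researchers
    have at least h highly cited papers each; every researcher with
    MORE than h highly cited papers is selected.
    """
    ranked = sorted(counts.values(), reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return {r for r, c in counts.items() if c > h}
```

With six researchers having 10, 8, 5, 4, 3 and 1 highly cited papers, h = 4 (four researchers have at least four papers each), so the three researchers with 10, 8 and 5 papers would be selected.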
In conclusion, we note that the list of HCRs is based on only one indicator (citation
impact). However, research excellence can generally not be determined with a single
indicator alone; it should be measured along multiple dimensions. Ultimately, peers should
decide in an informed peer review process, on the basis of an indicator report, whose
research performance should be seen as excellent (Derrick & Pavone, 2013).
Acknowledgements
We would like to thank David Pendlebury from Clarivate Analytics, formerly the IP &
Science business of Thomson Reuters, for his support in cleaning the institutional addresses
of the HCR 2014 and 2015 as well as discussing an earlier version of this paper.
References
Bhattacharjee, Y. (2011). Saudi universities offer cash in exchange for academic prestige.
Science, 334(6061), 1344-1345. doi: 10.1126/science.334.6061.1344.
Bornmann, L., & Bauer, J. (2015a). Evaluation of the highly-cited researchers' database for a
country: proposals for meaningful analyses on the example of Germany.
Scientometrics, 105(3), 1997-2003. doi: 10.1007/s11192-015-1619-1.
Bornmann, L., & Bauer, J. (2015b). Which of the world's institutions employ the most highly
cited researchers? An analysis of the data from highlycited.com. Journal of the
Association for Information Science and Technology, 66(10), 2146-2148.
Bornmann, L., Bauer, J., & Haunschild, R. (2015). Distribution of women and men among
highly cited scientists. Journal of the Association for Information Science and
Technology, 66(12), 2715-2716. doi: 10.1002/asi.23583.
Boyack, K. W., Klavans, R., Sorensen, A. A., & Ioannidis, J. P. A. (2013). A list of highly
influential biomedical researchers, 1996-2011. European Journal of Clinical
Investigation, 43(12), 1339-1365. doi: 10.1111/eci.12171.
Derrick, G. E., & Pavone, V. (2013). Democratising research evaluation: Achieving greater
public engagement with bibliometrics-informed peer review. Science and Public
Policy, 40(5), 563-575. doi: 10.1093/scipol/sct007.
Diko, M. L. (2015). South African scholars make Thomson Reuters ‘Highly Cited
Researchers 2014’. South African Journal of Science, 111(9/10). doi:
10.17159/sajs.2015/a0121C.
Gazni, A., & Thelwall, M. (2016). The citation impact of collaboration between top
institutions: A temporal analysis. Research Evaluation, 25(2), 219-229. doi:
10.1093/reseval/rvv039.
Ioannidis, J. P. A., Klavans, R., & Boyack, K. W. (2016). Multiple Citation Indicators and
Their Composite across Scientific Disciplines. PLoS Biol, 14(7), e1002501. doi:
10.1371/journal.pbio.1002501.
Li, J. T. (2016). What We Learn from the Shifts in Highly Cited Data from 2001 to 2014?
Scientometrics, 108(1), 57-82.
Moed, H. F., & Halevi, G. (2015). Multidimensional assessment of scholarly research impact.
Journal of the Association for Information Science and Technology. doi:
10.1002/asi.23314.
Panchal, H. (2012). David A. Pendlebury. Current Science, 103(10), 1144-1145.
Parker, J., Allesina, S., & Lortie, C. (2013). Characterizing a scientific elite (B): publication
and citation patterns of the most highly cited scientists in environmental science and
ecology. Scientometrics, 94(2), 469-480. doi: 10.1007/s11192-012-0859-6.
Pepe, A., & Kurtz, M. J. (2012). A Measure of Total Research Impact Independent of Time
and Discipline. PLoS ONE, 7(11), e46428. doi: 10.1371/journal.pone.0046428.
Price, D. J. D. (1971). Some Remarks on Elitism in Information and Invisible College
Phenomenon in Science. Journal of the American Society for Information Science,
22(2), 74-75.
Rijcke, S. d., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2015).
Evaluation practices and effects of indicator use—a literature review. Research
Evaluation. doi: 10.1093/reseval/rvv038.
Roemer, R. C., & Borchardt, R. (2015). Meaningful metrics: A 21st century librarian's guide
to bibliometrics, altmetrics, and research impact. Chicago, IL, USA: Association of
College and Research Libraries, a division of the American Library Association.
Rousseau, S., & Rousseau, R. (2015). Metric-wiseness. Journal of the Association for
Information Science and Technology, 66(11), 2389-2389. doi: 10.1002/asi.23558.
Thomson Reuters. (2016). The world's most influential scientific minds. Bethesda, MD, USA:
Thomson Reuters.
Whitley, R. (1984). The intellectual and social organization of the sciences. Oxford, UK:
Clarendon Press.
Williams, R., & Bornmann, L. (in press). Sampling issues in bibliometric analysis. Journal of
Informetrics.
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., . . . Johnson, B. (2015).
The Metric Tide: Report of the Independent Review of the Role of Metrics in Research
Assessment and Management. Bristol, UK: Higher Education Funding Council for
England (HEFCE).
Ziman, J. (2000). Real science. What it is, and what it means. Cambridge, UK: Cambridge
University Press.