a tool for librarians to select metrics across the research lifecycle

Chris James, Product Manager Research Metrics, Elsevier. ER&L Conference, Austin, Texas, 5th April 2017

Upload: libraryconnect

Posted on 22-Jan-2018

TRANSCRIPT

Page 1: A tool for librarians to select metrics across the research lifecycle

Chris James, Product Manager Research Metrics, Elsevier

A tool for librarians to select metrics across the research lifecycle

ER&L Conference, Austin, Texas, 5th April 2017

Page 2: A tool for librarians to select metrics across the research lifecycle

Session outline

Following this session, you will:

• Be familiar with the range of research impact metrics available to you
• Have some guidelines to help you select appropriate sets of metrics for different use cases
• Understand a free, new set of journal metrics, called CiteScore metrics

Page 3: A tool for librarians to select metrics across the research lifecycle


Many places to share research

Page 4: A tool for librarians to select metrics across the research lifecycle

Librarians need to report and track research outputs

Whether it’s to:

• Improve your electronic resource collections
• Assist researchers with funding applications

To do this, a basket of metrics is needed…

Page 5: A tool for librarians to select metrics across the research lifecycle

The basket of metrics is diverse…

Metric theme          Metric sub-theme
A. Funding            Awards
B. Outputs            Productivity of research outputs
                      Visibility of communication channels
C. Research Impact    Research influence
                      Knowledge transfer
D. Engagement         Academic network
                      Non-academic network
                      Expertise transfer
E. Societal Impact    Societal Impact
F. Qualitative input

Page 6: A tool for librarians to select metrics across the research lifecycle

… and the diverse metrics are available for all entities

Entities:
• Outputs, e.g. article, research data, blog, monograph
• Custom set of outputs, e.g. funders’ output, articles I’ve reviewed
• Researcher or group
• Institution or group
• Subject Area
• Serial, e.g. journal, proceedings
• Portfolio, e.g. publisher’s title list
• Country or group

Metric themes:
A. Funding
B. Outputs
C. Research Impact
D. Engagement
E. Societal Impact
F. Qualitative input

Page 7: A tool for librarians to select metrics across the research lifecycle

Users in different countries select different metrics

Metric                                    World  Australia  Canada  China  Germany  Japan  UK  US
Field-Weighted Citation Impact              1        1         1      3       2       4    3   1
Outputs in Top Percentiles                  2        2         3      1       4       1    1   6
Publications in Top Journal Percentiles     3        4         2      2       6       2    2   5
Collaboration                               4        6         6      5       1       3    5   7
Citations per Publication                   5        3         7      6       3       5    4   3
Citation Count                              6        5         5      4       8       6    6   2
h-indices                                   7        7         4      8       7       7    7   4

Ranks (1 = most used) reflect usage of metrics available in SciVal’s Benchmarking module from 11 March 2014 to 28 June 2015. A partial list of the metrics available at that time is shown, focusing on the most frequently used. Scholarly Output is excluded since it is the default. Note that recently added metrics based on e.g. media mentions and awards data were not available at that time and so are not represented in this analysis.

Page 8: A tool for librarians to select metrics across the research lifecycle


Resources for Librarians

https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics

Page 9: A tool for librarians to select metrics across the research lifecycle


Select the metrics to match the situation

Both evaluation and profiling use cases

Page 10: A tool for librarians to select metrics across the research lifecycle

Introducing the latest free addition…

This comprehensive, current and open metric for journal citation impact (introduced in December 2016) is available in a free layer of Scopus.com. It includes a yearly release and monthly CiteScore Tracker updates. Find CiteScore metrics for journals, conference proceedings, book series and trade journals at https://www.scopus.com/sources

https://libraryconnect.elsevier.com/metrics

CITESCORE = citations in a year to documents published in previous 3 years ÷ # of documents in previous 3 years
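The definition above can be sketched as a one-line calculation; the counts below are hypothetical, purely for illustration:

```python
def citescore(citations_to_prev_3y, docs_prev_3y):
    """CiteScore: citations received in a year to documents published in
    the previous 3 years, divided by the number of documents published
    in those 3 years."""
    return citations_to_prev_3y / docs_prev_3y

# Hypothetical serial: 1,200 citations in 2016 to 400 documents from 2013-2015.
print(citescore(1200, 400))  # 3.0
```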

Page 11: A tool for librarians to select metrics across the research lifecycle

CiteScore is a simple metric for all Scopus journals

A free basket of comprehensive, transparent and current metrics that provide a simple way to measure the citation impact of serials, such as journals, conference proceedings and books, over a 3-year period.

Page 12: A tool for librarians to select metrics across the research lifecycle

CiteScore calculation

CiteScore 2015 = A ÷ B
• A = citations received in 2015 to documents published in 2012–2014 (3 years of documents)
• B = all documents indexed in Scopus and published in 2012–2014 (same document types as A)

Impact Factor 2015 = A ÷ B
• A = citations received in 2015 to documents published in 2013–2014 (2 years of documents; a 5-year variant also exists)
• B = only citable items (articles and reviews) published in 2013–2014, different from A (not editorials or letters-to-the-editor)

New from Scopus
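Using made-up counts (not real journal data), a minimal sketch of how the two denominators diverge:

```python
def citescore_2015(cites_to_2012_2014, all_docs_2012_2014):
    """CiteScore 2015: B counts ALL Scopus-indexed documents from 2012-2014,
    the same document types that A counts citations to."""
    return cites_to_2012_2014 / all_docs_2012_2014

def impact_factor_2015(cites_to_2013_2014, citable_items_2013_2014):
    """Impact Factor 2015: B counts only citable items (articles and
    reviews) from 2013-2014, so B is narrower than A."""
    return cites_to_2013_2014 / citable_items_2013_2014

# Hypothetical journal: note the different windows and denominators.
print(citescore_2015(900, 450))      # 2.0
print(impact_factor_2015(700, 250))  # 2.8
```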

Page 13: A tool for librarians to select metrics across the research lifecycle

Advantages of CiteScore metrics

Comprehensive:
• Based on Scopus, the world’s broadest abstract and citation database
• CiteScore metrics will be available for all serial titles, not just journals
• CiteScore metrics could be calculated for portfolios

Transparent:
• CiteScore metrics will be available for free
• CiteScore metrics are easy to calculate for yourself
• The underlying database is available for you to interrogate

Current:
• CiteScore Tracker is updated monthly
• New titles will have CiteScore metrics the year after they are indexed in Scopus

Page 14: A tool for librarians to select metrics across the research lifecycle

How does CiteScore compare to the Impact Factor?

Page 15: A tool for librarians to select metrics across the research lifecycle

[Scatter plot: 2015 Impact Factor against 2015 CiteScore, with trend line; R² = 0.7524]

2015 Impact Factor and 2015 CiteScore

CiteScore 2015 correlates 75% with Impact Factor

Page 16: A tool for librarians to select metrics across the research lifecycle

[Chart: number of journals (0–400) at each percentile from 1 to 99, high to low performance, comparing all CiteScore percentiles, JIF percentiles and unique CiteScore percentiles]

• 22,256 titles have CiteScore 2015
• 22,620 titles have CiteScore Tracker 2016

Journals with CiteScore cover all levels of performance

Page 17: A tool for librarians to select metrics across the research lifecycle

Comparison of CiteScore, CiteScore Tracker and Impact Factor

Desirable characteristics, grouped by theme:

Replicate strong characteristics:
• Metric measures citations per document
• Simple method
• Annual snapshot for reporting purposes

Improved methodology:
• Document type consistency (numerator and denominator)
• Fair compromise for all fields: a 3-year citation window
• Derivative metric addresses disciplinary differences
• Ongoing inclusion of error correction

Comprehensive:
• Available for all serials indexed (not only journals)
• New titles have the metric the next calendar year

Current:
• Tracking view for verification and decision making
• Metric is current: updated monthly

Transparent:
• It’s calculated from the same database I use
• Metric and derivative metrics are free
• I can use a free widget on my webpage
• Journal-level evaluation functionality is free
• Underlying database available to verify calculation

Page 18: A tool for librarians to select metrics across the research lifecycle


Which other metrics would you like more details on?

Page 19: A tool for librarians to select metrics across the research lifecycle

FIELD-WEIGHTED CITATION IMPACT (FWCI) = # of citations received by a document ÷ expected # of citations for similar documents

Similar documents are ones in the same discipline, of the same type (e.g., article, letter, review) and of the same age. An FWCI of 1 means that the output performs just as expected against the global average. More than 1 means that the output is more cited than expected according to the global average; for example, 1.48 means 48% more cited than expected.

https://libraryconnect.elsevier.com/metrics
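A minimal sketch of the ratio, with hypothetical citation counts:

```python
def fwci(citations, expected_citations):
    """Field-Weighted Citation Impact: actual citations received by a
    document divided by the expected citations for similar documents
    (same discipline, same type, same age)."""
    return citations / expected_citations

# A document cited 37 times where similar documents average 25 citations:
print(fwci(37, 25))  # 1.48, i.e. 48% more cited than expected
```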

Page 20: A tool for librarians to select metrics across the research lifecycle

h-INDEX = # of articles in the collection (h) that have received at least (h) citations over the whole period

For example, an h-index of 8 means that 8 of the collection’s articles have each received at least 8 citations. The h-index is not skewed by a single highly cited paper, nor by a large number of poorly cited documents. This flexible measure can be applied to any collection of citable documents. Related h-type indices emphasize other factors, such as newness or citing outputs’ own citation counts. http://www.harzing.com/pop_hindex.htm

https://libraryconnect.elsevier.com/metrics
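The definition can be sketched as a short function; the citation counts in the example are invented:

```python
def h_index(citation_counts):
    """h-index: the largest h such that h documents in the collection
    have each received at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this document still clears the bar
        else:
            break
    return h

# 8 articles with at least 8 citations each -> h-index of 8
print(h_index([25, 18, 15, 12, 11, 10, 9, 8, 5, 3, 1]))  # 8
```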

Page 21: A tool for librarians to select metrics across the research lifecycle

SCIMAGO JOURNAL RANK (SJR) = average # of weighted citations received in a year ÷ # of documents published in previous 3 years

Citations are weighted – worth more or less – depending on the source they come from. The subject field, quality and reputation of the journal have a direct effect on the value of a citation. SJR can be applied to journals, book series and conference proceedings.

Calculated by SCImago Lab based on Scopus data. http://www.scimagojr.com

https://libraryconnect.elsevier.com/metrics

Page 22: A tool for librarians to select metrics across the research lifecycle

SOURCE NORMALIZED IMPACT PER PAPER (SNIP) = journal’s citation count per paper ÷ citation potential in its subject field

The impact of a single citation will have a higher value in subject areas where citations are less likely, and vice versa. Stability intervals indicate the reliability of the score. Smaller journals tend to have wider stability intervals than larger journals.

Calculated by CWTS based on Scopus data. http://www.journalindicators.com

https://libraryconnect.elsevier.com/metrics

Page 23: A tool for librarians to select metrics across the research lifecycle

JOURNAL IMPACT FACTOR = citations in a year to documents published in previous 2 years ÷ # of citable items in previous 2 years

Based on Web of Science data, this metric is updated once a year and traditionally released in June following the year of coverage as part of the Journal Citation Reports®. JCR also includes a Five-year Impact Factor.

https://libraryconnect.elsevier.com/metrics

Page 24: A tool for librarians to select metrics across the research lifecycle

SCHOLARLY ACTIVITY ONLINE = # of users who added an article into their personal scholarly collaboration network library

The website How Can I Share It? links to publisher sharing policies, voluntary principles for article sharing on scholarly collaboration networks, and places to share that endorse these principles, including Mendeley, figshare, SSRN and others. http://www.howcanishareit.com

https://libraryconnect.elsevier.com/metrics

Page 25: A tool for librarians to select metrics across the research lifecycle

SCHOLARLY COMMENTARY ONLINE = # of mentions in scientific blogs and/or academic websites

Investigating beyond the count to actual mentions by scholars could uncover possible future research collaborators or opportunities to add to the promotion and tenure portfolio. These mentions can be found in the Scopus Article Metrics module and within free and subscription altmetric tools and services.

https://libraryconnect.elsevier.com/metrics

Page 26: A tool for librarians to select metrics across the research lifecycle

SOCIAL ACTIVITY ONLINE = # of mentions on micro-blogging sites

Micro-blogging sites may include Twitter, Facebook, Google+ and others. Reporting on this attention is becoming more common in academic CVs as a way to supplement traditional citation-based metrics, which may take years to accumulate. They may also be open to gaming. http://www.altmetric.com/blog/gaming-altmetrics

https://libraryconnect.elsevier.com/metrics

Page 27: A tool for librarians to select metrics across the research lifecycle

MEDIA MENTIONS = # of mentions in mass or popular media

Media mentions are valued indicators of social impact as they often highlight the potential impact of the research on society. Sources could include an institution’s press clipping service or an altmetric provider. Mendeley, Scopus (Article Metrics module), Pure and SciVal also report on mass media.

https://libraryconnect.elsevier.com/metrics

Page 28: A tool for librarians to select metrics across the research lifecycle

PERCENTILE BENCHMARK (ARTICLES): compares items of the same age, subject area & document type over an 18-month window

The higher the percentile benchmark, the better. This is available in Scopus for citations, and also for Mendeley readership and tweets. It is particularly useful for authors as a way to contextualize citation counts for journal articles as an indicator of academic impact.

https://libraryconnect.elsevier.com/metrics

Page 29: A tool for librarians to select metrics across the research lifecycle

OUTPUTS IN TOP PERCENTILES: the extent to which a research entity’s documents are present in the most cited percentiles of a data universe

Found within SciVal, Outputs in Top Percentiles can be field-weighted. It indicates how many articles are in the top 1%, 5%, 10% or 25% of the most cited documents. It is a quick way to benchmark groups of researchers.

https://libraryconnect.elsevier.com/metrics
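A simplified sketch of the idea, with invented counts; the real SciVal metric also applies field weighting and matches on document age and type:

```python
def outputs_in_top_percentile(entity_citations, universe_citations, pct=10):
    """Share of an entity's documents whose citation counts reach the
    top `pct` percent of a data universe (simplified illustration)."""
    universe = sorted(universe_citations, reverse=True)
    cutoff_rank = max(1, round(len(universe) * pct / 100))
    threshold = universe[cutoff_rank - 1]  # citations needed to make the top pct%
    in_top = sum(1 for c in entity_citations if c >= threshold)
    return in_top / len(entity_citations)

# Universe of 100 documents cited 0..99 times; the top-10% bar is 90 citations,
# so 2 of this entity's 4 documents qualify.
print(outputs_in_top_percentile([95, 91, 50, 10], list(range(100))))  # 0.5
```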

Page 30: A tool for librarians to select metrics across the research lifecycle

The 2 Golden Rules…

Page 31: A tool for librarians to select metrics across the research lifecycle

Two Golden Rules for using research metrics

1. Always use both qualitative and quantitative input into your decisions
2. Always use more than one research metric as the quantitative input

When used correctly, research metrics together with qualitative input give a balanced, multi-dimensional view for decision-making.

Page 32: A tool for librarians to select metrics across the research lifecycle

Example: the importance of using multiple metrics from the basket to compensate for weaknesses

Field-Weighted Citation Impact = 2.53
+ Compensates for differences in field, type and age
+ Meaningful benchmark is “built in” – 1 is average for a subject area
× People may not like small numbers
× Complicated; difficult to validate
× No idea of magnitude: how many citations does it represent?

used with

Citations per Publication = 27.8
+ Large number
+ Simple, easy to validate
+ Communicates magnitude of activity
× Affected by differences in field, type and age
× Meaningless without additional benchmarking

Page 33: A tool for librarians to select metrics across the research lifecycle

Filling the gap in the Scopus basket of journal metrics

SNIP and SJR
+ Compensates for differences in field, type and age
+ Meaningful benchmark is “built in” – 1 is average for a subject area
× People may not like small numbers
× Complicated; difficult to validate
× No idea of magnitude: how many citations does it represent?

used with

CiteScore and associated metrics
+ Large number
+ Simple, easy to validate
+ Communicates magnitude of activity
× Affected by differences in field, type and age
× Meaningless without additional benchmarking

Page 34: A tool for librarians to select metrics across the research lifecycle

Evaluation and profiling both draw from the same basket

Evaluation (top-down):
• A core set of metrics (KPIs), determined per evaluator; should ideally be openly communicated
• Supplementary sets per discipline and entity-type

Profiling (bottom-up):
• Draw from the core set, as relevant to the evaluator
• Supplement with tailored metrics, customized per individual entity

Page 35: A tool for librarians to select metrics across the research lifecycle

CiteScore metrics anchor the broader basket of metrics

[Diagram: the basket of metrics. Metric families shown include Funding awards; Outputs (Scholarly Output, Research data output, Conference output); Usage (Usage counts, Mendeley Counts); Citations (Citation counts; h-, g-, m- indices; SNIP, SJR, IF; CiteScore metrics; Patent metrics); Scholarly Activity; Academic Opinion (Scholarly Discussion, Peer review metrics, Prizes and awards); Social Activity (Social media mentions); Media Activity (Media mentions, Medical guidelines, Influence on policies); Audience; and Community Contributions (Geographical spread, Collaboration network, Sector distribution, Funding sources). Types of metric: Individual metrics and Snowball Metrics.]

Entities to which metrics apply: Outputs (e.g. article, research data, book, monograph); Custom set (e.g. articles I’ve reviewed); Serial (e.g. journal); Publisher or library portfolio; Researcher; Institution; Country; Subject Area; Editor; Board; Authors.

Page 36: A tool for librarians to select metrics across the research lifecycle


From where can you access these metrics?

There are a number of free and paid resources

Online demo time…

Altmetrics

CiteScore metrics, including SNIP and SJR

Page 37: A tool for librarians to select metrics across the research lifecycle


SciVal - Institutions monitor their researchers’ overall output

Page 38: A tool for librarians to select metrics across the research lifecycle


CiteScore metrics on Elsevier.com

Page 39: A tool for librarians to select metrics across the research lifecycle


SNIP and SJR

http://www.journalindicators.com/indicators http://www.scimagojr.com/journalrank.php

Page 40: A tool for librarians to select metrics across the research lifecycle


A glance to the future…

Page 41: A tool for librarians to select metrics across the research lifecycle


https://blog.scopus.com/get-involved

Page 42: A tool for librarians to select metrics across the research lifecycle

Summary

• Golden Rules: both expert opinion and research metrics are needed to fully describe research performance
• The basket of metrics is a “menu” that contains diverse metrics for all entities
• Metrics relevant to your question should be selected from the basket, for both the entities you are investigating and suitable peers
• The basket of metrics enables both evaluation and profiling use cases

Page 43: A tool for librarians to select metrics across the research lifecycle

Resources

• CiteScore metrics - https://journalmetrics.scopus.com/
• CiteScore on Scopus - https://www.scopus.com/sources
• Librarian Quick Reference Cards for Research Impact Metrics - https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics

• SciVal Metrics Guidebook - https://www.elsevier.com/research-intelligence/resource-library/scival-metrics-guidebook

• Plum Analytics - http://plumanalytics.com/

• SNIP - http://www.journalindicators.com/indicators

• SJR - http://www.scimagojr.com/journalrank.php

Page 44: A tool for librarians to select metrics across the research lifecycle


www.elsevier.com/research-intelligence

Questions?

Page 45: A tool for librarians to select metrics across the research lifecycle


Appendix

Page 46: A tool for librarians to select metrics across the research lifecycle

LEGEND

Document* · Author · Journal

Indicates that the Snowball Metrics group agreed to include this as a standardized metric, which is data-source and system agnostic. https://www.snowballmetrics.com

*“Document” in the definitions refers to primary document types such as journal articles, books and conference papers. See the Scopus Content Coverage Guide (page 9) for a full list of document types: https://goo.gl/bLYH0v

Page 47: A tool for librarians to select metrics across the research lifecycle

CITATION COUNT = # of citations accrued since publication

A simple measure of attention for a particular article, journal or researcher. As with all citation-based measures, it is important to be aware of citation practices. The paper “Effective Strategies for Increasing Citation Frequency” lists 33 different ways to increase citations. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2344585

https://libraryconnect.elsevier.com/metrics

Page 48: A tool for librarians to select metrics across the research lifecycle

DOCUMENT COUNT = # of items published by an individual or group of individuals

A researcher using document count should also provide a list of document titles with links. If authors use an ORCID iD – a persistent scholarly identifier – they can draw on numerous sources for document count including Scopus, ResearcherID, CrossRef and PubMed. Register for an ORCID iD at http://orcid.org

https://libraryconnect.elsevier.com/metrics

Page 49: A tool for librarians to select metrics across the research lifecycle

Partnering with the Library Community

LIBRARY CONNECT
https://libraryconnect.elsevier.com

Content by Elsevier Library Connect & Jenny Delasalle, freelance librarian & consultant, @JennyDellasalle

CC for Quick Reference Cards:

Elsevier, Scopus, SciVal, Mendeley, Pure and other Elsevier trademarks are the property of Elsevier B.V. and its affiliates. Other trademarks, including the SNIP and SJR icons, are the property of their respective owners.

Page 50: A tool for librarians to select metrics across the research lifecycle


Join the conversation:

@library_connect

libraryconnect

company/libraryconnect

Page 51: A tool for librarians to select metrics across the research lifecycle