Responsible Metrics for Open Science
Paul Wouters
BE-OPEN Workshop, Leiden, 17–19 May 2017
CWTS
Centre for Science and Technology Studies (CWTS)
• Research center at Leiden University focusing on quantitative studies of science (bibliometrics and scientometrics)
• Bibliometric contract research
– Monitoring & evaluation
– Advanced analytics
– Training & education
Research leaders face key questions
• How should we monitor our research?
• How can we profile ourselves to attract the right students and staff?
• How should we divide funds?
• What is our scientific and societal impact?
• What is actually our area of expertise?
• How is our research connected across disciplines?
Research leaders need strategic intelligence
• Increasing demand for information about research:
– hyper competition for funding
– globalization
– industry–academic partnerships
– interdisciplinary research challenges
– institutional demands on research & university management
• Increased supply of data about research:
– web-based research
– a deluge of data-producing machines and sensors
– increased social scale of research: international teams
– large scale databases of publications, data, and applications
– citation metrics and altmetrics
Valuing science and scholarship
• new needs for information about research:
– formulate research programs and priorities
– evaluate research and its scientific and societal impact
– develop strategic visions by universities and research institutes
– inform national and international science and innovation policies
• synergy between quantitative and qualitative
research in science, technology and innovation
studies and in library and information science
• develop richer variety of evaluation practices:
– novel forms of evaluation
– new performance criteria
– new contextualized indicators
Research groups
• Quantitative Science Studies (QSS), led by Ludo Waltman
• Science and Evaluation Studies (SES), led by Sarah de Rijcke
• Science, Technology and Innovation Studies (STIS), led by Robert Tijssen

Research themes
• Open Science (led by Thed van Leeuwen and Paul Wouters)
• Responsible Metrics (led by Paul Wouters)
• Research Integrity (led by Barend van der Meulen)
Research organization
• Research groups push the boundaries of knowledge:
– Research lines are priority topics within these groups
• Chairs and working groups focus on PhD training and education:
– Scientometrics
– Science, technology & innovation studies
– Evidence for Science Policy
– Working group Career Studies
• Themes translate expertise into practical know-how and professional & policy advice
Theme Open Science
• Infrastructures
• Policies
• Data
• Indicators
• Assessment
Responsible Metrics
A focus on excellence
• Science and research policy is no longer neutral with respect to research quality
• From RAE to REF: “excellence” as the new new thing
• Global university rankings have created a global market for quality markers
• Within universities, faculties compete for resources and recognition
• Universities have effectively delegated their responsibilities to funding agencies and journals
• Tendency to narrow the variety of academic tasks and qualities to those that dominate the evaluation systems
Across the research community, the description, production and consumption of ‘metrics’ remains contested and open to misunderstandings.
The Leiden Manifesto
• Quantitative evaluation should support expert assessment.
• Measure performance in accordance with the research mission.
• Protect excellence in locally relevant research.
• Keep data collection and analytical processes open, transparent and simple.
• Allow for data verification.
• Account for variation by field in publication and citation practices.
• Interpret data with the difficulty of credit assignment for multi-authored publications in mind.
• Base assessment of individual researchers on qualitative judgment.
• Avoid false precision (e.g. the JIF).
• Take the systemic effects of assessment and indicators into account, and update indicators regularly.
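The warning against false precision can be illustrated with a toy calculation (the citation counts below are invented, not data from the talk): a journal-level average such as the JIF summarizes a highly skewed distribution, so the mean can sit far from what a typical paper receives.

```python
# Toy illustration (invented numbers): citation distributions are skewed,
# so a JIF-style mean is a poor summary of the typical paper.
from statistics import mean, median

citations = [0, 0, 1, 1, 2, 2, 3, 4, 5, 82]  # one highly cited outlier

print(f"mean   = {mean(citations):.1f}")    # mean   = 10.0 (the JIF-style average)
print(f"median = {median(citations):.1f}")  # median = 2.0 (the typical paper)
```

One outlier lifts the mean to 10.0 while half of the papers are cited twice or less, which is the sense in which a three-decimal impact factor conveys false precision about individual papers.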
Diana Hicks (Georgia Tech), Paul Wouters (CWTS), Ismael Rafols (SPRU/Ingenio), Sarah de Rijcke and Ludo Waltman (CWTS)
http://www.hefce.ac.uk/rsrch/metrics/
Responsible metrics
Responsible metrics can be understood in terms of:
• Robustness: basing metrics on the best possible data in terms of accuracy and scope;
• Humility: recognizing that quantitative evaluation should support, but not supplant, qualitative, expert assessment;
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
• Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research & researcher career paths;
• Reflexivity: recognizing the potential & systemic effects of indicators and updating them in response.
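A minimal sketch of what accounting for field variation means in practice, in the spirit of field-normalized indicators such as a mean normalized citation score. The function name, citation counts and field baselines below are invented for illustration, not a CWTS API or real data.

```python
# Hedged sketch of field normalization; all figures are invented.
def mean_normalized_citation_score(papers, field_baselines):
    """Average each paper's citations divided by its field's mean
    citation rate; a score of 1.0 means 'at field average'."""
    ratios = [p["citations"] / field_baselines[p["field"]] for p in papers]
    return sum(ratios) / len(ratios)

# Invented baselines: average citations per paper in each field.
baselines = {"mathematics": 2.0, "cell biology": 20.0}

papers = [
    {"field": "mathematics", "citations": 4},    # 2.0x its field average
    {"field": "cell biology", "citations": 10},  # 0.5x its field average
]

print(mean_normalized_citation_score(papers, baselines))  # 1.25
```

Raw counts (4 vs 10) would rank the biology paper higher; normalizing by field baselines reverses that, which is exactly why unnormalized citation counts are not comparable across fields.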
Measuring is changing
• What counts as excellence is shaped by how we measure and define “excellence”
• What counts as impact is shaped by how we measure and define “impact”
• Qualities and interactions are the foundation of “excellence” and “impact”, so we should understand those more fundamental processes first
• We need different indicators at different levels of the scientific system to inform wise management that strikes the right balance between trust and control
• Context is crucial for effective data standardization
Responsible Metrics for Open Science
• EU Expert Group on Altmetrics, chaired by James Wilsdon
• Aim: to develop a framework for responsible metrics and altmetrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020
• Call for evidence: https://ec.europa.eu/research/openscience/index.cfm?pg=altmetrics_eg
• A new type of metrics for a new type of science
Context counts
• Responsible metrics is not supposed to be a universal standard
• Responsible metrics should be responsive and inclusive metrics
• The context shapes what responsible metrics means:
– the urgency of social problems (poverty, inequality, unemployment and corruption)
– local research and educational missions
– the local appropriation of “the global”
– the values embedded in the policies and communities
Open Science
Selected recommendations (with short- and long-term goals)

Fostering open science:
• Ground an open science system in a mix of expert judgement, quantitative and qualitative measures
• Provide guidelines for responsible metrics in support of open science
• Make better use of existing metrics for open science
• Assess the suitability of indicators and encourage the development of new indicators

Removing barriers to open science:
• Open, transparent and linked data infrastructure for metrics in open science
• Use open metrics and reward adoption of open science principles and practices
• Measure what matters
• Highlight how inappropriate use of indicators can impede open science
Ambitions for Open Science
• More comprehensive measurement of traditional scientific publications (e.g. Mendeley)
• Recognizing and capturing the diversity of scientific output, including new forms (e.g. software and blogs)
• Opening up the whole scientific publication system (open access) and more interactive communication
• Opening up the very core of knowledge creation and its role in higher education and innovation (participatory science)
Thank you for your attention