UKSG Conference 2016 breakout session - Institutional insights: adopting new metrics, Terry...
TRANSCRIPT
Work smart. Discover more.
Supported by
Institutional insights: adopting new metrics
What are altmetrics?

Broader indicators of attention:
• Mentions in news reports
• Mentions in social media
• Mentions in blogs
• Social bookmarks
• Reference manager readers
• Policy document citations
• Wikipedia citations

Metrics of academic attention:
• Journal Impact Factor
• Citation counts

Alternative metrics ("altmetrics") + traditional metrics
Article-centric, as opposed to journal-centric
The conversation happens online...
Source: Altmetric internal data, Feb 2016
Early indicators of attention
Funders: show us impact & dissemination
In the REF, impact is defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia.
http://impact.ref.ac.uk/CaseStudies/FAQ.aspx
Whereby all aspects will receive particular attention, i.e. the extent to which project outputs should contribute to the expected impacts described for the topic, to enhancing innovation capacity and integration of new knowledge, to strengthening the competitiveness and growth of companies by developing and delivering innovations meeting market needs, and to other environmental or social impacts […]
http://ec.europa.eu/research/participants/data/.../h2020-evaluation-faq_en.pdf
Open Access content gets more attention
There does seem to be a marked difference in the attention received by OA vs non-OA articles, although this is most noticeable in the attention received from Twitter and Mendeley readers.
Coverage from news and blogs did not change so dramatically between the two groups, likely because the publisher (NPG) does not distinguish between the model of publication when choosing which articles to press release.
http://dx.doi.org/10.6084/m9.figshare.1543424
How Altmetric Tracks Mentions
1. We find a mention of a domain that we are interested in
2. We follow the link to the page
3. We look for the item's identifier in the HTML metadata tags
4. We record the connection between the item and the mention in our database
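The four steps above can be sketched in Python. This is an illustrative reconstruction, not Altmetric's actual code: the meta-tag names (`citation_doi`, `dc.identifier`, `prism.doi`) are assumptions based on common scholarly HTML conventions, and the in-memory dictionary stands in for a real database.

```python
# Minimal sketch of the mention-tracking pipeline described above.
# Tag names and the in-memory "database" are illustrative, not Altmetric's.
from html.parser import HTMLParser

# Common scholarly identifier meta tags (assumed, not an official list)
ID_TAGS = {"citation_doi", "dc.identifier", "prism.doi"}

class IdentifierFinder(HTMLParser):
    """Step 3: look for the item's identifier in the HTML metadata tags."""
    def __init__(self):
        super().__init__()
        self.identifier = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        if a.get("name", "").lower() in ID_TAGS:
            self.identifier = a.get("content")

def record_mention(mention_url, page_html, database):
    """Steps 3-4: extract the identifier and record the connection."""
    finder = IdentifierFinder()
    finder.feed(page_html)
    if finder.identifier:
        database.setdefault(finder.identifier, []).append(mention_url)
    return finder.identifier

# Example: a tweet (step 1) links to an article page (step 2) whose HTML we fetched
page = '<html><head><meta name="citation_doi" content="10.1234/example"></head></html>'
db = {}
record_mention("https://twitter.com/example/status/1", page, db)
print(db)  # {'10.1234/example': ['https://twitter.com/example/status/1']}
```

In practice step 2 would involve fetching the linked page over HTTP and step 4 would write to persistent storage; the sketch only shows the identifier-matching logic that connects a mention to an item.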
Tracked Content
• Publisher journal platforms – DOIs registered at CrossRef
• PubMed Central
• arXiv
• RePEc
• Repositories with handles or URNs
• Research data (inc. figshare, Dryad) – DOIs registered at DataCite
• Publication series tracked by URL (e.g. The Conversation, World Bank)
• Soon: books and book chapters (tracking ISBNs as well as DOIs)
Sources of Attention

News outlets
• Over 2,400 sites
• Manually curated list
• Global coverage
• Text mining

Social media and blogs
• Nearly 10,700 blogs
• Twitter, Facebook, Google+ (public posts only)

Reference managers
• Mendeley, CiteULike
• Reader counts
• Don't count towards the Altmetric score

Other sources
• Wikipedia
• YouTube
• Reddit

Peer review and recommendations
• Publons
• PubPeer
• F1000

Policy documents
• IPCC, IMF, FAO, gov.uk, WHO, Red Cross, MSF, many more
The Donut and the Score
Altmetric is all about providing indicators of attention and engagement, not counting beans. We broadly agree with:
The Metric Tide report
http://www.hefce.ac.uk/pubs/rereports/Year/2015/metrictide/Title,104463,en.html
The Leiden Manifesto for Research Metrics
https://vimeo.com/133683418
http://doi.org/10.1038/520429a
But it is not about the numbers
Click on the donut to view the full details for any article
Click through to every news story, every blog post, every tweet, every Twitter profile
Explorer for Institutions
• Site-wide access – IP authentication (Shibboleth to come)
• Reporting at the institution, department, or author level
• Explore all the articles we track too – a current awareness service for 'talked about' research
• Create custom groups of articles to report on
• Apply filters, save searches, set alerts
• Export via Excel or PDF
• Full API integration – api.altmetric.com
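The API integration mentioned above can be exercised directly. A minimal sketch, assuming the public v1 DOI-lookup endpoint at api.altmetric.com; the response field names shown in the usage comment (`score`, `cited_by_tweeters_count`) are illustrative of a typical response, and the example DOI is hypothetical:

```python
# A minimal sketch of calling the public Altmetric API (api.altmetric.com).
import json
from urllib.request import urlopen

API_ROOT = "https://api.altmetric.com/v1"

def doi_url(doi):
    """Build the lookup URL for a DOI."""
    return f"{API_ROOT}/doi/{doi}"

def fetch_attention(doi):
    """Fetch the attention summary for one article (returns a dict)."""
    with urlopen(doi_url(doi)) as resp:
        return json.load(resp)

# Usage (requires network access; DOI and fields are illustrative):
# data = fetch_attention("10.1038/nature.2014.14583")
# print(data["score"], data["cited_by_tweeters_count"])
print(doi_url("10.1038/nature.2014.14583"))
```

Institutional reporting would typically go through the authenticated Explorer rather than one-off DOI lookups, but the same article-level data underlies both.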
Librarians
• Enhance the repository or discovery service
• Demonstrate value of Open Access
• Provide a novel current awareness service
• Help researchers improve their reach

Research administrators
• Monitor and report on reach and attention for publications by department
• Find leads for impact case studies
• Show that you meet funder expectations re engagement

Communications / PR team
• Find institutional success stories
• Manage the institution's reputation
• Understand the reach of your research
• Find media-friendly stories/researchers

Researchers
• Find potential collaborators, e.g. for EU funding bids
• Find indicators of impact for grant applications or CV
• Establish academic social network
• Inform decisions on publishing choices
Benefits: timely, visible, auditable
Altmetric at the University of Glasgow
@williamjnixon, Head of Digital Library Services
#UKSG16
Why altmetrics (and Altmetric)?
• Adding value and engaging colleagues and researchers
• Complementing traditional metrics (like citation counts)
• Discovering new outlets for content
• Exploring new measures of impact
• Working with services like Altmetric.com
• Integrating with repositories (like EPrints)
• Tracking attention and identifying impact
• Aggregating Glasgow Altmetric data
• Providing new ways to discover research
A Glasgow Timeline
Jan 2013
Aug 2013
May 2015
Altmetric in the Research mix
Implementation: EPrints & API
Implementation: Explorer
Reaction & Usage
Next Steps
[email protected]
@williamjnixon
www.glasgow.ac.uk/enlighten
ORCID: 0000-0003-1780-1106
Thanks for your attention
Of parachutes and bicycle helmets!
Yvonne Nobis, Head of Science Information Services
Betty and Gordon Moore Library
University of Cambridge
#altmetrics or #bibliobollocks?
Article-level metrics are not uncontroversial, as the hashtags above demonstrate.
Article-Level Metrics: An Ill-Conceived and Meretricious Idea
"Many are excited about innovative measures that purport to quantify scholarly impact at a more granular level. Called article-level metrics or ALMs, these measures depart from time-honored computations of scholarly influence such as the journal impact factor. Instead, they rely on data generated from popular sources such as social media and other generally non-scientific and meager venues. As someone who studies predatory open-access scholarly publishers, I can promise you that any system designed to measure impact at the article level will be gamed, rendering the metrics useless and invalid."
Jeffrey Beall (Librarian at Auraria Library, University of Colorado Denver)
http://scholarlyoa.com/2013/08/01/article-level-metrics/
• #bibliobollocks courtesy of @david_colquhoun http://www.dcscience.net/
What’s sauce for the goose, sauce for the gander!
Peer reviewed journal articles are still the standard currency of science, irrespective of publishing medium
Exponential increase in scholarly publishing activity, with different forms coming on stream
However, not all research is included – 'grey literature' / policy papers – many still have impact (a success of the Cambridge collaboration with Altmetric is the inclusion of policy papers).
What’s sauce for the goose, sauce for the gander!
• Impact – altmetrics measure attention; important for REF / engagement
• Cambridge: 300 users – working with the research office to encourage use
• Feed from Symplectic / trial developments
• Interdisciplinary conversations
• Facebook – BICEP2 / LIGO results
• Undermine the currency of the JIF?
Using altmetrics
The scale of the problem
Problem for academics (and for those working in research support):
• To find out whether your research is being cited (referenced)
• To find out the impact of your research
• Which new papers are relevant to you?
• Which papers should you read if you are going to pursue research in a field slightly out of your comfort zone?
• To identify relevant research papers
Information Overload or Filter Failure?
Clay Shirky
You can complain about information overload but the only way to deal with it is to build and use better filters
(Web 2.0 Expo NY: Clay Shirky (shirky.com) It's Not Information Overload. It's Filter Failure.)
http://blip.tv/web2expo/web-2-0-expo-ny-clay-shirky-shirky-com-it-s-not-information-overload-it-s-filter-failure-1283699
This challenges the idea that we've got information overload problems …what we have is a series of filter failures, as our systems for managing information abundance are swamped by the growth of information (Cory Doctorow)
Altmetrics: taking citations out of the dataset
The challenge is taking metrics out of the datasets and reflecting user behaviour…
The ‘filtering’ problem becomes even greater!
The Altmetric manifesto maintains that scholarship's three main filters for importance are failing: peer review (slow, encourages conventionality, and fails to hold referees to account); citation counting (useful but not sufficient); and journal impact factors.
http://altmetrics.org/manifesto/
Uses of metrics/altmetrics
Citations are arguably a ‘quality control mechanism’?
Discovery tools: identifying research and individuals
Funding
REF
Reputation of department / institution
Career progression
Citation analysis: 'My h-index is bigger than yours'!
In an attempt to normalise citation counts, the Hirsch index (h-index) has become increasingly popular: it is the largest number h such that the author has h papers each cited at least h times.
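The definition above translates directly into a short function; a minimal sketch (the function name and example citation counts are illustrative):

```python
# The h-index: the largest h such that the author has h papers
# each cited at least h times.
def h_index(citations):
    # Sort citation counts from highest to lowest
    cites = sorted(citations, reverse=True)
    h = 0
    # The i-th best paper (1-indexed) must have at least i citations
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each cited at least 4 times
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note how one very highly cited paper barely moves the index: the second author scores lower than the first despite a 25-citation paper, which is exactly the normalisation the index is intended to provide.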
Citation Analysis problems
Different systems: not all citation counts are equal
The "same" metric usually varies from database to database (e.g. an h-index from Scopus vs one from Web of Science) because of differences in journal coverage, time periods, or citation verification.
Time delay between publication and first citations
Some citations don’t count ‘officially’ – for example pre-prints
Sentiment of citing author
Using altmetrics
Altmetrics are designed to highlight the attention a scholarly output has received, not to be a measure of quality (which many consider the impact factor to be). Caveat: Wakefield and MMR!
‘Impact’ is a term that can be understood in many different ways: the impact of research on other research, impact of research on wider society.
The term ‘impact’ for REF purposes is used to mean
“any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”
The Cambridge experience
What next?
Roll out across the university (Research Office)
Advocacy
Research skills training
Bibliometrics/Altmetrics service
Online profile and maximising impact
23 things programme