Evolving and emerging scholarly communication services in libraries:
public access compliance and research impact
Claire Stewart, Associate University Librarian for Research & Learning
University of Minnesota
Guest lecture, Dominican University LIS 772, February 20, 2016
What do libraries do?
Libraries preserve knowledge, provide access to knowledge, and support creation of new knowledge
Libraries develop solutions to information problems
SCHOLARLY COMMUNICATION & PUBLIC ACCESS COMPLIANCE
Research & Learning scholarly communication support:
• Liaisons
• Research Services Coordinators
• Functional specialists
Scholarly communication program, institutional repository
Scholarly communication program, open access policy
2013 OSTP memo
University of Minnesota research funding (FY15)
Source: OVPR Annual Report 2015
Existing challenges in compliance support
Source: Pamela Webb, UMN Associate Vice President for Research Administration, September 2015
Roles for libraries in supporting grant-funded research
Services:
• Education
• Pre-award services
• Post-award support
• Planning
• Policy/advocacy
• Infrastructure
Education
Pre-award services
Post-award support
Public Access Compliance Monitor
Expanded Access
Health Sciences Libraries
My NCBI Awards View
NIH Manuscript Submission System
Submission Methods (NIH Public Access Policy)
• Method A: Journal, by contract with NIH, deposits the published version of all NIH-funded articles in PMC. Author confirms the article is deposited in PMC.
• Method B: Author arranges with publisher to deposit the published version of a specific NIH-funded article in PMC. Author confirms the article is deposited in PMC.
• Method C: Author or delegate submits the final peer-reviewed manuscript to the NIHMS. NIHMS sends the author an email asking them to approve the submitted materials for processing; the author reviews and approves the PMC-formatted manuscript, and a PMCID is assigned.
• Method D: Journal publisher submits the final peer-reviewed manuscript to the NIHMS. NIHMS sends the author an email asking them to approve the submitted materials for processing; the author reviews and approves the PMC-formatted manuscript, and a PMCID is assigned.
Credit (and previous slide): Katherine Chew, UMN Health Sciences Libraries
Extending support
• Baseline from Libraries: extend and continue advisory/education role for articles and data, curation and repository services
• Charged a new team to work with Sponsored Projects Administration (SPA):
  – Monitor agency plans as they are finalized
  – Test systems and processes
  – Develop a new joint education program and staffing recommendations
Complex compliance picture
• NIH's gap between voluntary deposit (2005), mandatory deposit (2008), and funding impacts (2013): will other agencies follow this pattern?
• Will Sponsored Projects Administration (SPA) have to work with more than 75 different systems?
• Administrative burden is a significant concern: how can libraries, research administration, and scholars work together to address it?
• How will journals/publishers respond?
Policy/advocacy
BEYOND THE FACTOR: TALKING ABOUT RESEARCH IMPACT
Interest in impact and metrics
• In hiring, in support of promotion & tenure
• Funders and publishers, evaluating proposals
• Institutional productivity
• Impact on our communities
What do we want to know when we talk about impact?
• How has this [researcher’s] work advanced knowledge?
• Has this research been evaluated, and by whom?
• What is field-shaping research?
• Who are the researchers shaping my field?
• What is going on in my field that’s important, or in a field that could benefit my work?
• What is the broader societal benefit of this work? (value of higher education, research investments)
THE CITATION as unit of measurement
Abbreviated metrics overview
Alphabet Miso. https://www.flickr.com/photos/bean/322616749/
JIF: Journal Impact Factor
Source: The Thomson Reuters Impact Factor
Significant variance across disciplines:
• Top-ranked journal overall: JIF = 144.800
• Top-ranked journal in history: JIF = 2.615
Not based on any single author/article
Oft criticized (DORA, Leiden, HEFCE statements)
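To make the slide's definition concrete, here is the standard two-year Impact Factor calculation; the numbers in the worked example are hypothetical.

```latex
% Two-year Journal Impact Factor for census year Y:
%   C_Y(X) = citations received in year Y by items the journal published in year X
%   N(X)   = citable items (articles, reviews) the journal published in year X
\mathrm{JIF}(Y) = \frac{C_Y(Y-1) + C_Y(Y-2)}{N(Y-1) + N(Y-2)}
% Hypothetical example: 500 citations in 2016 to the 200 citable items
% published in 2014 and 2015 gives JIF = 500 / 200 = 2.5
```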
eigenfactor
Based on same citation source as the Impact Factor (Thomson’s Journal Citation Reports)
Weights journals by importance based on citation frequency, similar to Google page rank
Also calculates an Article Influence score over the first 5 years of an article
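The PageRank comparison can be made concrete in a few lines. This is a minimal sketch of the general power-iteration idea under an invented citation matrix, not the actual Eigenfactor algorithm (which adds refinements such as excluding journal self-citations).

```python
# Minimal sketch of PageRank-style journal weighting; the 3x3 citation
# matrix is invented for illustration, and this is NOT the Eigenfactor
# algorithm itself, only the underlying idea.
import numpy as np

# citations[i, j] = citations from journal j to journal i
citations = np.array([[0, 3, 5],
                      [2, 0, 1],
                      [4, 6, 0]], dtype=float)

# Column-normalize: each journal distributes one unit of influence
transition = citations / citations.sum(axis=0)

damping = 0.85                 # standard PageRank damping factor
n = transition.shape[0]
weights = np.full(n, 1.0 / n)  # start from uniform influence

for _ in range(100):           # power iteration toward the fixed point
    weights = damping * transition @ weights + (1 - damping) / n

print(weights / weights.sum()) # relative influence of each journal
```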
h index
Scholar-specific: “A scientist has index h if h of his or her N_p papers have at least h citations each and the other (N_p − h) papers have ≤ h citations each.”
Dependent on citation index source (Google scholar and Scopus might have different values)
Doesn’t really account for different citation/usage patterns between fields
Source: Hirsch, J. E. “An Index to Quantify an Individual’s Scientific Research Output.” Proceedings of the National Academy of Sciences of the United States of America 102, no. 46 (November 15, 2005): 16569–72. doi:10.1073/pnas.0507655102.
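Hirsch's definition translates directly into code; a minimal sketch (the example citation counts are invented):

```python
# h-index per Hirsch (2005): the largest h such that h papers
# have at least h citations each.
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Invented example: five papers -> h = 3 (three papers with >= 3 citations).
# Feeding in counts from Google Scholar vs. Scopus can yield different h,
# which is the index-source dependence noted on the slide.
print(h_index([10, 8, 5, 3, 0]))  # 3
```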
Altmetrics
• Shares and mentions in non-traditional places, on social media, etc. (Twitter, Facebook, Mendeley, blogs, Wikipedia, etc.)
• Often dependent on identifiers (DOIs, PubMed IDs, arXiv IDs, etc.), which can have lower penetration in arts & humanities fields
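To illustrate how identifier-dependent these services are, a sketch of a mention-count lookup by DOI. The endpoint is Altmetric's public v1 API, and the URL pattern and JSON field names are assumptions to verify against current documentation; the DOI is the Holmberg et al. article cited later in this deck.

```python
# Hedged sketch: look up altmetric counts for a DOI.
# The URL pattern and JSON keys below are assumptions based on
# Altmetric's public v1 API and may have changed; verify before use.
import requests

def altmetric_counts(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    resp.raise_for_status()  # a 404 typically means no tracked mentions
    data = resp.json()
    return {key: data.get(key, 0)
            for key in ("cited_by_tweeters_count",
                        "cited_by_feeds_count",
                        "score")}

# DOI of the astrophysicists-on-Twitter study cited later in this deck
print(altmetric_counts("10.1371/journal.pone.0106086"))
```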
Article level metrics
Recent expressions of concern about a strictly quantitative approach to research assessment
Advice to HEFCE (UK)
Framework for responsible metrics:
• Robustness: use the best possible data
• Humility: quantitative indicators should support expert assessment (e.g., peer review)
• Transparency: be able to show where data came from and let results be verified
• Diversity: account for variation by field
• Reflexivity: update indicators as the system & its effects change
20 specific recommendations to HEFCE around use of metrics
What are the other questions we will want to ask?
And what kinds of information will we need to answer these questions?
How was it evaluated? (r)evolutions in peer review
Open (PeerJ) and post-publication (F1000)
Is it reproducible?
Source: Weinberg, Bruce A., Jason Owen-Smith, Rebecca F. Rosen, Lou Schwarz, Barbara McFadden Allen, Roy E. Weiss, and Julia Lane. “Science Funding and Short-Term Economic Activity.” Science 344, no. 6179 (April 4, 2014): 41–43. doi:10.1126/science.1250055.
Where were CIC federal research dollars actually spent?
IRIS sample products, November 2015
Where did our grad students, postdocs and other research staff find employment?
Across all grants & agencies, what are the kinds of research activities underway at UMN?
What are the other questions we will want to ask?
What do we know about how conversations about research happen? Who do astrophysicists talk to?
Holmberg, Kim, Timothy D. Bowman, Stefanie Haustein, and Isabella Peters. “Astrophysicists’ Conversational Connections on Twitter.” PLoS ONE 9, no. 8 (August 25, 2014): e106086. doi:10.1371/journal.pone.0106086.
Table 1. Roles of the users mentioned in the tweets.
Figure 2. Number of people contacted and the number of conversations had by the 32 astrophysicists.
Figure 5. Conversational connections in the astrophysicists’ tweets.
What are the other questions we will want to ask?
Who at UMN is doing research in or about countries other than the United States? Who are they collaborating with? What kind of effect has this work had?
‘Effect’ could include: articles, books and reports published, presentations offered, information about who benefited from these outputs, integration into policy development (conversations about and/or new legislation, regulation, etc.)
Why is this hard?
• Wide variety in what constitutes a valuable research output/indicator across disciplines
• Types of outputs are expanding
• Data about research outputs is messy, partly because it has the typical big data problems: volume, velocity, variety
• Highly distributed scholarly communication infrastructure (the data about outputs is everywhere)
Data on research outputs is messy: inconsistent use of identifiers, etc.
Variant author names
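To make the variant-name problem concrete, a small sketch using variants of the presenter's name purely for illustration; the normalization here is deliberately crude, and persistent identifiers such as ORCID are the structural fix.

```python
# Illustration of the variant-author-name problem. Naive string
# equality treats these four strings as four different people.
import difflib

variants = ["Stewart, Claire", "C. Stewart", "Claire Stewart", "Stewart C"]

def normalize(name):
    # Crude normalization: strip punctuation, lowercase, sort tokens
    tokens = name.replace(",", " ").replace(".", " ").lower().split()
    return " ".join(sorted(tokens))

canonical = [normalize(v) for v in variants]   # still not all identical
for name in canonical:
    # Fuzzy matching catches "c stewart" vs. "claire stewart"
    print(name, "->", difflib.get_close_matches(name, canonical, cutoff=0.6))
```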
Highly distributed scholarly communications ecosystem
A purely hypothetical picture of where different research outputs might be stored & disseminated
Examples of new types of outputs
nanopub.org publons.com
Nanopublication-like things in the humanities: should we give credit for assertions about authorship and provenance?
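For readers new to the model: a nanopublication packages a single machine-readable assertion together with its provenance and publication info. A schematic sketch in plain Python (all values are invented, humanities-flavored to match the slide's question):

```python
# Schematic of the three-part nanopublication model (nanopub.org).
# Real nanopublications are RDF named graphs; plain dicts are used
# here only to show the shape. All values are invented.
nanopub = {
    "assertion": {            # the single claim being made
        "subject":   "ex:Letter_1893_042",
        "predicate": "dcterms:creator",
        "object":    "ex:Person_MarieDupont",
    },
    "provenance": {           # how and by whom the claim was made
        "asserted_by":  "ex:Scholar_JaneDoe",
        "derived_from": "ex:ArchivalCatalog_v2",
        "method":       "paleographic analysis",
    },
    "publication_info": {     # metadata about the nanopub itself
        "created": "2016-02-20",
        "license": "https://creativecommons.org/licenses/by/4.0/",
    },
}

print(nanopub["assertion"])
```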
publons: credit for peer review
Outputs/indicators vs metrics
“The observations here relate to the fact that while there is unease about the use of metrics as a mode of ‘measuring’ the excellence of research produced in the UK’s HEIs, the rich array of data presented as part of REF2014 demonstrates that the arts and humanities sector are comfortable with deploying numbers (albeit framed as data rather than metrics) to present a case about the excellence of their research cultures.”
Thelwall, Mike, and Maria M. Delgado. “Arts and Humanities Research Evaluation: No Metrics Please, Just Data.” Journal of Documentation 71, no. 4 (June 25, 2015): 817–33. doi:10.1108/JD-02-2015-0028.
Why the Libraries?
• This is an information problem
• We have the data: metadata and full text (or at least we know where it is and how to get it)
• We understand the science of the data (metadata in particular)
• We’re really into standards
• We are discipline neutral
• We have strong technology expertise