Gaps and Assumptions in Our Research Assessment Approach: KAUST Experience

Thibaut LERY, Director, Research Evaluation
[email protected] | May 18, 2015
King Abdullah University of Science and Technology | kaust.edu.sa


Vision of a 5-Year-Old University

KAUST aspires to be a destination for scientific and technological education and research. By inspiring discoveries to address global challenges, we strive to serve as a beacon of knowledge that bridges people and cultures for the betterment of humanity.

Mission of the Office of Research Evaluation

1. Conduct evaluation and foresight to foster data- and context-driven decision-making within the Office of the Vice President for Research.
2. Collect, curate, analyze, and present research-related data.
3. Promote a culture of assessment, strategic planning, and research integrity at KAUST.


KAUST by the NUMBERS in 2015

840 Students
401 Postdocs
315 Research scientists
137 Faculty
6,065 Community members
2,124 Employees
1,345 School children
109 Community nationalities
84 Workforce nationalities

KAUST IMPACT (since 2009)

Scholarly publications
320 Invention disclosures
219 Patent applications
16 Patents
33 Active start-ups
37 Industry collaborations

Source of information for the Office

[Diagram: sources feeding the Office's data warehouse (Converis + Gifts + BI) and the Library repository]
• Researchers + students
• Infrastructures (Core Labs)
• Funds
• Research data + strategy
• Publications + reports
• Patents
• Technology transfer
• Evaluation
• Internal funding agency + external grant management office
• Panels of experts and reports to senior management
• SAP

Research Evaluation Process

The Office agrees with the various stakeholders on the format, criteria, and indicators of the evaluation, which occurs every second year. In preparation for the site visit, we provide:
• Fact sheets with information about the research groups (by the Office)
• Self-evaluations (by the research groups)
• Bibliometrics and output analysis (by the Office)
• Templates for the hearings conducted between an external committee and the research groups

Following the recommendations of the final report and the discussions with the research groups, the University draws up new strategic plans.

Recipes for Research Evaluation
(Source: The Research Council of Norway)

• Methodologies and data collection should be open, transparent, and explained upfront (training)
• Indicators and criteria should be agreed with all the stakeholders (no hidden agenda)
• Weak signals and informal discussions are key
• Quantitative evaluation must support qualitative expert assessment
• Evaluation should lead to strategy building
• Strategies should follow stable and efficient policies and practices


Close collaboration with the Library

The Office of Research Evaluation works closely with the Library to train and educate researchers and students about bibliometrics and their usage. Key topics:
• Value of using citation databases in the literature search
• Understanding citation metrics and tools (h-index, FWCI, Scopus, WoS, SciVal, InCites, Altmetrics, etc.); see the sketch after this list
• Role of publications in affecting institutional rankings
• Understanding researcher profiling (ORCID, Google Scholar, etc.)
• Benefits of open access and best practices of the institutional repository
• Research integrity, plagiarism, and the use of similarity-checking tools
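As a concrete companion to the metrics training above, here is a minimal Python sketch of two of the listed indicators: the h-index and a naive FWCI. All citation counts and the field baseline below are hypothetical illustrations, not KAUST data.

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def fwci(actual_citations, expected_citations):
    """Field-Weighted Citation Impact: actual citations divided by the
    world-average citations expected for the same field, year, and
    document type. A value of 1.0 means exactly world average."""
    return actual_citations / expected_citations

# Hypothetical citation counts for one researcher's papers.
papers = [25, 8, 5, 3, 3, 1, 0]
print(h_index(papers))  # -> 3 (a 4th paper with >= 4 citations would be needed for h = 4)
print(fwci(12, 6.0))    # -> 2.0 (cited twice as often as the field average)
```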


SciVal, May 2014

[Chart: share of KAUST publications in top-1% and top-10% journals, 2009–2014. The top-10% share rises from 29% to 42% over the period; the top-1% share fluctuates roughly between 1.6% and 4.8% around its average.]

• On average, one third of KAUST publications are in the top-10% journals.
• 21 of the 1,277 publications in 2013 were in the top-1% journals.
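For readers who want to reproduce this kind of chart, the percentile shares reduce to a simple count once each publication carries its journal's citation-percentile rank. A minimal sketch with hypothetical records; a real analysis would use the SciVal export itself.

```python
# Hypothetical records: (publication year, journal citation percentile),
# where a lower percentile means a better-ranked journal.
pubs = [(2013, 0.8), (2013, 6.5), (2013, 54.0), (2014, 9.9), (2014, 31.0)]

def share_in_top(pubs, year, threshold):
    """Percentage of a year's publications whose journal falls in the
    top `threshold` percent of journals."""
    percentiles = [p for y, p in pubs if y == year]
    if not percentiles:
        return 0.0
    hits = sum(1 for p in percentiles if p <= threshold)
    return 100.0 * hits / len(percentiles)

print(share_in_top(pubs, 2013, 10))  # -> ~66.7 (2 of 3 pubs in top-10% journals)
print(share_in_top(pubs, 2013, 1))   # -> ~33.3 (1 of 3 pubs in top-1% journals)
```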

Benchmarking against other Universities

Based on Scopus data between 2011 and 2013 (excluding self-citations).

[Scatter plot: citations per publication (0–10) versus publications in top-10% journal percentiles (0–40%), positioning KAUST against Caltech, Carnegie Mellon, EPFL, ETH Zurich, Georgia Tech, Harvard, HKUST, Istanbul Univ., King Abdulaziz Univ., KFUPM, KAIST, Lehigh Univ., MIT, NU Singapore, Princeton, Shanghai J. Univ., Texas A&M, Berkeley, Cambridge, Copenhagen, and Univ. of Tokyo.]
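The y-axis metric above, citations per publication excluding self-citations, can be sketched as follows. This assumes a toy data model in which a citation counts as a self-citation when the citing and cited papers share an author ID; a real benchmark would compute this from Scopus data.

```python
def citations_per_pub(pubs):
    """Mean citations per publication, excluding self-citations."""
    total = 0
    for pub in pubs:
        cited_authors = set(pub["authors"])
        for citing in pub["citing_papers"]:
            # Count the citation only when the author sets do not overlap.
            if cited_authors.isdisjoint(citing["authors"]):
                total += 1
    return total / len(pubs) if pubs else 0.0

# Hypothetical records: two publications, three incoming citations,
# one of which is a self-citation (author "a1" cites their own paper).
pubs = [
    {"authors": {"a1", "a2"},
     "citing_papers": [{"authors": {"a3"}}, {"authors": {"a1", "a4"}}]},
    {"authors": {"a5"},
     "citing_papers": [{"authors": {"a6"}}]},
]
print(citations_per_pub(pubs))  # -> 1.0 (the self-citation is excluded)
```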

Gaps and Assumptions in Evaluations

GAPS

Selection
• Difficult career tracking (see the ORCID sketch after this slide)
• Affiliation issues
• Unique ID for universities and research facilities

Data
• Non-standard formats
• Missing data
• Tools and expertise
• Connected tools and databases

Evaluation
• Ex-ante
• Ex-post
• Stakeholder contribution
• Follow-up after evaluation

Foresight
• Hidden agendas
• No clear guidance
• Time constraints
• Funding issues

Experts
• Availability
• Personalities
• No usage of social networks

ASSUMPTIONS

Selection
• Full publication list
• Identified science area
• Researchers/facilities/management

Data
• Standard data collection, curation, and analysis
• Research data management
• Access to data

Evaluation
• Self-evaluation
• Expert panels
• Dialogue with all the stakeholders
• Budget consequences

Foresight
• Grand challenges
• Strategies in place
• Communication plans
• Stakeholder engagement

Experts
• Access to adequate experts
• Bibliometrics are good indicators
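The career-tracking and affiliation gaps under Selection are exactly what persistent researcher identifiers target. As a hedged illustration, the sketch below pulls an employment history from the ORCID public API; it assumes today's v3.0 endpoints and response layout (not the 2015-era API) and uses ORCID's documented example iD rather than a real researcher.

```python
import requests

def print_employments(orcid_id):
    """List employment records from the ORCID public API (v3.0 assumed)."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/employments"
    resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
    resp.raise_for_status()
    for group in resp.json().get("affiliation-group", []):
        for summary in group.get("summaries", []):
            emp = summary["employment-summary"]
            # Organization entries may also carry a disambiguated ID,
            # which speaks to the "unique ID" gap noted above.
            print(emp["organization"]["name"], "-", emp.get("role-title"))

print_employments("0000-0002-1825-0097")  # ORCID's documented example iD
```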

Thank you

[email protected]  

Contributions from Janardhanan Vijayakumar, Mohamed Baessa, and Daryl Grenz.