TRANSCRIPT
New Directions in the Use of Network Analysis in
R&D Evaluation
Evaluation 2006
Annual Meeting of the American Evaluation Association
November 1-4, Portland, OR
Jonathon E. Mote, University of Maryland
Gretchen Jordan, Sandia National Laboratories
Jerald Hage, University of Maryland
Work presented here was completed for the U.S. DOE Office of Science by Sandia National Laboratories, Albuquerque, New Mexico, USA under Contract DE-AC04-94AL8500 and under contract with the National Oceanic and Atmospheric Administration (NOAA). Sandia is operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation. Opinions expressed are solely those of the authors.
New Directions in Network Analysis
Research conducted by Jerald Hage and Jonathon Mote at the Center for Innovation, University of Maryland, in collaboration with Gretchen Jordan at Sandia National Laboratories.
Part of a long-standing U. S. Department of Energy (DOE) Office of Basic Energy Sciences interest in understanding and developing tools to assess key factors in the research environment that foster excellence in order to improve performance.
Began exploring the use of network analysis three years ago.
Interested in moving network analysis into new areas: knowledge networks and interactions with the research environment.
Networks: How R&D Really Gets Done
Create or Access Resources
- People
- Knowledge
- Equipment
- Funds

Accomplish/Disseminate Work/R&D
- Focus, plan, communicate
- Integrate ideas, functions
- Make R&D progress
- Disseminate/absorb R&D outputs

Produce Desired Outcomes
- Knowledge advance and product/process innovation
- Problems solved
- With what speed
- Affecting whom
Networks: Still Many Questions
Despite the importance of social networks in R&D, much is still unclear.
Need to distinguish between networks and network outcomes.
- Networks: How do they work? Still a black box.
- Network outcomes: How can we evaluate them?

How do networks work?
- Are they emergent and self-organizing, or can they be structured and directed?
- Is increased networking always good?
- What kind of network is appropriate?

What do we expect as outcomes of networks?
- Maximize the use of resources?
- Increase the development of knowledge/innovation?
- Increase the dissemination of outputs?
- Maximize use/build critical mass?
Networks and R&D Evaluation
What can social network analysis (SNA) provide to answer these questions?
- Offers a way to analyze and measure the network structure of R&D – how R&D really gets done.
- Identify effective network structures.
- Measure network outcomes.

But obstacles remain for the use of SNA in R&D evaluation (Rogers et al., 2000):
- SNA needs to focus on the content of ties rather than just structure.
- SNA needs to develop a concept of "network effectiveness" in terms of its impact on the uses of knowledge.
- SNA needs to more closely examine "untidy" networks.
- SNA needs to reformulate the typical evaluation questions.
Challenges of SNA in R&D Evaluation
How can these challenges be addressed?
Focus on the content of ties rather than just structure
- SNA is structural analysis – hard to get away from.
- Need to identify the appropriate network (and ties) for knowledge production.
- In a matrix organization, the project affiliation network (project ecology) provides a good proxy for the knowledge network.
- But other networks (multiplexity) are also important – collaboration (bibliometric), for example.

Developing a concept of "network effectiveness"
- Effectiveness in terms of the network or network outcomes?
- What are the best network measures? Centrality? But which centrality measure?
- What is the best network structure? Clumpy, dense, sparse (Borgatti, 2005)?
- It depends on the specific research setting and goals – one size does not fit all.

Need to study "untidy" networks
- All networks are untidy.
- But boundaries are a necessary evil for delimiting the study.
Challenges of SNA in R&D Evaluation
SNA needs to reformulate the typical evaluation questions – the toughest obstacle
- Move beyond the "QBEQ": does this project yield value?
- Scientists do not necessarily think of their work in terms of "projects" and "outcomes."
- But GPRA and PART drive the value orientation.
- Part of the solution lies in improved performance metrics – knowledge growth, not just outcomes.

As Rogers et al. (2001) suggest:
- Look to social studies of science to understand knowledge growth.
- Apply a general notion of a network approach.
- Borrow from the network analysis tool box.
Moving Forward with SNA in R&D Evaluation
We would also suggest the following:

Need to move away from the "traditional" use of SNA
- Most SNA focuses on properties of individual performance.
- Most SNA is better oriented toward managers: it focuses on identifying particular individuals in networks and taking corrective action.
- Not necessarily appropriate for evaluation of projects/programs.

What is the appropriate network to study for R&D?
- R&D consists of knowledge networks.
- How is knowledge produced and communicated, particularly tacit knowledge?
- Project networks highlight the network of knowledge and skills within the organization.

How does the research environment interact with networks?
- Does the environment inhibit or facilitate networks?
- What network characteristics are most effective for a given profile of R&D?
From the SNA Toolbox: A Brief Primer on Centrality
Centrality reflects the number and distance of ties a network node has with the other nodes of the network.
- Highlights the characteristics of the flow of knowledge.
- Highlights a node's relationship to the flow of knowledge.

Four primary types of centrality:
- Degree – the number of links to other nodes. Highlights well-connected nodes (A – red).
- Closeness – the shortest "distance" to all other nodes. Highlights nodes with good visibility of the overall network (D, E, and H – blue).
- Betweenness – the extent to which a node lies between groups of nodes. Highlights nodes that act as intermediaries in the overall network (H – blue shaded).
- Eigenvector – the diversity of a node's network. Highlights nodes with diverse links (A, D, and E).
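The four measures above can be sketched with the networkx library. The small graph below is an invented toy example, not the network pictured on the slide, and the node labels are reused only loosely.

```python
# Toy illustration of the four centrality types using networkx.
# The graph is invented for demonstration; it is not the slide's network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("A", "D"),  # A is the best-connected node
    ("D", "E"), ("E", "F"),
    ("F", "H"), ("H", "G"),              # H links G to the rest
])

degree      = nx.degree_centrality(G)       # share of direct links
closeness   = nx.closeness_centrality(G)    # inverse average distance to others
betweenness = nx.betweenness_centrality(G)  # how often a node sits on shortest paths
eigenvector = nx.eigenvector_centrality(G)  # links weighted by neighbors' centrality

for node in sorted(G):
    print(f"{node}: degree={degree[node]:.2f} closeness={closeness[node]:.2f} "
          f"betweenness={betweenness[node]:.2f} eigenvector={eigenvector[node]:.2f}")
```

Each measure answers a different question, which is why the slides stress choosing the measure to match the evaluation goal rather than defaulting to degree.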
• Large Multi-Disciplinary National Laboratory
• 2-mode Network - Projects and Research Departments
• Conceptualizes network as a knowledge network, not individuals
• 20 Research Projects – 216 Researchers
– Blue nodes=Research Departments
– Red nodes=Projects
• Some projects/departments are more central to the network
• Some projects/departments appear to play the role of intermediaries
• But what is important in terms of outcomes?
Looking at Knowledge Networks in R&D
• Derived centrality measures for each project
  – Centrality measures show different properties of the flow of knowledge
• Regression of centrality measures against measures of productivity (for the project)
  – Papers and patents – imperfect, but the best we had
• Eigenvector centrality showed the greatest positive impact
  – The number of links is not necessarily the important factor
  – The diversity of links (knowledges) is more important
• Betweenness centrality was negative
  – Suggests the role of knowledge intermediary is not important in this knowledge ecology
Looking at Knowledge Networks in R&D
Regression of Scientific Productivity (Papers and Patents) on Centrality Measures

                            Model 1  Model 2  Model 3  Model 4  Model 5  Model 6
Personnel                     .015    -.427    -.606    -.090     .002    -.116
Number of Research Centers    .638**  1.049*   .641**   .952**   .588*    .353
Standardized Complexity        -      -.353      -        -        -        -
Degree Centrality              -        -       .623      -        -        -
Betweenness                    -        -        -      -.371      -        -
Closeness                      -        -        -        -       .096      -
Eigenvector Centrality         -        -        -        -        -       .498*
R2                            .422     .453     .427     .499     .427     .516

N=20; * p<.1, ** p<.05
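The analysis behind the table can be sketched as an ordinary least squares regression. The data below is randomly generated for illustration only; the variable names mirror the table's rows, but the coefficients in the table come from the authors' actual 20-project dataset.

```python
# Illustrative OLS regression of project productivity on centrality,
# using invented data in place of the laboratory's real measures.
import numpy as np

rng = np.random.default_rng(0)
n = 20  # the slide's analysis covered 20 projects

personnel = rng.integers(2, 30, n).astype(float)   # invented staff counts
centers = rng.integers(1, 6, n).astype(float)      # invented center counts
eigen = rng.random(n)                              # invented eigenvector centrality
# invented productivity proxy (papers/patents), built so eigen matters
papers = 0.5 * centers + 2.0 * eigen + rng.normal(0, 0.5, n)

# Fit y = b0 + b1*personnel + b2*centers + b3*eigen by least squares
X = np.column_stack([np.ones(n), personnel, centers, eigen])
coef, *_ = np.linalg.lstsq(X, papers, rcond=None)

resid = papers - X @ coef
r2 = 1 - resid @ resid / np.sum((papers - papers.mean()) ** 2)
print(f"eigenvector coefficient = {coef[3]:.3f}, R2 = {r2:.3f}")
```

With only twenty observations, as in the slide's table, coefficient estimates are noisy, which is one reason the authors flag papers and patents as imperfect measures.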
Networks and the Research Environment
What is the relationship between networks and the research environment?
- Can help to understand the functioning of the network. Networks are "emergent" and self-organizing, but can be influenced.
- While numerous studies highlight networks in science, the organizational environment or context is often not considered.
- But the work/research environment has been identified as a key factor for creativity and innovation (Cummings, 1965; Pelz and Andrews, 1976; Balachandra and Friar, 1997).
- Need to better understand the interaction between social networks and the organizational/research environment, and how these might facilitate or inhibit the performance of the network.
- How do networks affect scientists' perceptions of the research environment?
STAR - small research organization within the National Oceanic and Atmospheric Administration (NOAA)
Approximately 70 scientists focused on atmospheric science.
Organized into three divisions that encompass satellite meteorology, oceanography, climatology, and cooperative research with academic institutions
Complex physical structure, consisting of one primary office, a nearby secondary office and several smaller offices scattered around the country
Chartered to develop operational algorithms and applications using satellite data
In addition to actively developing new data products, the scientists provide support for nearly 400 current satellite-derived products
Finally, much of the work of these scientists is conducted in close partnerships with other agencies, academic institutes, and industry.
NOAA’s STAR
The survey administered covers key attributes of organizational structure and management practices. Sponsored by the Department of Energy.
Focuses on thirty-six attributes, in four discrete categories, identified as most important for doing excellent research that has an impact.
Survey items were identified and defined through an extensive literature review and input from fifteen focus groups that included bench scientists, engineers, and technologists, as well as their managers, across various R&D tasks (Jordan et al., 2003a).
The survey has been developed over six years of research and field-tested in several large research laboratories in the United States.
At STAR, there were 81 potential respondents and 64 completed surveys, yielding a response rate of 79 percent.
Network data: project affiliations (over 50 projects) and a name generator.
The Research Environment Survey
The Research Environment Survey

Four categories:
- Development of Human Resources
- Creativity & Cross-fertilization
- Internal Support Systems
- Set and Achieve Relevant Goals

Thirty-six attributes:
- People Treated with Respect
- Time to Think and Explore
- Good Research Competencies
- Sufficient, Stable Project Funding
- Optimal Mix of Staff
- Resources/Freedom to Pursue New Ideas
- Good Equipment/Physical Environment
- Good Planning and Execution of Projects
- Management Integrity
- Autonomy to Make Decisions
- Good Salaries and Benefits
- Good Project-level Measures of Success
- Teamwork & Collaboration
- Cross-Fertilization of Ideas
- Good Allocation of Internal Funds
- Good Relationship with Sponsors
- Good Internal Project Communication
- Frequent External Collaborations
- Informed and Decisive Management
- Reputation for Excellence
- Management Adds Value to Work
- Relevant Research Portfolio
- Rewards and Recognizes Merit
- Management Champions Foundational Research
- High Quality Technical Staff
- Commitment to Critical Thinking
- Efficient Laboratory Systems
- Good Lab-wide Measures of Success
- Good Professional Development
- Identification of New Opportunities
- Laboratory Services Meet Needs
- Clear Research Vision and Strategy
- Good Career Advancement Opportunities
- Sense of Challenge and Enthusiasm
- Overhead Rates Not Burdensome
- Invests in Future Capabilities
STAR Network Data
The name generator yielded 39 respondents and the project affiliation question yielded 63 respondents.
Due to the lack of detail in the name generator responses, the data was simply quantified in terms of the number of internal and external contacts.
The project affiliation data was gathered in 2-mode format which was then transformed into 1-mode.
This facilitated the derivation of network measures, principally those of centrality.
STAR Project Network (n=63)
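The 2-mode to 1-mode transformation described above can be sketched with networkx's bipartite projection. The researcher/project affiliations below are invented placeholders, not the STAR data.

```python
# Sketch: project a 2-mode (researcher x project) affiliation network
# onto a 1-mode project network, then derive centrality from it.
# All names below are invented for illustration.
import networkx as nx
from networkx.algorithms import bipartite

affiliations = {
    "researcher1": ["projectA", "projectB"],
    "researcher2": ["projectB"],
    "researcher3": ["projectB", "projectC"],
}

B = nx.Graph()
for person, proj_list in affiliations.items():
    B.add_node(person, bipartite=0)
    for p in proj_list:
        B.add_node(p, bipartite=1)
        B.add_edge(person, p)

# Two projects are tied in the 1-mode network when they share a researcher;
# edge weights count the shared researchers.
project_nodes = {n for n, d in B.nodes(data=True) if d["bipartite"] == 1}
P = bipartite.weighted_projected_graph(B, project_nodes)

closeness = nx.closeness_centrality(P)  # centrality on the 1-mode network
print(sorted(P.edges(data="weight")))
```

This mirrors the design choice on the slide: analyzing the projected project network treats the unit of knowledge flow as the project rather than the individual.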
STAR Project Networks
Of all the centrality measures, closeness had the greatest impact.
Divided actors at the mean closeness score.
Those with high closeness gave higher (more "true") ratings on a number of environmental attributes.
Higher ratings of the research environment – but are they more "productive"?
Survey item                                                            Low        High
                                                                       Closeness  Closeness  Significance
People show a commitment to critical thinking                            3.35       4.17       0.01
There is teamwork and collaboration                                      3.42       4.04       0.02
External collaborations and interactions occur frequently for this
  project                                                                3.18       3.78       0.06
My management rewards and recognizes merit                               3.39       3.96       0.08
People are treated with respect as individuals                           3.88       4.43       0.06
My management adds value to my work                                      2.82       3.74       0.01
People are given the authority to make decisions about how to do
  their jobs                                                             3.76       4.43       0.00
There is good planning and execution of research projects                3.27       3.83       0.03
My management has a clear research vision and strategies                 3.06       3.70       0.03
My management maintains an integrated and relevant research portfolio    3.28       3.83       0.07
Overall, I would rate my research/work environment as...                 4.79       5.43       0.06
The Laboratory is a great place to work                                  3.76       4.13       0.08
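The mean split behind the table can be sketched as follows. The closeness scores and ratings below are invented, and the table's significance values came from group comparison tests that this sketch does not reproduce.

```python
# Sketch: split actors at the mean closeness score and compare average
# survey ratings between the low and high groups. Data is invented.
from statistics import mean

# hypothetical (actor closeness, rating on one survey item) pairs
scores = [
    (0.20, 3.1), (0.25, 3.4), (0.30, 3.6), (0.55, 4.0),
    (0.60, 4.2), (0.70, 4.3), (0.35, 3.2), (0.65, 4.5),
]

cutoff = mean(c for c, _ in scores)           # mean closeness across actors
low  = [r for c, r in scores if c < cutoff]   # ratings of low-closeness actors
high = [r for c, r in scores if c >= cutoff]  # ratings of high-closeness actors

print(f"low-closeness mean = {mean(low):.2f}, "
      f"high-closeness mean = {mean(high):.2f}")
```

In practice each group difference would then be tested for significance (e.g. a two-sample t-test), which is what the table's final column reports.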
STAR Project Networks
Developed closeness composition for each project
Categorized projects by product orientation – current product or new products
Projects with a majority of members with high closeness are clustered around new product development
Network theory suggests that closeness is a key factor for new product development
Projects by Closeness Composition and Product Orientation

                     Current   New Product
                     Product   Development   Total
Low Closeness           5           5          10
Mixed Closeness         9           7          16
High Closeness          9          21          30
Total                  23          33          56
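The project-level classification above can be sketched as below. The projects and the 2/3 and 1/3 cutoffs are assumptions made for illustration; the slides do not state the exact thresholds used to label a project's closeness composition.

```python
# Sketch: label each project by the share of its members with high
# closeness, then cross-tabulate against product orientation.
# Project data and cutoffs are invented for illustration.
from collections import Counter

# hypothetical: project -> (fraction of members with high closeness, orientation)
projects = {
    "p1": (0.9, "new"),     "p2": (0.8, "new"),
    "p3": (0.5, "current"), "p4": (0.2, "current"),
    "p5": (0.1, "new"),
}

def composition(frac):
    # assumed thresholds; the slides do not give the actual ones
    if frac >= 2 / 3:
        return "high"
    if frac <= 1 / 3:
        return "low"
    return "mixed"

table = Counter((composition(f), o) for f, o in projects.values())
for (comp, orient), count in sorted(table.items()):
    print(f"{comp:5s} closeness / {orient:7s} product: {count}")
```

A cross-tabulation like this is what underlies the slide's table, where high-closeness projects cluster in new product development.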
New Directions in the Use of Network Analysis in R&D Evaluation
Successful efforts at moving network analysis in new directions, with promise for applications in evaluation:
- The project network is a good proxy for the knowledge network.
- SNA can show the characteristics of the flow of knowledge and actors' relationships to that flow.
- Combined with the research environment survey, it shows that network position influences perceptions of the research organization.

Much is still needed:
- Better performance measures.
- Identification of "effective" network structures and properties.
- Identification of network "effectiveness."

Next steps:
- Continue exploration of the research environment and networks with performance measures.
- Explore the nature of the work of those with different ego networks and high closeness.