British Journal of Management, Vol. 14, 207-222 (2003)

Towards a Methodology for Developing Evidence-Informed Management Knowledge by Means of Systematic Review*

David Tranfield, David Denyer and Palminder Smart
Advanced Management Research Centre (AMRC), Cranfield School of Management, Cranfield University, Cranfield, MK43 0AL, UK
Corresponding author email: [email protected]

Undertaking a review of the literature is an important part of any research project. The researcher both maps and assesses the relevant intellectual territory in order to specify a research question which will further develop the knowledge base. However, traditional 'narrative' reviews frequently lack thoroughness, and in many cases are not undertaken as genuine pieces of investigatory science. Consequently they can lack a means for making sense of what the collection of studies is saying. These reviews can be biased by the researcher and often lack rigour. Furthermore, the use of reviews of the available evidence to provide insights and guidance for intervention into the operational needs of practitioners and policymakers has largely been of secondary importance. For practitioners, making sense of a mass of often-contradictory evidence has become progressively harder. The quality of evidence underpinning decision-making and action has been questioned, for inadequate or incomplete evidence seriously impedes policy formulation and implementation. In exploring ways in which evidence-informed management reviews might be achieved, the authors evaluate the process of systematic review used in the medical sciences. Over the last fifteen years, medical science has attempted to improve the review process by synthesizing research in a systematic, transparent and reproducible manner, with the twin aims of enhancing the knowledge base and informing policymaking and practice. This paper evaluates the extent to which the process of systematic review can be applied to the management field in order to produce a reliable knowledge stock and enhanced practice by developing context-sensitive research. The paper highlights the challenges in developing an appropriate methodology.

Introduction: the need for an evidence-informed approach

Undertaking a review of the literature to provide the best evidence for informing policy and

*This paper results from research undertaken in Cranfield IMRC (EPSRC) grant no. IMRC19, 'Developing a methodology for evidence-informed management knowledge using systematic review', Professor David Tranfield and Dr David Denyer.

© 2003 British Academy of Management

practice in any discipline is a key research objective for the respective academic and practitioner communities.

The post-World-War-II era witnessed a sharp focus of attention by academics and practitioners on the discipline and profession of management (Blake and Mouton, 1976; Tisdall, 1982). The pace of knowledge production in this field has been accelerating ever since and has resulted in a body of knowledge that is increasingly fragmented and transdisciplinary, as well as being interdependent with advancements in the social sciences (Friedman, Durkin, Phillips and Voltsinger, 2000).

In management research, the literature review process is a key tool, used to manage the diversity of knowledge for a specific academic inquiry. The aim of conducting a literature review is often to enable the researcher both to map and to assess the existing intellectual territory, and to specify a research question to develop the existing body of knowledge further. Management reviews are usually narrative and have been widely criticized for being singular descriptive accounts of the contributions made by writers in the field, often selected for inclusion on the basis of the implicit biases of the researcher (Fink, 1998; Hart, 1998). Not surprisingly, they have also been condemned for lacking critical assessment. The management-research community perpetuates this type of practice by not actively commissioning infrastructural arrangements to ensure previous investments in literature reviews are not lost. This tolerance of the loss of knowledge forms a high-risk strategy that will inevitably become unsustainable as organizations endeavour further into the networked and knowledge-based economy.

Reviews of the available evidence in management, assimilating 'best evidence' to provide insights and guidance for intervention into the operational needs of practitioners and policymakers, have largely become a secondary consideration.

Sufficient momentum from academics, practitioners and government has created an urgent need to re-evaluate the process by which management researchers conduct literature reviews. Over the last fifteen years, medical science has attempted to improve the quality of the review process. This paper proposes the view that applying specific principles of the systematic review methodology used in the medical sciences to management research will help in counteracting bias by making explicit the values and assumptions underpinning a review. By enhancing the legitimacy and authority of the resultant evidence, systematic reviews could provide practitioners and policy-makers with a reliable basis on which to formulate decisions and take action. This is particularly sobering if one considers the growing pressure upon practitioners in today's global trading environments to do this in ever-shorter cycle times.

This paper will begin by discussing the evidence-based approach in medical science through the effective use of systematic reviews. The following sections will compare and contrast the nature of reviews in medical science and management research and evaluate the extent to which the systematic review process can be applied to the management field. Finally, the paper will present the challenges in designing an appropriate methodology for management research.

The origins of the evidence-based approach

Since the 1980s, the British central government has placed increasing emphasis on ensuring that policy and practice are informed by a more rigorous and challenging evidence base. The 'three E' initiatives (economy, efficiency and effectiveness) have focused attention on the delivery of public services and have led to the development of detailed guidance and best-practice manuals in many disciplines. Effectiveness in this context is concerned both with the appropriateness and validity of the methods used by professionals in their day-to-day work to achieve their basic aims, and with the overall ability of agencies to deliver the services they are required to provide (Davies, Nutley and Smith, 2000). The concern for effective service delivery has attracted considerable attention, and has focused interest on basing policy and practice on the best evidence available. Consequently, an evidence-based movement developed under New Labour, and in May 1997 Tony Blair announced that 'what counts is what works', the intention being to signal a new 'post-ideological' approach to public policy in which evidence would take centre stage in the decision-making process (Davies, Nutley and Smith, 2000).

The evidence-based approach in medical science and healthcare

The evidence-based movement has had a major impact in certain disciplines. Pre-eminent have been applications in medical science, where the pace of knowledge production has meant that making sense of an often-contradictory mass of evidence has become increasingly difficult (Ohlsson, 1994). Specifically, in the late 1980s attention was drawn to the comparative lack of rigour in secondary research (Mulrow, 1987). Critics argued that the preparation of reviews of secondary sources was dependent on implicit, idiosyncratic methods of data collection and interpretation (Cook, Mulrow and Haynes, 1997; Greenhalgh, 1997). In addition, practice based on poor-quality evaluations of the literature had sometimes led to inappropriate recommendations (Cook, Greengold, Ellrodt and Weingarten, 1997). In 1991, Smith questioned the overall wisdom of much of medical science, arguing that only 15-20% of medical interventions were supported by solid medical evidence (Smith, 1991). The result, it was argued, was that patients were being regularly subjected to ineffective treatments and interventions, and for many practices there was little or no understanding of whether or not the benefits outweighed the potential harm (Davies, Nutley and Smith, 1999).

The National Health Service (NHS) Research and Development Strategy identified that too little research was being carried out in the important clinical areas and that much of the existing research was ad hoc, piecemeal and poorly conducted (Peckham, 1991). The report also argued that researchers, rather than practitioners, managers or policymakers, drove the research agenda.

Furthermore, there was little dissemination, let alone diffusion, of research findings. The Strategy argued not only for an increase in the level of research conducted but also for systematic reviews of existing research on important clinical or operational questions, assessing the best evidence available, collating the findings and presenting them in a way that was accessible and relevant to decision-makers (Peckham, 1991).

Systematic review - a key tool in developing the evidence base

Over the last decade, medical science has made significant strides in attempting to improve the quality of the review process by synthesizing research in a systematic, transparent and reproducible manner to inform policy and decision-making about the organization and delivery of health and social care (Cook, Greengold, Ellrodt and Weingarten, 1997; Cook, Mulrow and Haynes, 1997; Wolf, Shea and Albanese, 2001).

Systematic reviews differ from traditional narrative reviews by adopting a replicable, scientific and transparent process, in other words a detailed technology, that aims to minimize bias through exhaustive literature searches of published and unpublished studies and by providing an audit trail of the reviewer's decisions, procedures and conclusions (Cook, Mulrow and Haynes, 1997). The process of systematic review and its associated procedure, meta-analysis, have been developed over the last decade and now play a major role in evidence-based practice.

Whereas systematic review identifies the key scientific contributions to a field or question, meta-analysis offers a statistical procedure for synthesizing findings in order to obtain overall reliability unavailable from any single study alone. Indeed, undertaking systematic review is now regarded as a 'fundamental scientific activity' (Mulrow, 1994, p. 597). The 1990s saw several organizations formed with the aim of establishing agreed and formalized procedures for systematic review, and of undertaking systematic reviews to synthesize and disseminate evidence across all areas of healthcare. These organizations included the Cochrane Collaboration (2001), the NHS Centre for Reviews and Dissemination (2001) and the National Institute for Clinical Excellence (2001).
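To make the pooling step concrete, the sketch below shows a minimal fixed-effect (inverse-variance) meta-analysis: each study's effect estimate is weighted by the inverse of its sampling variance, so more precise studies contribute more to the pooled result. This is an illustrative sketch only; the function name, the three studies and their effect sizes and variances are invented for the example, not drawn from the paper.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its variance.

    effects: per-study effect sizes (e.g. standardized mean differences)
    variances: per-study sampling variances
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Hypothetical effect sizes from three studies of the same intervention
effects = [0.30, 0.45, 0.10]
variances = [0.02, 0.04, 0.01]

pooled, var = fixed_effect_meta(effects, variances)
se = math.sqrt(var)
# 95% confidence interval for the pooled effect
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Note that this simple pooling presumes the studies estimate a common underlying effect; as the paper argues later, the heterogeneity of management studies often violates exactly this assumption.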

Evidence-based approaches in other disciplines

The movement to base practice on the best available evidence has migrated from medicine to other disciplines. In the UK, the Department for Education and Skills (DfES) has established a Centre for Evidence Informed Policy and Practice in Education. Furthermore, a 'What Works? Programme' was introduced in the probation service following the Crime Reduction Strategy published by the Home Office in July 1998. The aim of the programme was to develop successful intervention programmes based on hard evidence so that they could be used as models for day-to-day probation practice (HM Inspectorate of Probation, 1998; Home Office, 1998). An Effective Practice Initiative has also sought to address the difficult problem of ensuring that offender supervision changes in line with research evidence on what works (Furniss and Nutley, 2000). The Department for the Environment, Transport and the Regions (DETR) commissioned a review of the evidence base as it relates to regeneration policy and practice (Dabinett, Lawless, Rhodes and Tyler, 2001). Other disciplines such as nursing (Evans and Pearson, 2001), housing policy (Davies and Nutley, 1999; Maclennan and More, 1999), social care (Macdonald, 1999) and criminal justice (Laycock, 2000) have also adopted the approach with varying degrees of success. In 2001, the Economic and Social Research Council (ESRC) funded the establishment of a network (the Evidence Network) of multi-disciplinary centres dedicated to the improvement of the evidence base for policy and practice in the social sciences. The Evidence Network aims to use systematic review to inform and improve decision-making in government, business and the voluntary sector.

Internationally, in February 2000 the Campbell Collaboration was launched in Philadelphia by about 150 pioneering social scientists. This equivalent of the Cochrane Collaboration aims:

'to help people make well-informed decisions about the effects of interventions in the social, behavioural and educational arenas' (Campbell Collaboration, 2001).

Within the approach taken by the Campbell Collaboration, delegates considered questions such as how practitioners might engage with the review process, what makes research useful and useable, and what standards and quality criteria distinguish reliable from unreliable research. In this sense, discussions addressed the need for research to be both well founded and socially robust. This emphasis on producing a science base which is both rigorous in formulation and relevant to practice is a key characteristic of an evidence-based approach.

The quality of information accepted as evidence in a discipline is dependent on a number of criteria. These include the broad intellectual approach, the value system adopted by researchers and commissioning bodies, and the usual research methods employed (Davies and Nutley, 1999). Medical science has traditionally adopted a 'normal science' approach within which double-blinded randomized controlled trials have been widely accepted as the most rigorous method for testing interventions before use. So far, systematic reviews have tended to be applied in, and to emanate from, fields and disciplines privileging a positivist tradition, attempting to do for research synthesis what randomized controlled trials aspire to do for single studies (Macdonald, 1999). Systematic reviews entail a series of techniques for minimizing bias and error, and as such systematic review and meta-analysis are widely regarded as providing 'high-quality' evidence. Figure 1 highlights the hierarchy of evidence in the medical sciences (Davies and Nutley, 1999).

In other disciplines such as education, social services and criminal justice there is often both less consensus regarding the appropriate methodology to be used for evaluating the evidence base, and little agreement as to how to use research evidence to inform policy and practice (Davies and Nutley, 1999; Laycock, 2000; Macdonald, 1999; Maclennan and More, 1999). Furthermore, policy questions are rarely addressed by the use of randomized controlled trials. For example, in social care the nature of evidence is often hotly disputed and there exists strong resistance to privileging one research method over another. Indeed, postmodern perspectives generally mistrust any notion of objective evidence.

Hierarchy of evidence

I-1   Systematic review and meta-analysis of two or more double-blind randomized controlled trials.
I-2   One or more large double-blind randomized controlled trials.
II-1  One or more well-conducted cohort studies.
II-2  One or more well-conducted case-control studies.
II-3  A dramatic uncontrolled experiment.
III   Expert committee sitting in review; peer leader opinion.
IV    Personal experience.

Figure 1. Hierarchies of evidence. Source: reproduced by kind permission of the publisher from Davies, H. T. O. and S. M. Nutley (1999). 'The Rise and Rise of Evidence in Health Care', Public Money & Management, 19(1), pp. 9-16. © 1999 Blackwell Publishing.
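One way to read the hierarchy in Figure 1 is as an ordinal grading scheme: each study design maps to a rank, and the 'best available evidence' on a question is simply the highest-ranked design present. The encoding below is a hypothetical illustration of that reading, not tooling from Davies and Nutley or from this paper; the labels are abbreviated paraphrases of the figure's levels.

```python
# Hypothetical encoding of the hierarchy of evidence:
# a lower rank number means stronger evidence.
HIERARCHY = {
    "systematic review/meta-analysis": 1,
    "large double-blind RCT": 2,
    "cohort study": 3,
    "case-control study": 4,
    "dramatic uncontrolled experiment": 5,
    "expert committee/peer leader opinion": 6,
    "personal experience": 7,
}

def best_available(designs):
    """Return the strongest study design present, per the hierarchy."""
    return min(designs, key=lambda d: HIERARCHY[d])

# If only weak and mid-strength designs exist, the cohort study wins out
evidence = ["personal experience", "cohort study", "case-control study"]
print(best_available(evidence))  # prints "cohort study"
```

The point of the sketch is precisely what the surrounding text disputes for other disciplines: such a grading only works where there is consensus that the ranking itself is legitimate.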

Divergences such as these are deeply rooted in the ontological and epistemological assumptions of specific fields. Despite these difficulties, Davies, Nutley and Smith argue optimistically:

'The different ontological and epistemological starting points in different professional traditions undoubtedly colour the methods and enthusiasm with which professionals engage with evidence. However, what is clear is that there remains in all of the areas examined great potential for research evidence to be vastly more influential than hitherto.' (2000, p. 4)

The nature of management research

The nature of the field of management research has been subject, over the years, to considerable analysis and discussion. Much of this discussion and debate has focused upon the ontological status of the field, particularly its fragmented and divergent nature. For example, Whitley (1984a, 1984b), in two influential articles, investigated the scientific status of management research as a 'practically oriented social science'. He identified its fragmented state and argued that the consequence of this is a:

'low degree of reputational control over significance standards ... (which) means that the significance of problems and preferred ways of formulating them are unstable, subject to disputes, and are assessed by diffused and diverse standards.' (Whitley, 1984a, p. 343)

Whitley (2000) further refined this position, suggesting that the continued fragmentation of the management field may displace academics as key stakeholders in the research process. In comparing management research with industrial, work and organizational psychology, Hodgkinson, Herriot and Anderson (2001, p. S45) also conclude that there is a considerable and widening divide between academics and other stakeholder groups and that 'this divergence is likely to further proliferate irrelevant theory and untheorized and invalid practice'.

Pettigrew (1997, p. 291), in much the same vein as Whitley, emphasized the significance of the social production of knowledge in management research, stressing stakeholder perspectives. His influential view was that management research faces a series of challenges:

'best captured in a series of concurrent double hurdles, which together raise a wide spectrum of cognitive, social and political demands on [the] skills and knowledge of [management] researchers.'

He argued for a thematic approach:

'to meet the double hurdle of embeddedness in the social sciences and the worlds of policy and practice' (Pettigrew, 1997, p. 292).

Berry (1995) offered a Gallic perspective, arguing strongly for the importance of qualitative work. Several writers (Aram and Salipante, 2000; Pfeffer and Sutton, 1999; Van de Ven, 1998; Wind and Nueno, 1998) have argued convincingly for the applied nature of management research. Likewise, Hambrick (1994) and Huff (2000) both used their addresses as President of the Academy of Management to discuss the ontological status of the field. More recently, Wilcoxson and Fitzgerald (2001) have focused on the nature of management as a discipline and the consequences of this for researchers and practitioners in an Australasian context, and Van Aken (2001) has developed a view of management research as a design science, rather than as a formal or explanatory science. By conceptualizing management research in this way, he identifies the need for the field to deliver output not only of high academic quality but also output which is practitioner- and context-sensitive. He argues that the mission of the design sciences is to develop valid and reliable knowledge in the form of 'field-tested and grounded technological rules' to be used in designing, configuring and implementing solutions to specific problems.

The 1990s saw an extensive debate concerning the nature of management research within the British Academy of Management, which focused on the ontological status of the field, and particularly the extent to which academic-practitioner relations were to be privileged. The work of Gibbons et al. (1994) on modes of knowledge production has become increasingly influential in such debates. In particular, their notion of Mode 2 knowledge production, where there is 'a constant flow back and forth between the theoretical and the practical' and where 'knowledge is produced in the context of application', has been argued to be central to debates about the future of management research (Tranfield and Starkey, 1998). Creating a management research which is both theoretically sound and methodologically rigorous as well as relevant to the practitioner community has been a theme explored by both the British Academy of Management and the Foundation for Management Education (Starkey and Madan, 2001). This discussion was developed further in a special issue of the British Journal of Management (Hodgkinson, 2001).

Comparing the management and medical fields

Tranfield and Starkey (1998), in an article which both reflected and drove the debate in the British Academy of Management, used Becher's (1989) dimensions drawn from the sociology of knowledge to characterize management research as 'soft' rather than 'hard', 'applied' rather than 'pure', 'rural' rather than 'urban', and 'divergent' rather than 'convergent'. The creation of such a profile, using dimensions drawn from the sociology of knowledge, enabled contrasts to be made with other disciplines, particularly medical science, where systematic review has been applied to considerable effect. Comparison can be made in both the epistemological and ontological realms.

Whereas medical research enjoys considerable and extensive epistemological consensus, this is untrue of management research in general. The consequential difficulties of establishing agreed thresholds for high-quality work result from this lack of consensus.

Key ontological differences between management research and medical science concern the dimension 'convergent-divergent'. The extent to which a discipline resides at one end of this dimension or the other is purported to depend upon similarities in research ideologies, values and quality judgements which create a shared sense of nationhood amongst researchers within the field (Becher, 1989). Agreements concerning the key research questions to be addressed lead to a relatively low tolerance of deviance, but have the advantage of defining disciplinary boundaries, making them easy to defend. Thus, the extent to which disciplines are opening up research questions, or addressing a previously defined and agreed agenda, dictates positioning on this dimension.

Management research is a relatively young field, far less well developed in terms of agenda and question formulation than much of medical science. As a result, there tends to be low consensus concerning key research questions in management research. Studies in the field rarely address identical problems, share a research agenda or, more importantly, ask the same questions. Therefore, it is unlikely that aggregative approaches to research synthesis, such as meta-analysis, will be appropriate in management research, as the heterogeneity of studies prevents the pooling of results and the measurement of the net effectiveness of interventions.

Table 1 outlines the similarities and differences between medical science, as an applied field of study stemming from the biological sciences, and management research, as an applied field with strong connections to the social sciences.

The main question here is to what extent review processes developed in fields that are striving to become evidence based, such as the more convergent field of medicine, can inform the review process in the management field to help create rigorous and relevant reviews. As management research questions need to be clearly specified, whether as replication of an existing study, as further development of an existing study, or as a new study to meet a defined 'gap' in the literature, a more systematic literature review process can help to justify and qualify the final research question which is posed. Furthermore, the process proposed in this paper values and takes steps to encourage participation by both academics and managers/policymakers, and is pragmatic in intent.

Systematic reviews have traditionally been applied in fields and disciplines privileging a positivist and quantitative tradition:

'Positivists seek cause-and-effect laws that aresufficiently generalizable to ensure that a knowledgeof prior events enables a reasonable predication ofsubsequent event ... Because positivists see knowl-edge as accumulating, they have been more inter-ested in developing approaches to research

Page 7: Towards a Methodology for Developing Evidence-Informed ... · British Journal of Management, Vol. 14. 207-222 (2003) Towards a Methodology for Developing Evidence-Informed Management

Developing Evidence-Informed Management Knowledge 213

Table I. Differences between medical research and management research

Medicine Management

Nature of the disciplineResearch culture

Research questionsInterventionsResearch designsTheory

Aims of policy

Weight of inputs into policyMethodsLiterature reviewsThe need for a review

Preparation of the review

Review protocol

Identifying research

Selection of studies

Study quality assessment

Data extraction

Data synthesis

Reporting andDissemination

Evidence into practice

Convergent.Subjected to rigorous scientific evaluation.

Medical science:
• High consensus over research questions.
• Can be measured through experiments.
• Based upon a hierarchy of evidence.
• Concerned with what works: did the intervention offer overall benefits?
• Generally reducing illness and death, and improving health.
• Scientific evidence, predominantly quantitative.
• Systematic review and meta-analysis.
• Reviews of effectiveness are used by clinical practitioners.
• A review panel (including practitioners) guides the process.
• A brief scoping study is conducted to delimit the subject area.
• A plan prior to the review states the criteria for including and excluding studies, the search strategy, a description of the methods to be used, coding strategies and the statistical procedures to be employed.
• Protocols are made available by international bodies to enhance networking and the exchange of knowledge.
• A comprehensive, structured search is conducted using predetermined keywords and search strings.
• Inclusion and exclusion criteria are expressed in the protocol to ensure a review of the best available evidence.
• Draws upon 'raw data' from 'whole studies' for analysis, to create a study in its own right.
• Studies are assessed against predetermined criteria; the internal validity of the study is judged. Assessing and including qualitative studies is problematic.
• Data-extraction forms are used which act as a historical record of the decisions made during the process and provide the basis on which to conduct data synthesis.
• A qualitative synthesis provides a tabulation of key characteristics and results. Meta-analysis pools the data across studies to increase the power of statistical analysis. Aims to generate 'best' evidence.
• Standardized reporting structures are used and a non-explanatory style is adopted. Short scripts are recorded and made widely available through internationally recognized institutions. Comprehensible by practitioners.
• Collaborative process, practice-oriented.

Management research:
• Divergent field, split between positivist and phenomenological perspectives.
• Low consensus over research questions.
• Experimentation may or may not be feasible; triangulation is recommended.
• Concerned with why something works or does not work, and the context in which this occurs.
• Outcomes are multiple and competing, and the balance between them may change over time. Many extraneous factors.
• Quantitative and qualitative evidence.
• Largely narrative reviews.
• Reviews are used to develop a research question and inform empirical research practice.
• Usually an informal/ad hoc process involving the researcher, peers and supervisor.
• The level of formality and standardization in designing/adopting protocols is usually low. It is considered unacceptable to 'tightly' plan the literature review, as this may inhibit the researcher's capacity to explore, discover and develop ideas.
• Identifying a field/sub-fields of study generally occurs through informal consultation. Implicit, idiosyncratic methods of data collection are used, based on studies that appear relevant or interesting.
• Researcher bias disables critical appraisal. Decisions regarding choice are not recorded, precluding any audit trail.
• 'Raw data' are often not available in academic articles, which usually represent 'partial studies'. Precise inclusion/exclusion criteria are often not formally agreed, applied, recorded or monitored.
• Poor evaluation of the fit between research methodology and research questions. Researchers tend to rely on the quality rating of a particular journal, rather than applying quality assessment criteria to individual articles.
• Data extraction is not formally guided by explicitly stated inclusion and exclusion criteria. Data extracted are not comprehensively recorded and monitored.
• Synthesis is generally narrative and qualitative, with higher levels of subjectivity associated with what is taken from an article for analysis and synthesis. Explicit descriptive and thematic analysis is lacking, although specific tools and techniques from the field of qualitative data analysis are increasingly applied.
• Non-standardized reporting structures; interpretive long scripts. Explanatory power is improved through the use of analogy, metaphor and homology. The process of knowledge production is omitted. Sometimes incomprehensible by practitioners; links between different literatures are lacking.
• Implementation of evidence is often an afterthought.

Page 8: Towards a Methodology for Developing Evidence-Informed ... · British Journal of Management, Vol. 14. 207-222 (2003) Towards a Methodology for Developing Evidence-Informed Management

214 D. Tranfield, D. Denyer and P. Smart

synthesis than have interpretivists.' (Noblit and Hare, 1988, p. 12)

Indeed, researchers from an interpretivist or phenomenological position may suggest that systematic reviews, with their positivist leanings, should not be adopted in the social sciences. Even within medical research, not everybody accepts that systematic reviews are necessary or desirable (Petticrew, 2001). Petticrew (2001, p. 98) argues that the concern over systematic review has been fuelled by the fact that they are often presented as synonymous with a numerical aggregation of the results of individual studies through a process of meta-analysis and 'that they are incapable of dealing with other forms of evidence, such as from non-randomized studies or qualitative research'. However, meta-analysis 'is simply one of the tools, albeit a particularly important one, that is used in preparing systematic reviews' (Mulrow, Cook and Davidoff, 1997, p. 290). In most systematic reviews the heterogeneity of study data prevents the use of meta-analysis. In these cases, synthesis is achieved through summarizing the findings of a group of studies. Alternative methods of research synthesis such as realist synthesis, meta-synthesis and meta-ethnography have also been developed to draw comparisons and conclusions from a collection of studies through interpretative and inductive methods. Whilst there are fundamental differences between meta-analysis and qualitative research synthesis (Campbell, Pound, Pope, Britten, Pill, Morgan and Donovan, 2003), both are concerned with 'putting together' (Noblit and Hare, 1988, p. 7) findings from a number of empirical studies in some coherent way (Dingwall, Murphy, Watson, Greatbatch and Parker, 1998).

The following section of the paper reports the systematic review methodology used in medical science, seeks to tease out the key characteristics of the approach, highlights the key challenges in transferring the model to the management field and presents a number of recommendations on how these may be addressed.

Conducting a systematic review

Despite the relative infancy of systematic review, a reasonable consensus has emerged as to its desirable methodological characteristics (Davies and Crombie, 1998). The Cochrane Collaboration's Cochrane Reviewers' Handbook (Clarke and Oxman, 2001) and the NHS Centre for Reviews and Dissemination (2001) provide a list of stages in conducting a systematic review (see Figure 2).

Stage I: planning the review

Prior to beginning the review a review panel is formed encompassing a range of experts in the areas of both methodology and theory. Efforts should be made to include practitioners working in the field on the panel. The review panel should help direct the process through regular meetings and resolve any disputes over the inclusion and exclusion of studies. The initial stages of systematic reviews may be an iterative process of definition, clarification and refinement (Clarke and Oxman, 2001). Within management it will be necessary to conduct scoping studies to assess the relevance and size of the literature and to delimit the subject area or topic. Such studies need to consider cross-disciplinary perspectives and alternative ways in which a research topic has previously been tackled. The scoping study may also include a brief overview of the theoretical, practical and methodological history and debates surrounding the field and sub-fields of

Stage I - Planning the review

Phase 0 - Identification of the need for a review
Phase 1 - Preparation of a proposal for a review
Phase 2 - Development of a review protocol

Stage II - Conducting a review

Phase 3 - Identification of research

Phase 4 - Selection of studies

Phase 5 - Study quality assessment

Phase 6 - Data extraction and monitoring progress

Phase 7 - Data synthesis

Stage III - Reporting and dissemination

Phase 8 - The report and recommendations
Phase 9 - Getting evidence into practice

Figure 2. Stages of a systematic review. Source: adapted by kind permission of the publisher from NHS Centre for Reviews and Dissemination (2001), Undertaking Systematic Reviews of Research on Effectiveness: CRD's Guidance for those Carrying Out or Commissioning Reviews, CRD Report Number 4 (2nd edition). © 2001 NHS Centre for Reviews and Dissemination, University of York.



study. Where fields comprise semi-independent and autonomous sub-fields, this process may prove difficult and the researcher is likely to struggle with the volume of information and the creation of transdisciplinary understanding.

Within medical science the researcher will also arrive at a definitive review question. The review question is critical to systematic review, as other aspects of the process flow from it. In systematic review the outcome of these decisions is captured through a formal document called a review protocol. The protocol is a plan that helps to protect objectivity by providing explicit descriptions of the steps to be taken. The protocol contains information on the specific questions addressed by the study, the population (or sample) that is the focus of the study, the search strategy for the identification of relevant studies, and the criteria for inclusion and exclusion of studies in the review (Davies and Crombie, 1998). Once protocols are complete they are registered with the appropriate review-group editors, such as the Cochrane Collaboration. If satisfactory, the protocol is published to encourage interested parties to contact the reviewers and to avoid duplication of studies.

Any management review protocol may contain a conceptual discussion of the research problem and a statement of the problem's significance rather than a defined research question. Furthermore, management reviews are often regarded as a process of exploration, discovery and development. Therefore, it is generally considered unacceptable to plan the literature-review activities closely. A more flexible approach may make explicit what the researcher intends to do a priori but can be modified through the course of the study. The researcher needs to state explicitly what changes have been made and the rationale for doing so. The aim is to produce a protocol that does not compromise the researcher's ability to be creative in the literature review process, whilst also ensuring that reviews are less open to researcher bias than are the more traditional narrative reviews.

Stage II: conducting the review

A comprehensive, unbiased search is one of the fundamental differences between a traditional narrative review and a systematic review. Although sometimes taking considerable time,

and almost always requiring perseverance and attention to detail, systematic review has been argued to provide the most efficient and high-quality method for identifying and evaluating extensive literatures (Mulrow, 1994). A systematic search begins with the identification of keywords and search terms, which are built from the scoping study, the literature and discussions within the review team. The reviewer should then decide on the search strings that are most appropriate for the study. The search strategy should be reported in detail sufficient to ensure that the search could be replicated. Searches should not only be conducted in published journals listed in bibliographic databases, but also comprise unpublished studies, conference proceedings, industry trials, the Internet and even personal requests to known investigators. The output of the information search should be a full listing of articles and papers (core contributions) on which the review will be based.
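The step from predetermined keywords to a documented, replicable search string can be sketched in code. The helper function and keyword groups below are purely illustrative assumptions, not part of any protocol described in this paper:

```python
# A minimal sketch: synonyms within a concept are combined with OR,
# and the concept groups are then joined with AND, yielding a single
# search string that can be recorded verbatim in the review protocol.
def build_search_strings(concept_groups):
    """Combine hypothetical synonym groups into one Boolean search string."""
    ored = ["(" + " OR ".join(f'"{term}"' for term in group) + ")"
            for group in concept_groups]
    return " AND ".join(ored)

# Invented example groups: one concept per line.
groups = [
    ["systematic review", "research synthesis"],   # concept 1: the method
    ["management", "organization*"],               # concept 2: the field
]
print(build_search_strings(groups))
# -> ("systematic review" OR "research synthesis") AND ("management" OR "organization*")
```

Because the string is generated from explicitly listed keyword groups, the search can be reported in full and replicated, as the text requires.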

Only studies that meet all the inclusion criteria specified in the review protocol and which manifest none of the exclusion criteria need be incorporated into the review. The strict criteria used in systematic review are linked to the desire to base reviews on the best-quality evidence. As decisions regarding inclusion and exclusion remain relatively subjective, this stage of the systematic review might be conducted by more than one reviewer. Disagreements can be resolved within the review panel. The process of selecting studies in systematic review involves several stages. The reviewer will initially conduct a review of all potentially relevant citations identified in the search. Relevant sources will be retrieved for a more detailed evaluation of the full text and from these some will be chosen for the systematic review. The number of sources included and excluded at each stage of the review is documented, with the reasons for exclusions.
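The staged selection just described, with its documented counts of exclusions at each stage, might be sketched as follows. The citation records, stage names and criteria here are invented for illustration only:

```python
# Staged selection with an audit trail: each stage applies a predicate
# to the surviving citations and records how many were excluded, so the
# flow from initial citations to included studies can be reported.
def select_studies(citations, stages):
    """Apply each (stage_name, keep_predicate) in turn, recording exclusions."""
    audit = []
    remaining = list(citations)
    for name, keep in stages:
        kept = [c for c in remaining if keep(c)]
        audit.append((name, len(remaining) - len(kept)))  # number excluded
        remaining = kept
    return remaining, audit

# Hypothetical citation records with the attributes the stages test.
citations = [
    {"title": "A", "empirical": True,  "full_text": True},
    {"title": "B", "empirical": False, "full_text": True},
    {"title": "C", "empirical": True,  "full_text": False},
]
stages = [
    ("title/abstract screen", lambda c: c["empirical"]),
    ("full-text evaluation", lambda c: c["full_text"]),
]
included, audit = select_studies(citations, stages)
print(audit)          # exclusions per stage, for the review report
print(len(included))  # studies entering the systematic review
```

Keeping the audit list alongside the included set is what preserves the documented reasons-for-exclusion trail the text calls for.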

Within the medical domain there is a tension between the statistical benefits of including a large number of primary studies and conducting high-quality reviews of fewer studies with the use of more selective methodological criteria of inclusion and exclusion (Davies, 2000). Quality assessment refers to the appraisal of a study's internal validity and the degree to which its design, conduct and analysis have minimized biases or errors. Individual studies in systematic review are judged against a set of predetermined



criteria, with checklists to assist the process (Oxman, 1994). The relevance of a study to the review depends on the relevance of its research questions and the quality of its methodology. The reviewer should avoid including:

'all studies that meet broad standards in terms of independent and dependent variables, avoiding any judgement of quality.' (Slavin, 1986, p. 6)

Systematic reviews, due to their positivistic origins, sit comfortably with studies that use quantitative methods such as randomized controlled trials, quasi-experimental designs, and cost-benefit and cost-effectiveness studies. Therefore, establishing criteria for ascertaining what is 'relevant' or 'good quality' in qualitative research provides a further challenge (Engel and Kuzel, 1992). With qualitative studies there is no possibility of testing statistically the significance of the results. Qualitative research, by its very nature:

'is non-standard, unconfined, and dependent on the subjective experience of both the researcher and the researched ... it is debatable, therefore, whether an all-encompassing critical appraisal checklist along the lines of the Users' Guides to the Medical Literature could ever be developed' (Greenhalgh and Taylor, 1997, p. 741).

Several authors have presented a range of criteria that might be used to appraise and evaluate qualitative studies (Blaxter, 1996; Greenhalgh and Taylor, 1997; Mays and Pope, 2000; Popay, Rogers and Williams, 1998). Popay, Rogers and Williams (1998) suggest that a quality assessment would include the following:

• a primary marker: is the research aiming to explore the subjective meanings that people give to particular experiences and interventions?;
• context sensitive: has the research been designed in such a way as to enable it to be sensitive/flexible to changes occurring during the study?;
• sampling strategy: has the study sample been selected in a purposeful way shaped by theory and/or attention given to the diverse contexts and meanings that the study is aiming to explore?;
• data quality: are different sources of knowledge/understanding about the issues being explored or compared?;
• theoretical adequacy: do researchers make explicit the process by which they move from data to interpretation?;
• generalizability: if claims are made to generalizability, do these follow logically and/or theoretically from the data?

Sandelowski, Docherty and Emden (1997) claim that checklists, when applied to qualitative studies, should be used with caution if they are used as a basis on which to exclude studies from a review. They go on to argue that any decisions regarding exclusion must be supported by a detailed explanation of the reviewer's conception of 'good' and 'bad' studies and the reasons for exclusion.

Whereas systematic reviews draw upon 'raw data', in management research these data are often not made available in articles by authors. In many cases the articles represent only the results of partial studies that satisfy the orientation of the editors of a particular journal. Therefore, the decisions regarding the selection of studies actually become decisions about the selection of 'articles', based on the more subjective findings and conclusions of the author rather than on the 'raw' data:

'It is highly unlikely that such a synthesis will involve a re-analysis of primary data, which may be in the form of transcripts from interviews, or field-notes from studies involving participant observation. Rather, the data to be analysed are most likely to be the findings of the studies involved. These might take the form of substantive themes arising, for example, from in-depth interviews. Within qualitative research (and arguably all research) theory plays a pivotal role in informing the interpretation of data. Whilst few authors appear to have considered the role for theory-led synthesis of findings across studies, an argument can be made for exploring the potential for this approach.' (Clarke and Oxman, 2001, section 4, p. 20)

Systematic reviews expose studies to rigorous methodological scrutiny. Within the management field it may be possible to conduct a quality assessment of the research articles by evaluating the fit between research methodology and research questions. However, management researchers usually rely on the implicit quality rating of a particular journal, rather than formally applying any quality assessment criteria to the articles they include in their reviews (i.e. refereed journals are 'better' than practitioner journals). The difficulty in specifying and conducting quality assessments of studies is a major challenge in developing a systematic review methodology for management research.

To reduce human error and bias, systematic reviews employ data-extraction forms. These



often contain general information (title, author, publication details), study features and specific information (details and methods), and notes on emerging themes coupled with details of synthesis. The Cochrane Collaboration states that data-extraction forms serve at least three important functions. First, the form is directly linked to the formulated review question and the planned assessment of the incorporated studies, providing a visual representation of these. Second, the extraction form acts as a historical record of the decisions made during the process. Third, the data-extraction form is the data repository from which the analysis will emerge (Clarke and Oxman, 2001).

The data-extraction process requires documentation of all steps taken. In many cases double extraction processes are employed, where two independent assessors analyse a study and their findings are compared and reconciled if required. Data extraction can be paper based or computer based. The development of the data-extraction sheets is flexible and may depend upon the nature of the study. When devising the form, reviewers should consider the information that will be needed to construct summary tables and to perform data synthesis. Data-extraction forms should include details of the information source (title, authors, journal, publication details) and any other features of the study, such as population characteristics, context of the study and an evaluation of the study's methodological quality. Links to other concepts, identification of emergent themes, and key results and additional notes also need to be included on the data-extraction form.
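One way to picture such a form is as a structured record. The field names below are illustrative choices based on the items listed above, not a standard Cochrane template:

```python
# A sketch of a data-extraction form as a structured record, so that
# every study yields the same fields for summary tables and synthesis.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractionForm:
    # Information source.
    title: str
    authors: List[str]
    journal: str
    publication_details: str
    # Study features and quality evaluation.
    population: str = ""
    context: str = ""
    methodological_quality: str = ""
    # Material feeding the synthesis.
    linked_concepts: List[str] = field(default_factory=list)
    emergent_themes: List[str] = field(default_factory=list)
    key_results: str = ""
    notes: str = ""  # decisions taken, forming the historical record

# A hypothetical completed form for one core contribution.
form = ExtractionForm(
    title="An example study",
    authors=["A. Author"],
    journal="Example Journal",
    publication_details="vol. 1, pp. 1-10",
)
form.emergent_themes.append("practitioner engagement")
print(form.emergent_themes)
```

Because every study is captured with identical fields, the forms double as the audit trail and the repository from which descriptive and thematic analysis can later be tabulated.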

Research synthesis is the collective term for a family of methods for summarizing, integrating and, where possible, cumulating the findings of different studies on a topic or research question (Mulrow, 1994). The simplest and best-known form of research synthesis is a narrative review that attempts to identify what has been written on a subject or topic. Such reviews make no attempt to seek generalization or cumulative knowledge from what is reviewed (Greenhalgh, 1997). Meta-analysis is an alternative approach to synthesis, which enables the pooling of data from individual studies to allow for an increase in statistical power and a more precise estimate of effect size (Glass, 1976). Within management research, few studies address the same research

question and measure the phenomenon in the same way. Furthermore, researchers are less concerned with the effectiveness of certain classes of intervention, and rather more concerned with understanding organizations and management processes. Therefore, it is unlikely that meta-analysis will be appropriate in management research.
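The statistical pooling that meta-analysis performs can be illustrated with a minimal fixed-effect (inverse-variance) sketch; the effect sizes and standard errors below are hypothetical:

```python
import math

def pooled_effect(effects, std_errors):
    """Fixed-effect (inverse-variance) pooling of study effect sizes.

    Each study's estimate is weighted by the inverse of its variance,
    so larger, more precise studies contribute more to the pooled
    estimate -- the source of the gain in statistical power.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies reporting the same standardized effect.
effects = [0.30, 0.55, 0.40]
std_errors = [0.10, 0.20, 0.15]
est, se = pooled_effect(effects, std_errors)
print(f"pooled effect = {est:.3f}, standard error = {se:.3f}")
```

The sketch also makes the paragraph's caveat concrete: the calculation only makes sense when studies measure the same phenomenon on the same scale, which is rarely the case in management research.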

A number of authors have offered interpretive and inductive approaches to research synthesis, which are more likely to provide a means of drawing insight from studies and of addressing issues pertinent to management research. Some authors contend that there are a number of philosophical and practical problems associated with 'summing up' qualitative studies, whilst others argue that attempts to 'synthesize existing studies are seen as essential to reaching higher analytic goals and also enhancing the generalizability of qualitative research' (Sandelowski, Docherty and Emden, 1997, p. 367). Two interpretive and inductive methods, realist synthesis and meta-synthesis, have been developed to fill the gap between narrative reviews and meta-analysis.

For Pawson (2001), realist synthesis offers one technique for producing a synthesis of a range of study types. He argues that in medical research, programmes (such as medical treatments) carry the potential for change. The aim of a systematic review is to classify these programmes and to conduct a meta-analysis to provide a reliable measure of net effect. The practitioner is invited to replicate the treatment that has worked to maximum effect. In contrast, narrative reviews tend to explain the combination of attributes in a programme and generally identify exemplars of best practice. The practitioner is invited to imitate the programmes that are successful. According to Pawson, it is not programmes that work; rather it is the underlying reasons or resources that they offer subjects that generate change. Whether change occurs is also dependent on the nature of the actors and the circumstances of the programme. Realist synthesis captures a list of vital ingredients or mechanisms (positive or negative) that underpin each individual programme. The researcher then builds theory by accumulating understanding across a range of programmes. Whilst some scholars would question whether contingency statements could ever be developed, Pawson (2001) argues that a realist



synthesis can provide a transferable programme theory in the form of 'what works for whom in what circumstances'.

Meta-synthesis also offers an interpretative approach to research synthesis, which can be used to identify the:

'theories, grand narratives, generalizations, or interpretative translations produced from the integration or comparison of findings from qualitative studies.' (Sandelowski, Docherty and Emden, 1997, p. 366)

Unlike meta-analysis, meta-synthesis is not limited to synthesizing strictly comparable studies, proceeding by constructing 'interpretations, not analyses, and by revealing the analogies between accounts' (Noblit and Hare, 1988, p. 8). Meta-synthesis provides a means of taking into account:

'all important similarities and differences in language, concepts, images, and other ideas around a target experience.' (Sandelowski, Docherty and Emden, 1997, p. 369)

Meta-ethnography is a method of meta-synthesis that offers three alternative techniques for synthesizing studies. 'Refutational synthesis' can be used when reports give conflicting representations of the same phenomenon, 'reciprocal translations' can be used where reports address similar issues, and 'lines-of-argument synthesis' can be used if different reports examine different aspects of the same phenomenon. A meta-ethnography is analogous with a grounded theory approach, with open coding identifying categories emerging from the data and constant comparisons being made between individual accounts (Beck, 2001). The categories are then linked interpretively to provide a holistic account of the whole phenomenon (Suri, 1999).

Many of the techniques of meta-synthesis remain 'either relatively untried and undeveloped, and/or difficult to codify and understand' (Sandelowski, Docherty and Emden, 1997, p. 369). However, both realist synthesis and meta-synthesis challenge the positivistic orthodoxy that surrounds contemporary approaches to research reviews, demonstrating that a synthesis can be an interpretive, inductive, hermeneutic and eclectic process (Jensen and Allen, 1996). Whilst meta-synthesis and realist synthesis approaches are fundamentally different to systematic reviews, and in particular meta-analysis, they both share a

desire to improve upon traditional narrative reviews by adopting explicit and rigorous processes and by:

'the bringing together of findings on a chosen theme, the results of which should be to achieve a greater level of understanding and attain a level of conceptual or theoretical development beyond that achieved in any individual empirical study.' (Campbell et al., 2002, p. 2)

As in systematic reviews, the aim of realist syntheses and meta-syntheses is to 'have impact' by being 'presented in an accessible and usable form in the real world of practice and policy making' (Sandelowski, Docherty and Emden, 1997, p. 365).

Stage III: reporting and dissemination

A good systematic review should make it easier for the practitioner to understand the research by synthesizing the extensive primary research papers from which it was derived. Within management research a two-stage report might be produced. The first stage would provide a full (rough-cut and detailed) 'descriptive analysis' of the field, achieved by applying a very simple set of categories with the use of the extraction forms. For example: who are the authors? How many of the core contributions are from the USA, and how many are European? What is the age profile of the articles? Can the field be divided into epochs in terms of the volume or orientation of study? Do simple categories divide up the field; for example, can the field be divided sectorally, by gender, or by simple categories 'borrowed' from associated cognate disciplines such as psychology or sociology (interpretivist versus positivist, or behavioural versus cognitive studies, for example)? The researcher should be able to provide a broad-ranging descriptive account of the field with specific exemplars and an audit trail justifying his/her conclusions.
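The kind of simple tabulation this descriptive analysis calls for might be sketched as follows; the records and category labels are hypothetical stand-ins for completed extraction forms:

```python
# Tabulating core contributions by simple categories (here, region and
# epoch) drawn from the data-extraction forms, as a first 'rough-cut'
# descriptive analysis of the field. The records are invented examples.
from collections import Counter

core_contributions = [
    {"region": "USA", "year": 1988},
    {"region": "Europe", "year": 1995},
    {"region": "USA", "year": 2001},
]

by_region = Counter(c["region"] for c in core_contributions)
by_epoch = Counter("pre-1990" if c["year"] < 1990 else "1990s onwards"
                   for c in core_contributions)
print(by_region)  # counts of core contributions per region
print(by_epoch)   # counts per epoch
```

Because each count is derived directly from the extraction-form records, every cell of the resulting tabulation can be traced back to specific core contributions, preserving the audit trail the text asks for.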

Researchers also need to report the findings of a 'thematic analysis', whether the results were derived through an aggregative or an interpretative approach, outlining what is already known and established from the data-extraction forms of the core contributions. They may wish to focus on the extent to which consensus is shared across various themes. They may also want to identify key emerging themes



and research questions. Whatever overarching categories are chosen for the tabulation, researchers should again provide a detailed audit trail back to the core contributions to justify and ground their conclusions. Linking themes across the various core contributions wherever possible, and highlighting such links, is an important part of the reporting process.

Systematic review provides a means for practitioners to use the evidence provided by research to inform their decisions. However, turning the conclusions from systematic reviews into guidelines for practice has been a challenge in medicine (Macdonald, 1999), as 'clinicians reason about individual patients on the basis of analogy, experience, heuristics, and theory, as well as evidence' (Cook, Mulrow and Haynes, 1997, p. 380). Decision-makers are likely, and should be encouraged, to use personal experience and problem-solving skills rather than relying solely on the results of systematic reviews (Bero and Rennie, 1995; Rosenberg and Donald, 1995). Within management there is a need to recognize that evidence alone is often insufficient and incomplete, only informing decision-making by bounding available options. Therefore, the terms 'evidence informed' or even 'evidence aware', rather than 'evidence based' (Nutley, Davies and Walter, 2002; Nutley and Davies, 2002), may be more appropriate in the management field, and the former has influenced our choice of title for this paper.

Improving the translation of research evidence into practice is not unproblematic, as the 'relationships between research, knowledge, policy and practice are always likely to remain loose, shifting and contingent' (Nutley and Davies, 2002, p. 11). For evidence-informed practice to be achieved, strategies need to be developed which encourage the uptake and utilization of evidence and which move beyond the simple construction and dissemination of the research base (Nutley and Davies, 2000). Encouraging practitioners to set specific questions for reviews and to engage in the process may help in developing a 'context-sensitive' science (Nowotny, Scott and Gibbons, 2001), which may help to blur the boundaries between science, policy and practice. Increasing the precision of a reliable evidence base, in order that policymakers and practitioners can make more sensitive judgements, is the ultimate aim of the application of systematic review procedures to management research.

Conclusions

This paper began by arguing that reviews of existing research evidence in the management field lack both rigour and relevance. Anderson, Herriot and Hodgkinson (2001) offer a four-fold characterization of applied social science. They term research that is low on rigour but high on relevance 'Popularist Science'. In contrast, 'Pedantic Science' is high on rigour but low on relevance, whereas 'Puerile Science' is neither rigorous nor relevant. Only 'Pragmatic Science' balances both rigour and relevance (see Figure 3).

They acknowledge that the pursuit of 'pragmatic' research:

'that genuinely bears the hallmarks of scientific rigour (irrespective of whether it be quantitative and/or qualitative in nature), but which also engages a wider body of stakeholders in the knowledge production process, presents a set of formidable challenges for the management research community at this juncture.' (Hodgkinson, Herriot and Anderson, 2001, p. S46)

Theoretical and methodological rigour / practical relevance:

Quadrant 1: 'Popularist Science' (low rigour, high relevance)
Quadrant 2: 'Pragmatic Science' (high rigour, high relevance)
Quadrant 3: 'Puerile Science' (low rigour, low relevance)
Quadrant 4: 'Pedantic Science' (high rigour, low relevance)

Figure 3. A four-fold typology of research in industrial, work and organizational psychology. Source: adapted by G. P. Hodgkinson, P. Herriot and N. Anderson (2001), British Journal of Management, 12 (Special Issue), p. S42, from N. Anderson, P. Herriot and G. P. Hodgkinson, 'The practitioner-researcher divide in industrial, work and organizational (IWO) psychology: where are we now, and where do we go from here?', Journal of Occupational and Organizational Psychology, 74, pp. 391-411. © 2001 The British Psychological Society and the British Academy of Management. Reproduced by kind permission of both publishers.

This paper has outlined the opportunities and challenges in applying ideas and methods developed in medical science to the field of management, with the aim of further developing and enhancing the quality of management reviews and ensuring that they are practitioner and context sensitive. The aim of systematic review is to provide collective insights through theoretical synthesis into fields and sub-fields. For academics, the reviewing process increases methodological rigour. For practitioners/managers, systematic review helps develop a reliable knowledge base by accumulating knowledge from a range of studies. In so doing the researcher may be able to develop a set of 'field tested and grounded technological rules' (Van Aken, 2001, p. 1). In this sense, systematic review can be argued to be at the heart of a 'pragmatic' management research, which aims to serve both academic and practitioner communities.

References

Anderson, N,, P, Herriot and G, P, Hodgkinson (2001), 'ThePractitioner-Researcher Divide in Industrial, Work andOrganizational (IWO) Psychology: Where Are We Now,and Where Do We Go From Here?', Journal of Occupationaland Organizational Psychology, 74 (4), pp, 391-411,

Aram, J, D, and J, P. F, Salipante (2000), 'Applied Research inManagement: Criteria for Management Educators and forPractitioner-Scholars', US Academy of Management Confer-ence—Multiple Perspectives on Learning in ManagementEducation, Toronto.

Becher, A, (1989), Academic Tribes and Territories: IntellectualEnquiry and the Cultures of Disciplines, The Society forResearch into Higher Education and the Open UniversityPress, Milton Keynes.

Beck, C, T, (2001), 'Caring with nursing education: A meta-synthesis'. Journal of Nursing Education, 40 (3), pp, 101-110,

Bero, L. and D. Rennie (1995). 'The Cochrane Collaboration: Preparing, Maintaining and Disseminating Systematic Reviews of the Effects of Health Care', Journal of the American Medical Association, 274 (24), pp. 1935-1938.

Berry, M. (1995). 'Research and the Practice of Management: A French View', Organization Science, 6 (2), pp. 104-116.

Blake, R. R. and J. S. Mouton (1976). Consultation, Addison-Wesley Publishing, Reading, MA.

Blaxter, M. (1996). 'Criteria for the Evaluation of Qualitative Research Papers', Medical Sociology News, 22 (1), pp. 68-71.

Campbell, R., P. Pound, C. Pope, N. Britten, R. Pill, M. Morgan and J. Donovan (2003). 'Evaluating meta-ethnography: a synthesis of qualitative research on lay experiences of diabetes and diabetes care', Social Science and Medicine, 56 (4), pp. 671-684.

Campbell Collaboration (2001). http://campbell.gse.upenn.edu/about.htm.

Clarke, M. and A. D. Oxman (Eds) (2001). Cochrane Reviewers' Handbook 4.1.4 [updated October 2001], The Cochrane Library, Oxford.

Cochrane Collaboration (2001). The Cochrane Brochure, http://www.cochrane.org/cochrane/cc-broch.htm#BDL.

Cook, D. J., N. L. Greengold, A. G. Ellrodt and S. R. Weingarten (1997). 'The Relation Between Systematic Reviews and Practice Guidelines', Annals of Internal Medicine, 127 (3) August, pp. 210-216.

Cook, D. J., C. D. Mulrow and R. B. Haynes (1997). 'Systematic Reviews: Synthesis of Best Evidence for Clinical Decisions', Annals of Internal Medicine, 126 (5) March, pp. 376-380.

Dabinett, G., P. Lawless, J. Rhodes and P. Tyler (2001). A Review of the Evidence Base for Regeneration Policy and Practice, Department of the Environment, Transport and the Regions.

Davies, H. T. O. and I. K. Crombie (1998). 'Getting to Grips with Systematic Reviews and Meta-Analyses', Hospital Medicine, 59 (12), pp. 955-958.

Davies, H. T. O. and S. M. Nutley (1999). 'The Rise and Rise of Evidence in Health Care', Public Money & Management, 19 (1), pp. 9-16.

Davies, H. T. O., S. M. Nutley and P. C. Smith (2000). 'Editorial: What Works? The Role of Evidence in Public Sector Policy and Practice', Public Money & Management, 19 (1), pp. 3-5.

Davies, H. T. O., S. M. Nutley and N. Tilley (1999). 'Editorial: Getting Research into Practice', Public Money & Management, 20 (4), pp. 17-22.

Davies, P. (2000). The Relevance of Systematic Reviews to Educational Policy and Practice, http://www.jiscmail.ac.uk/files/BEME/oxreview.htm.

Dingwall, R., E. A. Murphy, P. Watson, D. Greatbatch and S. Parker (1998). 'Catching Goldfish: Quality in Qualitative Research', Journal of Health Services Research and Policy, 3 (3), pp. 167-172.

Economic and Social Research Council (1999). A History of the ESRC UK Centre for Evidence Based Policy and Practice, http://www.evidencenetwork.org/home.asp.

Engel, J. D. and A. J. Kuzel (1992). 'On the idea of what constitutes good qualitative inquiry', Qualitative Health Research, 2, pp. 504-510.

Estabrooks, C. A., P. A. Field and J. M. Morse (1994). 'Aggregating qualitative findings: An approach to theory development', Qualitative Health Research, 4, pp. 503-511.

Evans, D. and A. Pearson (2001). 'Systematic Reviews: Gatekeepers of Nursing Knowledge', Journal of Clinical Nursing, 10 (5), pp. 593-599.

Fink, A. (1998). Conducting Research Literature Reviews: From Paper to the Internet, Sage Publications, London.

Friedman, A., C. Durkin, M. Phillips and E. Voltsinger (2000). The Future of UK Professional Associations, Proceedings of the 5th Annual Professional Association Research Network Conference.

Furniss, J. S. and M. Nutley (2000). 'Implementing What Works with Offenders: The Effective Practice Initiative', Public Money & Management, 20 (4), pp. 23-28.

Gibbons, M., C. Limoges, H. Nowotny, S. Schwartzman, P. Scott and M. Trow (1994). The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, Sage Publications, London.

Glass, G. V. (1976). 'Primary, Secondary, and Meta-analysis of Research', Educational Researcher, 5 (2), pp. 3-8.

Greenhalgh, T. (1997). 'Papers that Summarise Other Papers (Systematic Reviews and Meta-analyses)', British Medical Journal, 315 (7109), pp. 672-675.

Greenhalgh, T. and R. Taylor (1997). 'Papers that go Beyond Numbers (Qualitative Research)', British Medical Journal, 315 (7110), pp. 740-743.

Halladay, M. and L. Bero (2000). 'Implementing Evidence-based Practice in Health Care', Public Money & Management, 20 (4), pp. 43-50.

Hambrick, D. C. (1994). 'What if the Academy Actually Mattered?', Academy of Management Review, 19 (1), pp. 11-16.

Hart, C. (1998). Doing a Literature Review: Releasing the Social Science Research Imagination, Sage Publications, London.

HM Inspectorate of Probation (1998). Strategies for Effective Offender Supervision: Report of the HMIP What Works Project, Home Office, London.

Hodgkinson, G. P. (Ed.) (2001). 'Facing the future: the nature and purpose of management research re-assessed', British Journal of Management, 12 (Special Issue), pp. S1-S80.

Hodgkinson, G. P., P. Herriot and N. Anderson (2001). 'Re-aligning the Stakeholders in Management Research: Lessons from Industrial, Work and Organizational Psychology', British Journal of Management, 12 (Special Issue), pp. S41-S48.

Home Office (1998). Reducing Offending: An Assessment of Research Evidence on Ways of Dealing with Offending Behaviour, Home Office Research and Statistics Directorate, London.

Huff, A. S. (2000). 'Changes in Organizational Knowledge Production', Academy of Management Review, 25 (2), pp. 288-293.

Jensen, L. A. and M. N. Allen (1996). 'Meta-synthesis of qualitative findings', Qualitative Health Research, 6 (4), pp. 553-560.

Laycock, G. (2000). 'From Central Research to Local Practice: Identifying and Addressing Repeat Victimization', Public Money & Management, 19 (1), pp. 17-22.

Macdonald, G. (1999). 'Evidence-based Social Care: Wheels off the Runway?', Public Money & Management, 19 (1), pp. 25-32.

Maclennan, D. and A. More (1999). 'Evidence, What Evidence? The Foundations for Housing Policy', Public Money & Management, 19 (1), pp. 17-24.

Mays, N. and C. Pope (2000). 'Assessing Quality in Qualitative Research', British Medical Journal, 320 (January), pp. 50-52.

Mulrow, C. D. (1987). 'The Medical Review Article: State of the Science', Annals of Internal Medicine, 106, pp. 485-488.

Mulrow, C. D. (1994). 'Systematic Reviews: Rationale for Systematic Reviews', British Medical Journal, 309 (6954), pp. 597-599.

Mulrow, C. D., D. J. Cook and F. Davidoff (1997). 'Systematic reviews: Critical links in the great chain of evidence', Annals of Internal Medicine, 126, pp. 389-391.

National Institute for Clinical Excellence (2001). http://www.nice.org.uk/.

NHS Centre for Reviews and Dissemination (2001). Undertaking Systematic Reviews of Research on Effectiveness: CRD's Guidance for those Carrying Out or Commissioning Reviews, CRD Report Number 4 (2nd Edition), York.

Noblit, G. W. and R. D. Hare (1988). Meta-ethnography: Synthesizing Qualitative Studies, Sage Publications, London.

Nowotny, H., P. Scott and M. Gibbons (2001). Rethinking Science: Knowledge and the Public in an Age of Uncertainty, Blackwell Publishers, Malden, MA.

Nutley, S. M. and H. T. O. Davies (2000). 'Making a Reality of Evidence Based Practice: Some Lessons from the Diffusion of Innovations', Public Money & Management, 20 (4), pp. 35-42.

Nutley, S. M., H. T. O. Davies and I. Walter (2002). From Knowing to Doing: A Framework for Understanding the Evidence-into-practice Agenda, Discussion Paper 1, Research Unit for Research Utilisation, http://www.st-and.ac.uk/~cppm/KnowDo%20paper.pdf.

Nutley, S. M. and H. T. O. Davies (2002). Evidence-based Policy & Practice: Moving from Rhetoric to Reality, Discussion Paper 2, Research Unit for Research Utilisation, http://www.st-and.ac.uk/~cppm/Rhetoric%20to%20reality%20NF.pdf.

Ohlsson, A. (1994). 'Systematic Reviews: Theory and Practice', Scandinavian Journal of Clinical & Laboratory Investigation, 54 (219), pp. 25-32.

Oxman, A. D. (1994). 'Systematic Reviews: Checklists for Review Articles', British Medical Journal, 309 (6955), pp. 648-651.

Pawson, R. (2001). The Promise of a Realist Synthesis, Working Paper No. 4, ESRC Evidence Network, Centre for Evidence Based Policy and Practice, http://www.evidencenetwork.org/Documents/wp4.pdf.

Peckham, M. (1991). 'Research and Development for the National Health Service', Lancet, 338, pp. 367-371.

Petticrew, M. (2001). 'Systematic reviews from astronomy to zoology: myths and misconceptions', British Medical Journal, 322 (13 January), pp. 98-101.

Pettigrew, A. M. (1997). 'The Double Hurdles for Management Research', in Advancement in Organizational Behaviour, Ashgate, Aldershot.

Pfeffer, J. and R. I. Sutton (1999). 'Knowing "What" to Do is Not Enough: Turning Knowledge into Action', California Management Review, 42 (1), pp. 83-108.

Popay, J., A. Rogers and G. Williams (1998). 'Rationale and Standards for the Systematic Review of Qualitative Literature in Health Services Research', Qualitative Health Research, 8 (3), pp. 341-351.

Rosenberg, W. and A. Donald (1995). 'Evidence Based Medicine: An Approach to Clinical Problem-Solving', British Medical Journal, 310 (6987), pp. 1122-1126.

Sandelowski, M., S. Docherty and C. Emden (1997). 'Qualitative Metasynthesis: Issues and Techniques', Research in Nursing and Health, 20 (4), pp. 365-371.

Slavin, R. E. (1986). 'Best-evidence Synthesis: An Alternative to Meta-analytic and Traditional Reviews', Educational Researcher, 15 (9), pp. 5-11.

Smith, R. (1991). 'Where is the Wisdom? The poverty of medical evidence' [editorial], British Medical Journal, 303, pp. 798-799.

Starkey, K. and P. Madan (2001). 'Bridging the Relevance Gap: Aligning Stakeholders in the Future of Management Research', British Journal of Management, 12 (Special Issue), pp. S3-S26.

Suri, H. (1999). 'The process of synthesising qualitative research: A case study', Annual Conference of the Association for Qualitative Research, Melbourne, http://www.latrobe.edu.au/aqr/offer/papers/HSuri.htm.

Tisdall, P. (1982). Agents of Change: The Development and Practice of Management Consultancy, Heinemann, London.

Tranfield, D. and K. Starkey (1998). 'The Nature, Social Organization and Promotion of Management Research: Towards Policy', British Journal of Management, 9 (4), pp. 341-353.

Van Aken, J. (2001). Management Research Based on the Paradigm of the Design Sciences: The Quest for Field Tested and Grounded Technological Rules, Working Paper 01.1, Eindhoven Centre for Innovation Studies, Eindhoven University of Technology, Eindhoven.

Van de Ven, A. H. (1998). Professional Science for a Professional School, Breaking the Code of Change Conference, Harvard Business School, Boston, MA.

Whitley, R. (1984a). 'The Fragmented State of Management Studies: Reasons and Consequences', Journal of Management Studies, 21 (3), pp. 331-348.

Whitley, R. (1984b). 'The Scientific Status of Management Research as a Practically-oriented Social Science', Journal of Management Studies, 21 (4), pp. 369-390.

Whitley, R. (2000). The Intellectual and Social Organization of the Sciences, Second Edition, Oxford University Press, Oxford.

Wilcoxson, L. and E. P. Fitzgerald (2001). The Nature and Role of Management Research in Australia and New Zealand, Australian and New Zealand Academy of Management Conference.

Wind, J. and P. Nueno (1998). The Impact Imperative: Closing the Relevance Gap of Academic Management Research, International Academy of Management North America Meeting, New York.

Wolf, F. M., J. A. Shea and M. A. Albanese (2001). 'Toward Setting a Research Agenda for Systematic Reviews of Evidence of the Effects of Medical Education', Teaching and Learning in Medicine, 13 (1), pp. 54-60.