Quality appraisal of qualitative research
Karin Hannes, Centre for Methodology of Educational Research
Agenda: Quality appraisal
• What is it?
• Should we appraise quality?
• Are there any criteria that compare to the basic quality criteria for quantitative studies?
• Are there any techniques to diminish quality threats?
• What are the different stages in a critical appraisal exercise?
• How do different appraisal instruments compare to each other?
• How to use and report a critical appraisal outcome?
Quality appraisal: What is it?
• “the process of systematically examining research evidence to assess its validity, results
and relevance before using it to inform a decision”
• Hill, A., & Spittlehouse, C. (2003). What is critical appraisal? Evidence Based Medicine, 3(2), 1–8. Available from:
• http://www.evidence-based-medicine.co.uk/ebmfiles/WhatisCriticalAppraisal.pdf
Quality appraisal: Should we?
A review of published QES
• 21 papers did not describe appraisal of candidate studies
• 6 explicitly mentioned not conducting a formal appraisal of studies
• 5 papers did a critical appraisal, but did not use a formal checklist
• 7 described modifying existing instruments
• 1 used an existing instrument without modification
Dixon-Woods M, Booth A, Sutton AJ. Synthesizing qualitative research: a review of published reports. Qual Res 2007; 7:375
Quality appraisal: Should we?
• Qualitative research is subject to the same criteria as quantitative research. Validity, reliability and generalisability should be addressed in critical appraisal.
Quality appraisal: Should we?
• Adjust the tools. Qualitative research is in need of a set of criteria specifically designed for it.
Quality appraisal: Should we?
• Put an end to criteriology.
Do not force a fit that in the end stifles the creative aspects of qualitative research.
Quality appraisal: Should we?
• Use criteria as guides to good practice
They should not be used as rigid requirements in appraising papers.
Quality appraisal: Should we?
• Interpretivists (idealism)
– Bias: subjectivity is used actively and creatively throughout the research process
– Validity: there are multiple ways of understanding reality
– Reliability: researchers report information and readers discern the patterns identified and verify interpretations
The more you appraise, the more it stifles creativity. The main inclusion criterion is relevance!
• Realists/pragmatists
– Bias: researcher bias affects trustworthiness, or validity
– Validity: the emphasis is on striving for truth by being adequate, accurate and credible
– Reliability: steps to establish it should be built into the research process to affirm researchers’ observations
The more you appraise, the lower the chance of ending up with flawed results. The main inclusion criterion is quality!
Quality appraisal: Should we?
Note: I may well have been substantially brainwashed by the ‘risk of bias’ discourse, beyond my personal control.
I appraise!
Quality appraisal: Basic criteria
• Carrying out ethical research
• Importance of the research
• Clarity and coherence of the report
• Use of appropriate and rigorous methods
• Importance of reflexivity, or attending to researcher bias
• Importance of establishing validity or credibility
• Importance of verification or reliability
Divergent perspectives, linked to research
paradigms
Cohen DJ, Crabtree BF. Evaluative criteria for qualitative research in health care: controversies and recommendations. Annals of Family Medicine 2008;6(4).
Quality appraisal: Basic criteria

Aspect        Qualitative term   Quantitative term
Truth value Credibility Internal validity
Applicability Transferability Generalisability
Consistency Dependability Reliability
Neutrality Confirmability Objectivity
Quality appraisal: Basic criteria

Credibility: the representation of data fits the views of the participants studied; the findings hold true
• outside auditors or participants validate findings (member checks)
• peer debriefing
• attention to negative cases
• independent analysis of data by more than one researcher
• verbatim quotes
• persistent observation (stay in the field long enough)
Transferability: research findings are transferable to other specific settings
• providing details of the study participants to enable readers to evaluate for which target groups the findings potentially hold true
• providing contextual background information and demographics
• providing thick description of both the sending and the receiving context
Dependability: process of research is logical, traceable and clearly documented, particularly on the methods chosen and the decisions made by the researchers
• peer review, debriefing, audit trails
• triangulation: the use of different methodological approaches to look at the topic of research
• reflexivity, to keep a self-critical account of the research process
• calculation of inter-rater agreements
Confirmability: findings are qualitatively confirmable through the analysis being grounded in the data, through examination of the audit trail
• assessing the potential effects/impact of the researcher during all steps of the research process
• reflexivity toward personal influences and bias
• providing information on the researcher’s background, education, perspective and school of thought
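The calculation of inter-rater agreement mentioned under dependability is usually done with Cohen’s kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch (the function name and the two reviewers’ include/exclude judgements are illustrative, not from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical judgements."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two reviewers appraising eight candidate studies (hypothetical data)
a = ["in", "in", "out", "in", "out", "in", "in", "out"]
b = ["in", "out", "out", "in", "out", "in", "in", "in"]
print(round(cohens_kappa(a, b), 2))  # moderate agreement
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance.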
Quality appraisal: different stages
Critical appraisal involves
(i) filtering against minimum criteria, involving adequacy of reporting detail
Limit the type of qualitative studies to be included to empirical studies with a description of the sample strategy, data collection procedures and the type of data-analysis considered.
Exclude: descriptive papers, editorials, opinion papers
(ii) technical rigour of the study elements indicating methodological soundness
(iii) paradigmatic sufficiency, referring to researchers’ responsiveness to data and theoretical consistency
Technical appraisal stage
• Use an appraisal instrument to look for indications in a study that add to its methodological soundness, in order to determine the degree of confidence in the researcher’s competence to conduct research following established norms.
• Needs a general understanding of qualitative criteria
THE CHECKLIST APPROACH
Theoretical appraisal stage
• Use a subsequent paradigmatic approach to judgement, which refers to an evaluation of methodological coherence between theory and methodology / methods, to evaluate the quality and rationale of the decisions made.
• Needs a more in-depth understanding of qualitative research
THE OVERALL JUDGEMENT APPROACH
Validity in Qualitative Research: a comparative analysis of 3 online appraisal instruments’ ability to evaluate validity
Hannes, Lockwood & Pearson (2010), Qualitative Health Research
Which criteria are used?
Focus on validity (Maxwell, 1992)?
What is the extent to which appraisal instruments evaluate validity?
Which criteria are used to evaluate the quality of a study?
• Selection of appraisal instruments:
– used in recently published QES (2005–2008)
– available online and ready to use
– broadly applicable to different qualitative research designs
– developed and supported by an organisation, institute or consortium
• Three instruments fit the inclusion criteria:
– Joanna Briggs Institute Tool (JBI)
– Critical Appraisal Skills Programme Tool (CASP)
– Evaluation Tool for Qualitative Studies (ETQS)
• To facilitate comparison:
– criteria grouped under 11 headings
– cross-comparison of the criteria
Which criteria are used to evaluate the quality of a study?
Criterion                            JBI   CASP   ETQS
1. Theoretical framework              x            x
2. Appropriateness of design          x     x
3. Data collection procedure          x     x      x
4. Data analysis procedure            x     x      x
5. Findings                           x     x      x
6. Context                            x            x
7. Impact of investigator             x     x      x
8. Believability                      x            x
9. Ethics                             x     x      x
10. Evaluation/Outcome                x            x
11. Value/Implications of research          x      x
Validity as the main criterion
• In evaluating methodological quality we need to know
– whether the set of arguments or the conclusions derived from a study necessarily follow from the premises;
– whether it is well grounded in logic or truth;
– whether it accurately reflects the concepts and ideas it is intended to measure.
• Main focus should be VALIDITY
• Main question should be:
What is the extent to which the different instruments establish validity?
Validity as the main criterion

Maxwell   Definition   Techniques
Descriptive validity
The degree to which descriptive information such as events, subjects, setting, time, place are accurately reported (facts).
Methods and investigator triangulation allow for cross-checking of observations
Interpretative validity
The degree to which participants’ viewpoints, thoughts, intentions, and experiences are accurately understood and reported by the researcher.
Display of citations, excerpts, use of multiple analysts (inter-rater agreements), self-reflection of the researcher, (member checking)
Theoretical validity
The degree to which a theory or theoretical explanation informing or developed from a research study fits the data and is therefore credible/defensible.
Persistent observation of stable patterns, deviant or disconfirming cases, multiple working hypotheses, theory triangulation, pattern matching
Generalisability (external validity)
The degree to which findings can be extended to other persons, times or settings than those directly studied.
Demographics, contextual background information, thick description, replication logic
Evaluative validity
The degree to which an evaluative critique is applied to the object of study (as part of the researcher’s reflexivity)
Ethics? Clarifying the links between conclusions and other parts of the research process?
What is the extent to which the different instruments establish validity?
Maxwell                                Criteria                 Instruments
Descriptive validity                   Impact of investigator   JBI, CASP, ETQS
                                       Context                  JBI, ETQS
Interpretative validity                Believability            JBI, ETQS
Theoretical validity                   Theoretical framework    JBI, ETQS
Generalisability (external validity)   Value & Implications     CASP, ETQS
Evaluative validity                    Evaluation/Outcome       JBI, ETQS
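The mapping above can be summarised as a simple coverage count per instrument; the sketch below just tallies the table’s rows (labels are taken from the table, the code itself is illustrative):

```python
# Coverage of Maxwell's validity aspects per instrument,
# transcribed from the mapping table above.
coverage = {
    "descriptive (investigator)": {"JBI", "CASP", "ETQS"},
    "descriptive (context)":      {"JBI", "ETQS"},
    "interpretative":             {"JBI", "ETQS"},
    "theoretical":                {"JBI", "ETQS"},
    "generalisability":           {"CASP", "ETQS"},
    "evaluative":                 {"JBI", "ETQS"},
}

for tool in ("JBI", "CASP", "ETQS"):
    covered = sorted(v for v, tools in coverage.items() if tool in tools)
    print(f"{tool}: {len(covered)}/6 rows covered")
```

The tally makes the next slide’s point visible: CASP covers the fewest rows of the table.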
What is the extent to which the different instruments establish validity?
• The most commonly used instrument, CASP, is the least sensitive to aspects of validity: it addresses neither interpretive nor theoretical validity, nor context as a criterion.
– Statements that have no clear link to excerpts are at risk of not being grounded in the data.
– The theoretical position and the role of the researcher have a direct impact on the interpretation of the findings.
• We need to select our critical appraisal instrument with care.
What is the extent to which the different instruments establish validity?
Critical note:
• Checklists only capture what has been reported.
• In evaluating validity at the end of a study (post hoc), rather than focusing on processes of verification during the study, we run the risk of missing serious threats to validity until it is too late to correct them.
• Basic qualitative researchers
– should be motivated to adopt techniques that improve validity
– should be guided in how to report qualitative research in order to facilitate critical appraisal
Quality appraisal: How to use and report a critical appraisal outcome?
• To include or exclude a study

Study/Criterion*   Study 1   Study 2        Study 3    Study 4   Study 5
Crit 1             x         /              /          x         /
Crit 2             x         x              /          x         x
Crit 3             x         x              x          x         x
Crit 4             x         ?              /          x         x
Crit 5             x         x              x          ?         x
Quality rating**   H         Judge!***      L          H         Judge!
Comments                     Motivate****   Motivate             Motivate

* Authors may choose to give more weight to certain criteria and use this in their final judgement.
** H/L = High/Low.
*** For studies that are clearly on the verge between inclusion and exclusion, a judgement should be made and discussed with potential co-reviewers.
**** Authors should include a motivation for exclusion in those cases where judgements are made.
Only high-quality studies are included. The potential risk is that valuable insights are excluded from the synthesis.
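A grid like the one above can be tallied mechanically. The sketch below is purely illustrative: the thresholds, and the choice to treat criterion 1 as mandatory, are assumptions standing in for the weighting that the first footnote says authors may apply.

```python
# Hypothetical sketch: tallying the appraisal grid above.
# 'x' = criterion met, '/' = not met, '?' = unclear.
# Thresholds and the 'mandatory' criterion are illustrative assumptions.
grid = {
    "Study 1": ["x", "x", "x", "x", "x"],
    "Study 2": ["/", "x", "x", "?", "x"],
    "Study 3": ["/", "/", "x", "/", "x"],
    "Study 4": ["x", "x", "x", "x", "?"],
    "Study 5": ["/", "x", "x", "x", "x"],
}

def rate(marks, mandatory=0):
    met = marks.count("x")
    if met >= 4 and marks[mandatory] == "x":
        return "H"        # high quality: include
    if met <= 2:
        return "L"        # low quality: exclude, with motivation
    return "Judge!"       # borderline: discuss with co-reviewers

for study, marks in grid.items():
    print(study, "->", rate(marks))
```

Under these assumptions the script reproduces the ratings row of the table (H, Judge!, L, H, Judge!); any real review would replace the thresholds with its own explicit, pre-agreed rules.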
Quality appraisal: How to use and report a critical appraisal outcome?
• To give more weight to studies that scored high on quality
Study/Criterion*   Study 1   Study 2        Study 3    Study 4   Study 5
Crit 1             x         /              /          x         /
Crit 2             x         x              /          x         x
Crit 3             x         x              x          x         x
Crit 4             x         ?              /          x         x
Crit 5             x         x              x          ?         x
Quality rating**   H         Judge!***      L          H         Judge!
Comments                     Motivate****   Motivate             Motivate
• All valuable insights remain included.
• It might be complex to report on the findings of the synthesis given the ‘subgroups’ of studies.
• No fixed parameters currently exist to determine the weight of qualitative studies.
• Reviewers choosing this approach need to evaluate which methodological flaws have a substantial impact on the findings presented.
To describe what has been observed without excluding any studies
• All potential valuable insights remain included, because the worth of individual studies might only become recognisable at the point of synthesis rather than in the phase of appraisal.
Quality appraisal: How to use and report a critical appraisal outcome?
• There is value in all of these approaches. However, in line with current Cochrane and Campbell policy, I recommend the first two approaches, which emphasize the methodological soundness of studies rather than their contribution to science in general.
• Guidelines:
– Reviewers need to clarify how the outcome of their critical appraisal exercise is used with respect to the presentation of their findings.
– Both recommended approaches could benefit from a sensitivity analysis evaluating what happens to the findings of the synthesis when low- or high-quality studies are removed.
– The convention of using at least two researchers for the quality assessment process is a useful legacy from quantitative-based review processes; not so much for inter-rater consistency purposes but, at the very least, to open up the data to a broader range of possible interpretations.
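The sensitivity analysis recommended above can be sketched very simply: check which synthesis findings still have support once the low-quality studies are taken out. The themes and study labels below are hypothetical, not from any cited review.

```python
# Hypothetical sketch of a sensitivity analysis: which synthesis
# themes survive when low-quality studies are removed?
themes = {
    "theme A": {"Study 1", "Study 2", "Study 4"},
    "theme B": {"Study 3"},
    "theme C": {"Study 1", "Study 3"},
}
low_quality = {"Study 3"}   # e.g. studies rated 'L' at appraisal

for theme, supporters in sorted(themes.items()):
    remaining = supporters - low_quality
    status = "retained" if remaining else "supported only by low-quality studies"
    print(f"{theme}: {status}")
```

A theme supported only by low-quality studies is exactly the kind of finding whose robustness the reviewer should discuss explicitly.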
Quality appraisal: How to use and report a critical appraisal outcome?
• Critical note:
– Fatal flaws may not be easily reduced to simple binary judgements: it may be necessary to take a holistic view of a study that recognises the importance of context and what was feasible in that context.
– The skill in critical appraisal lies not in identifying problems, but in identifying errors that are large enough to affect how the result of the study should be interpreted.
– We need to balance assessment against the weight of a message: ‘signal to noise ratio’.
Booth A. Cochrane or cock-eyed? How should we conduct systematic reviews of qualitative research? Qualitative Evidence-Based Practice Conference, Coventry University, May 14–16, 2001.
Petticrew M, Roberts H (2006, p. 128). Systematic Reviews in the Social Sciences: A Practical Guide. Oxford: Blackwell Publishing.
Critical appraisal of qualitative research
Background information:
• Cochrane Qualitative Methods Group Guidance – Critical Appraisal Chapter
(contains many more examples of critical appraisal checklists)
1. There is congruity between the stated philosophical perspective and the research methodology.
Does the study clearly state the philosophical or theoretical premises and the methodological approach on which it is based?

2. There is congruity between the research methodology and the research question or objectives.
Is the study methodology appropriate for addressing the research question?

3. There is congruity between the research methodology and the methods used to collect data.
Are the data collection methods appropriate to the methodology?

4. There is congruity between the research methodology and the representation and analysis of data.
Are the data analysed and represented in ways that are congruent with the stated methodological position?

5. There is congruity between the research methodology and the interpretation of results.
Are the results interpreted in ways that are appropriate to the methodology?

6. There is a statement locating the researcher culturally.
Are the researcher’s beliefs and values, and their potential influence on the study, declared?

7. The influence of the researcher on the research, and vice versa, is clear.
Is the potential for the researcher to influence the study, and for the research process itself to influence the researcher and her/his interpretations, acknowledged and addressed?

8. Participants, and their voices, are heard.
Does the report provide illustrations from the data to show the basis of the conclusions and to ensure that participants and their voices are represented in the report?

9. The research is ethical according to current criteria, or there is evidence of ethical approval by an appropriate body.
Does the report include a statement on the ethical approval process followed?

10. Conclusions drawn in the research report appear to flow from the analysis, or interpretation, of the data.
Is there a relationship between the findings reported and the views or words of the study participants, that is, the text generated through observation, interviews or other processes?