Notes
Article history
The research reported in this issue of the journal was funded by the HS&DR programme or one of its preceding programmes as project number 10/1008/07. The contractual start date was in April 2011. The final report began editorial review in July 2013 and was accepted for publication in January 2014. The authors have been wholly responsible for all data collection, analysis and interpretation, and for writing up their work. The HS&DR editors and production house have tried to ensure the accuracy of the authors’ report and would like to thank the reviewers for their constructive comments on the final report document. However, they do not accept liability for damages or losses arising from material published in this report.
Declared competing interests of authors
None.
Permissions
Copyright statement
© Queen’s Printer and Controller of HMSO 2014. This work was produced by Wong et al. under the terms of a commissioning contract issued by the Secretary of State for Health. This issue may be freely reproduced for the purposes of private research and study and extracts (or indeed, the full report) may be included in professional journals provided that suitable acknowledgement is made and the reproduction is not associated with any form of advertising. Applications for commercial reproduction should be addressed to: NIHR Journals Library, National Institute for Health Research, Evaluation, Trials and Studies Coordinating Centre, Alpha House, University of Southampton Science Park, Southampton SO16 7NS, UK.
Chapter 1 Background
Academics and policy-makers are increasingly interested in policy-friendly approaches to evidence synthesis which seek to illuminate issues and understand contextual influences on whether or not, why and how interventions might work. 1–4 A number of different approaches have been used to try to address this goal, such as meta-ethnography, grounded theory, thematic synthesis, textual narrative synthesis, meta-study, critical interpretive synthesis, ecological triangulation and framework synthesis. 5 Qualitative and mixed-method reviews are often used to supplement, extend and, in some circumstances, replace Cochrane-style systematic reviews. 6–12 Theory-driven approaches to such reviews include realist and meta-narrative reviews. Realist review was originally developed by Pawson for complex social interventions to explore systematically how contextual factors influence the link between intervention and outcome (summed up in the question: what works, how, for whom, in what circumstances and to what extent?). 13,14 Greenhalgh et al. 15 developed meta-narrative review for use when a policy-related topic has been researched in different ways by multiple groups of scientists, especially when key terms have different meanings in different literatures.
Quality checklists and publication standards are common (and, increasingly, expected) in health services research – see for example CONSORT (Consolidated Standards of Reporting Trials) for randomised controlled trials,16 AGREE (Appraisal of Guidelines for Research and Evaluation) for clinical guidelines,17 PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) for Cochrane-style systematic reviews18 and SQUIRE (Standards for Quality Improvement Reporting Excellence) for quality improvement studies. 19 They have two main purposes: they help researchers design and undertake robust studies, and they help reviewers and potential users of research outputs assess validity and reliability. This project seeks to produce a set of quality criteria and comparable publication standards for realist and meta-narrative reviews.
What are realist and meta-narrative reviews?
Realist and meta-narrative reviews are systematic, theory-driven interpretative techniques, which were developed to help make sense of heterogeneous evidence about complex interventions applied in diverse contexts in a way that informs policy. Interventions have been described as theory incarnate,20 driven by hypotheses, hunches, conjectures and aspirations about individual and social betterment. Strengthening a review process that helps to sift and sort these theories may be an important step in producing better interventions.
Realist reviews seek to unpack the relationships between context, mechanism and outcomes (sometimes abbreviated as CMO), i.e. how particular contexts have triggered (or interfered with) mechanisms to generate the observed outcomes. 14 The philosophical basis of realist review is realism, which assumes the existence of an external reality (a real world) but one that is filtered (i.e. perceived, interpreted and responded to) through human senses, volitions, language and culture. Such human processing initiates a constant process of self-generated change in all social institutions, a vital process that has to be accommodated in evaluating social programmes.
In order to understand how outcomes are generated, the roles of both external reality and human understanding and response need to be incorporated. Realism does this through the concept of mechanisms, whose precise definition is contested but for which a working definition is, ‘. . . underlying entities, processes, or structures which operate in particular contexts to generate outcomes of interest’. 21 Different contexts interact with different mechanisms to make particular outcomes more or less likely; hence, a realist review produces recommendations of the general format ‘in situations [X], complex intervention [Y], modified in this way and taking account of these contingencies, may be appropriate’. Realist reviews can be undertaken in parallel with traditional Cochrane reviews (see, for example, the complementary Cochrane and realist reviews of school feeding programmes in disadvantaged children). 22,23 The Cochrane review produced an estimate of effect size, whereas the realist review addressed why and how school feeding programmes worked, explained examples of when they did not work, and produced practical recommendations for policy-makers.
Meta-narrative reviews were originally developed by Greenhalgh et al. 15,24 to try to explain the apparently disparate data encountered in their review of diffusion of innovation in health-care organisations. Core concepts such as diffusion, innovation, adoption and routinisation had been conceptualised and studied very differently by researchers from a wide range of primary disciplines including psychology, sociology, economics, management and even philosophy. While some studies had been framed as the implementation of a complex intervention in a social context (thus lending themselves to a realist analysis), others had not. Preliminary questions needed to be asked, such as ‘What exactly did these researchers mean when they used the terms ‘diffusion’, ‘innovation’ and so on?’; ‘How did they link the different concepts in a theoretical model – either as a context–mechanism–outcome proposition or otherwise?’; and ‘What explicit or implicit assumptions were made by different researchers about the nature of reality?’
These questions prompted the development of meta-narrative review, which sought to illuminate the different paradigmatic approaches to a complex topic area by considering how the same topic had been differently conceptualised, theorised and empirically studied by different groups of researchers. Meta-narrative review is particularly suited to topics where there is dissent about the nature of what is being studied and what is the best empirical approach to studying it. For example, Best et al.,25 in a review of knowledge translation and exchange, asked how different research teams had conceptualised the terms knowledge, translation and exchange – and what different theoretical models and empirical approaches had been built on these different conceptualisations. Thus, meta-narrative review potentially offers another strategy to assist policy-makers to understand and interpret a conflicting body of research and, therefore, to use it more effectively in their work.
The need for standards in theory-driven systematic reviews
Realist and meta-narrative approaches can capitalise on and help build common ground between social researchers and policy teams. Many researchers are attracted to these approaches because they allow systematic exploration of how and why complex interventions work. Policy-makers are attracted to them because they are potentially able to answer questions relevant to practical decisions (not merely ‘What is the impact of X?’ but ‘If we invest in X, to which particular sectors should we target it, how might implementation be improved and how might we maximise its impact?’). As interest in such approaches burgeons, it is our experience that they are sometimes applied in ways that are not true to the core principles set out in previous methodological guidance. 4,14,15,26 Some reviews published under the realist banner are not systematic, not theory driven and/or not consistent with realist philosophy. The meta-narrative label has also been misapplied in reviews which have no systematic methodology. For these reasons, we believe that the time has come to develop formal standards and training materials.
There is a philosophical problem here, however. Realist and meta-narrative approaches are interpretive processes (that is, they are based on building plausible evidenced explanations of observed outcomes, presented predominantly in narrative form); hence, they do not easily lend themselves to a formal procedure for quality checking. Indeed, we have argued previously that the core tasks in such reviews are thinking, reflecting and interpreting. 4,15 In these respects realist and meta-narrative reviews face a problem similar to that encountered in assessing qualitative research, namely the extent to which guidelines, standards and checklists can ever capture the essence of quality. Some qualitative researchers are openly dismissive of the technical checklist approach as an assurance of quality in systematic review. 27 While we acknowledge such views, we believe that from a pragmatic perspective, formal quality criteria – with appropriate caveats – are likely to add to, rather than detract from, the overall quality of outputs in this field. Scientific discovery is never the mere mechanical application of set procedures. 28 Accordingly, research protocols should aim to guide rather than dictate.
The online Delphi method
This study used the online Delphi method, and in this section we introduce, explain and justify our use of this method. The essence of the Delphi technique is to engender reflection and discussion among a panel of experts with a view to getting as close as possible to a consensus and documenting both the agreements reached and the nature and extent of residual disagreement. 29 It was used, for example, to set the original care standards which formed the basis of the Quality and Outcomes Framework for UK general practitioners. 30 Factors which have been shown to influence quality in the Delphi process include:
- composition (expertise, diversity) of the expert panel
- selection of background papers and evidence to be discussed by that panel (completeness, validity, representativeness)
- adequacy of opportunities to read and reflect (balance between accommodating experts’ busy schedules and keeping to study milestones)
- qualitative analysis of responses (depth of reflection and scholarship, articulation of key issues)
- quantitative analysis of responses (appropriateness and accuracy of statistical analysis, clarity of presentation when this is fed back)
- how dissent and ambiguity are treated (e.g. avoidance of groupthink, openness to dissenting voices). 29,31,32
Evidence suggests that the online medium is more likely to improve than jeopardise the quality of the consensus development process. Mail-only Delphi panels have been shown to be as reliable as face-to-face panels. 33 Asynchronous online communication has well-established benefits in promoting reflection and knowledge construction. 34 There are over 100 empirical examples of successful online Delphi studies conducted among geographically dispersed participants (for examples see Keeney et al.,32 Elwyn et al.,35 Greenhalgh and Wengraf,36 Hart et al.,37 Holliday and Robotin,38 and Pye and Greenhalgh39). We were unable to find any online Delphi study which identified the communication medium as a significant limitation. On the contrary, many authors described significant advantages of the online approach, especially when dealing with an international sample of experts. One group commented ‘our online review process was less costly, quicker and more flexible with regard to reviewer time commitment, because the process could accommodate their individual schedules’. 38
Critical commentaries on the Delphi process have identified a number of issues which may prove problematic, for example ‘issues surrounding problem identification, researcher skills and data presentation’29 or defining consensus; issues of anonymity; time requirements for data collection, analysis, feedback to participants and obtaining responses on feedback; defining and selecting experts; enhancing response rates and deciding on how many rounds to undertake. 40 These comments suggest that it is the underlying design and rigour of the research process which is key to the quality of the study and not the medium through which this process happens.
Chapter 2 Objectives
For this project we set out to:
1. collate and summarise the literature on the principles of good practice in realist and meta-narrative reviews, highlighting in particular how and why these differ from conventional forms of systematic review and from each other
2. consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing principles could be improved
3. produce, in draft form, an explicit and accessible set of methodological guidance and publication standards using an online Delphi method with an interdisciplinary panel of experts from academia and policy
4. produce training materials with learning objectives linked to these steps and standards
5. refine these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise
6. synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards
7. disseminate these guidance and standards to audiences in academia and policy.
Chapter 3 Methods
Overview of methods
We used a range of methods to meet the objectives set out above. In this section we provide a brief overview of these methods and how they related to each other; the sections that follow describe each method in more detail.
To fulfil objectives 1 and 2 we undertook a narrative review of the literature, supplemented by feedback collated from presentations and workshops. We synthesised our findings into briefing materials [one for realist synthesis (RS) and another for meta-narrative reviews]. We recruited members to two Delphi panels, which had wide representation from researchers, students, policy-makers, theorists and research sponsors. We used the briefing materials to brief the Delphi panels so that they could help us fulfil objective 3. For objective 4, we drew not only on our experience in developing and delivering education materials, but also on relevant feedback from the Delphi panels, an e-mail list we set up specifically for this project (www.jiscmail.ac.uk/RAMESES), training workshops and the review teams we supported methodologically. To help us refine our publication standards (objective 5), we captured methodological and other challenges that arose within the realist or meta-narrative review teams to which we provided methodological support. To produce the definitive publication standards, quality standards and training materials (objective 6), we synthesised expert input (from the Delphi panels), the literature review and real-time problem analysis (e.g. feedback from the e-mail list, training sessions and workshops, and presentations). Throughout this project we did not set specific time points at which we would refine the drafts of our project outputs. Instead, we iteratively and contemporaneously fed any data we captured into our draft publication standards, quality standards and training materials, making changes gradually. Only our Delphi panels ran within a specific time frame. The definitive guidance and standards were, therefore, the product of continuous refinement. We addressed objective 7 through academic publications, online resources and delivery of presentations and workshops. Figure 1 provides a pictorial overview of how the different methods we used fed into each other.
Details of literature search methods
Prior to the start of this project we had undertaken initial exploratory searches. These were rapid searches that were not intended to be comprehensive: each project team member identified examples of realist and/or meta-narrative reviews in their personal files. To identify further reviews, we searched the reference lists of each retrieved review. This two-step process yielded 13 reviews that were later used by our expert librarian to develop and refine our searches (see Appendix 1). We found that the literature in this field was small but expanding rapidly, of broad scope, variable quality and inconsistently indexed. Our purpose in identifying published reviews was not to complete a census of realist and meta-narrative studies. We make no claim that the review we undertook was exhaustive; we have not published it as a stand-alone piece of research, and never intended to. The main purpose of this review was to enable us to produce the briefing materials for our two Delphi panels (objective 3), not to produce an exhaustive summary of all research ever published on the topic. As such, the review we undertook is best considered a rapid, accelerated or truncated narrative review. Such an approach predictably produces limitations, and these are discussed in Chapter 5, Limitations.
We wanted our search to pinpoint real examples (or publications claiming to be examples) that provide rich detail on their usage of those review activities we wished to scrutinise and formalise. To that end, drawing on a previous study which demonstrated the effectiveness and efficiency of the methods proposed,41 and employing the skills of a specialist librarian, we searched 16 electronic databases from inception (where applicable) to 15 June 2010. The databases searched are listed below (the number of hits returned by each database is given in Table 1):
- Academic Search Complete
- Cumulative Index to Nursing and Allied Health Literature (CINAHL)
- The Cochrane Library
- Dissertation Abstracts
- EMBASE
- Education Resources Information Center (ERIC)
- Global Health
- Google
- HealthSTAR
- MEDLINE
- PASCAL [database of INIST (Institut de l’Information Scientifique et Technique)]
- PsycINFO
- Scopus
- Sociological Abstracts
- Social Policy and Practice
- Web of Science [Science Citation Index (SCI), Social Science Citation Index (SSCI), Arts and Humanities Citation Index (AHCI)].
Database | Hits returned |
---|---|
Academic Search Complete | 69 |
CINAHL | 28 |
The Cochrane Library | 4 |
Dissertation Abstracts | 9 |
EMBASE | 39 |
ERIC | 2 |
Global Health | 10 |
HealthSTAR | 20 |
MEDLINE | 43 |
PASCAL | 6 |
PsycINFO | 14 |
Scopus | 182 |
Web of Science (SCI and SSCI combined) | 190 |
Sociological Abstracts | 0 |
Social Policy and Practice | 0 |
Google | 0 |
Total retrieved | 616 |
Duplicates | 368 |
Files screened | 248 |
We used the following approaches in our searches:

- A simple truncated text-word search was conducted on all databases: (Meta-narrative OR metanarrative OR realist) ADJ (review* OR protocol* OR synthesis OR syntheses OR technic OR technics OR technique*), where ADJ (adjacency) was a search operator (Ovid databases); or (metanarrative OR meta-narrative OR realist) AND (review* OR synthesis OR syntheses OR protocol* OR technic OR technics OR technique*), where adjacency was not a search operator, in which case the search was limited to the title field. The strategy was developed from the collection of 13 published reviews we had identified in our exploratory searches and was piloted and refined to produce the most sensitive search strategy for the topic. (A sketch of how these two query variants might be assembled appears after this list.)
- Citation chaining was performed on databases where this feature was available (Scopus and Web of Science at the time of the search). Seminal citations were followed, on the reasoning that anyone using realist or meta-narrative techniques would be likely to cite these references. 4,14,15
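The two query variants above differ only in the operator joining the method terms to the output terms. The following Python sketch is our own illustration rather than the project's actual tooling: the term lists come from the published strategy, but the function names and structure are assumptions.

```python
# Illustrative only: assemble the two text-word query variants described above.

METHOD_TERMS = ["Meta-narrative", "metanarrative", "realist"]
OUTPUT_TERMS = ["review*", "protocol*", "synthesis", "syntheses",
                "technic", "technics", "technique*"]

def ovid_query() -> str:
    """Variant for Ovid databases, which support the ADJ (adjacency) operator."""
    return f"({' OR '.join(METHOD_TERMS)}) ADJ ({' OR '.join(OUTPUT_TERMS)})"

def title_field_query() -> str:
    """Variant for databases without adjacency: plain AND, applied to the title field."""
    return f"({' OR '.join(METHOD_TERMS)}) AND ({' OR '.join(OUTPUT_TERMS)})"

if __name__ == "__main__":
    print(ovid_query())
    print(title_field_query())
```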
Results were kept separate for each database in RefWorks (version 5; RefWorks-COS, Bethesda, MD, USA) reference management software and were then collated into a single merged file from which duplicates were removed.
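The deduplication itself was done within RefWorks, but the underlying logic is simple: merge the per-database result files and drop records that match an already-seen record. The sketch below illustrates this with a matching key of normalised title plus year; the record fields and the matching rule are our assumptions for illustration, not a description of RefWorks behaviour.

```python
import re

def normalise(title: str) -> str:
    """Lower-case a title and strip punctuation and whitespace for matching."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def merge_and_deduplicate(result_sets):
    """Merge per-database lists of records (dicts with 'title' and 'year'),
    keeping only the first occurrence of each (normalised title, year) pair."""
    seen, merged = set(), []
    for records in result_sets:
        for record in records:
            key = (normalise(record["title"]), record.get("year"))
            if key not in seen:
                seen.add(key)
                merged.append(record)
    return merged

# Hypothetical records: the same review retrieved from two databases.
medline = [{"title": "Lean thinking in healthcare: a realist review", "year": 2010}]
scopus = [{"title": "Lean Thinking in Healthcare: A Realist Review.", "year": 2010}]
print(len(merge_and_deduplicate([medline, scopus])))  # 1 - the duplicate is dropped
```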
No language or study design filters were applied. To construct our sample for further analysis (in which we intended to study both exemplary reviews and those that had flaws), we included any review that claimed to be a realist review or a meta-narrative review. Documents were excluded if they were not a review (e.g. editorials, opinion pieces, commentaries, methods papers) or did not claim to be a realist or meta-narrative review. We did not undertake any independent screening or an audit of a random subset for quality control purposes. The whole searching process from start to the retrieval of all full-text documents took approximately 1 month.
We conducted a thematic analysis of this literature which was initially oriented to addressing seven key areas:
1. What are the strengths and weaknesses of realist and meta-narrative review from both a theoretical and a practical perspective?
2. How have these approaches actually been used? Are there areas where they appear to be particularly fit (or unfit) for purpose?
3. What, broadly, are the characteristics of high- and low-quality reviews undertaken by realist or meta-narrative methods? What can we learn from the best (and worst) examples so far?
4. What challenges have reviewers themselves identified (e.g. in the introduction or discussion sections of their papers) in applying these approaches? Are there systematic gaps between the theory and the steps actually taken?
5. What is the link between realist and meta-narrative review and the policy-making process? How have published reviews been commissioned or sponsored? How have policy-makers been involved in shaping the review? How have they been involved in disseminating and applying its findings? Are there models of good practice (and of approaches to avoid) for academic–policy linkage in this area?
6. How have front-line staff and service users been involved in realist and meta-narrative reviews? If the answer to this is ‘usually, not much’, how might they have been involved and are there examples of potentially better practice which might be taken forward?
7. How should one choose between realist, meta-narrative and other theory-driven approaches when selecting a review methodology? How might (for example) the review question, purpose and intended audience(s) influence the choice of review method?
The thematic analysis was led by one member of the review team (GWo). He undertook all stages of the review and shared findings with the rest of the project team so that discussion, debate and refinement of his interpretations of the data in the included reviews could take place. Findings were shared by e-mail and, where necessary, face-to-face meetings took place to discuss any interpretations made of the data. In undertaking our thematic analysis, we familiarised ourselves with the included reviews to identify patterns in the data. We used the seven questions above as a starting point in our sense-making of the data, and as a project team we were aware that the purpose of the review was to produce briefing documents for the Delphi panels. In these panels we wanted to achieve a consensus on quality and reporting standards, and so what we needed from our review of the literature were data to inform us on what might constitute quality in executing and reporting reviews. We accepted that we might need to refine, discard or add questions and topic areas as our analysis and understanding of the literature emerged from our reading of the papers.
Data were extracted to a Microsoft Excel spreadsheet (Microsoft Corporation, Redmond, WA, USA), which we iteratively refined to capture the data needed to produce our briefing materials. This review was undertaken in a short timeframe, such that the time taken from obtaining full-text documents to producing the final draft for circulation of the briefing documents was approximately 10 weeks. The output of this phase was a provisional summary for each review method that addressed the questions above and highlighted for each question the key areas of knowledge, ignorance, ambiguity and uncertainty. This was distributed to the Delphi panel (as our briefing document) as the starting point for their guidance development work.
Details of online Delphi process
We followed an online adaptation of the Delphi method (see Chapter 1, The online Delphi method) which we had developed and used in a previous study to produce guidance on how to critically appraise research on illness narratives. 36 In that study, a key component of a successful Delphi process was recruiting a wide range of experts, policy-makers, practitioners and potential users of the guidance who could approach the problem from different angles and, especially, people who would respond to academic suggestions by asking ‘so what?’ questions.
Placing the academic–policy/practice tension at the centre of this phase of the research, we planned to construct our Delphi panel to include a majority of experienced academics (e.g. those who have published on theory and method in realist and/or meta-narrative review). We also planned to recruit policy-makers, research sponsors and representatives of third-sector organisations. These individuals were recruited by approaching relevant organisations and e-mail lists [e.g. professional networks of systematic reviewers, CHAIN (Contact, Help, Advice and Information Network; http://chain.ulcc.ac.uk) and INVOLVE (www.invo.org.uk/)]. We approached INVOLVE because we were interested in exploring whether we could identify a lay person with an interest in secondary research and/or in informing policy/decision-making through reviews. Those interested in participating were provided with an outline of the study, and individuals who indicated greatest commitment and potential to balance the sample were selected. We drew on our own experience of developing standards and guidance, as well as on published papers by the CONSORT, PRISMA, SQUIRE, AGREE and other teams working on comparable projects. 16,18,19,42
The Delphi panel was conducted entirely via the internet using a combination of e-mail and an online survey tool (www.surveymonkey.com). We began with a brainstorm round (round 1) in which participants were invited to submit personal views, exchange theoretical and empirical papers on the topic and suggest items that might be included in the publication standards. This was done as a warm-up exercise and panel members were sent our own preliminary summary or briefing document (see Chapter 3, Details of literature search methods). These early contributions, along with our summary, were collated and summarised in a set of provisional items, which were developed into an online survey and sent electronically (via the online survey tool, SurveyMonkey®; Survey Monkey, Palo Alto, CA, USA) to participants for ranking (round 2). Participants were asked to rank each item twice on a 7-point Likert scale (1 = strongly disagree to 7 = strongly agree), once for relevance (i.e. should an item on this theme/topic be included at all in the guidance?) and once for validity (i.e. to what extent do you agree with this item as currently worded?). Those who agreed that an item was relevant, but disagreed on its wording, were invited to suggest changes to the wording via a free-text comments box. In this second round, participants were again invited to suggest additional topic areas and items.
Each participant’s responses were collated and the numerical rankings entered onto an Excel spreadsheet. The response rate and the average, mode, median and interquartile range of responses to each item were calculated. Items that scored low on relevance were omitted from subsequent rounds. Further online discussion was invited on items that scored high on relevance but low on validity (indicating that a rephrased version of the item was needed) and on those where there was wide disagreement about relevance or validity. Following analysis and discussion within the project team, a second list of statements was drawn up and circulated for ranking (round 3). We planned that the process of collation of responses, further e-mail discussion and reranking would be repeated until a maximum consensus was reached (round 4 et seq.). In practice, very few Delphi panels, online or face to face, go beyond three rounds, as participants tend to ‘agree to differ’ rather than move towards further consensus. 36
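Although the analysis was done in a spreadsheet, the per-item computations are easy to sketch. The following Python fragment is purely illustrative: the summary statistics follow the description above, while the consensus rule at the end is our own assumption for the sketch, since the study did not reduce consensus to a single numerical threshold.

```python
from statistics import median, mode

def summarise_item(ratings, panel_size):
    """Summarise one item's ratings (integers 1-7) on one dimension
    (relevance or validity): response rate, mode, median and IQR."""
    ordered = sorted(ratings)
    n = len(ordered)
    q1 = median(ordered[: n // 2])        # median of the lower half
    q3 = median(ordered[(n + 1) // 2:])   # median of the upper half
    return {
        "response_rate": f"{n}/{panel_size} ({round(100 * n / panel_size)}%)",
        "mode": mode(ordered),
        "median": median(ordered),
        "iqr": (q1, q3),
    }

def has_consensus(summary, min_median=6, max_iqr_width=1):
    """Hypothetical rule (our assumption): a high median rating and a
    narrow interquartile range count as consensus."""
    q1, q3 = summary["iqr"]
    return summary["median"] >= min_median and (q3 - q1) <= max_iqr_width

# Hypothetical relevance ratings for one item from a 37-member panel.
relevance = [7, 7, 6, 7, 5, 6, 7, 7, 6, 7]
summary = summarise_item(relevance, panel_size=37)
print(summary, has_consensus(summary))
```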
We had planned to report residual non-consensus as such and to describe the nature of the dissent. Making such dissent explicit tends to expose inherent ambiguities (which may be philosophical or practical) and acknowledges that not everything can be resolved; such findings may be of more use to reviewers than a firm statement implying that all tensions have been settled.
Preparing teaching and learning resources
A key aim of our project was to produce publicly accessible resources to support training in realist and meta-narrative review. We anticipate that these resources will need to be adapted and perhaps supplemented for different groups of learners, and interactive learning activities added. 43 We developed, and iteratively refined, draft learning objectives, example course materials and teaching and learning support methods. We drew on our previous work on course development, quality assurance and support for interactive and peer-supported learning in health-care professionals for this aspect of our project. 34,43–45
Real-time refinement
The sponsor of this study, the National Institute for Health Research (NIHR) Health Services and Delivery Research (HSDR) programme, supports secondary research calls for rapid, policy-relevant reviews, some, though not all, of which seek to use realist or meta-narrative methods. We were asked to work with a select sample of teams funded under such calls, as well as other teams engaged in relevant ongoing reviews (selected to balance our sample), to share emerging recommendations and gather real-time data on how feasible and appropriate these recommendations were in a range of different reviews. Over the 27-month duration of this study, we used the feedback we gathered to iteratively refine our draft training materials. Training and support offered to these review teams consisted of three overlapping and complementary packages:
1. An all-comers online discussion forum via JISCmail (www.jiscmail.ac.uk/RAMESES) for interested reviewers who were doing or had previously attempted a realist or meta-narrative review. This was run via light-touch facilitation, in which we invited discussion on particular topics and periodically summarised themes and conclusions (a technique known in online teaching as weaving). Such a format typically accommodates large numbers of participants, since most people tend to lurk most of the time. Such discussion groups tend to generate peer support through their informal, non-compulsory ethos and a strong sense of reciprocity (i.e. people helping one another out because they share an identity and commitment)46 and they are often rich sources of qualitative data. We anticipated that this forum would contribute key themes and ideas to the quality and reporting standards and learning materials throughout the duration of the study.
2. Responsive support to our designated review teams. We anticipated that our input to these teams would depend on their needs, interests and previous experience. In our previous dealings with review teams we have been called upon (for example) to help them distinguish context from mechanism in a particular paper, extract and formalise programme theories, distinguish middle-range theories from macro or micro theories, and develop or adapt data extraction tools; to advise on data extraction techniques; and to train researchers in the use of qualitative software for systematic review.
3. A series of workshops for designated review teams and other reviewers. We planned to run a series of workshops both to provide training to fellow reviewers interested in using realist or meta-narrative reviews and to gather feedback from them about the challenges they faced in learning about or undertaking such reviews.
Chapter 4 Results
In this project we produced three specific outputs for realist and meta-narrative reviews:
- publication standards
- quality standards
- teaching and learning materials (also known as training materials).
We used a range of methods to gather the data that informed the content of each of our intended outputs. This section provides details of the results we obtained from the methods we used and how they contributed to the content of our outputs.
Literature search
Sixteen electronic databases were searched from inception to June 2011 and citation tracking was undertaken, generating 248 documents. A flow diagram outlining the disposition of documents can be seen in Figure 2. Table 1 shows the number of hits returned for the databases we searched.
One of the project team (GWo) screened the abstracts and titles and included documents which claimed to be realist or meta-narrative reviews. All the documents judged to be possible realist and meta-narrative reviews were circulated to the whole project team, and through discussion and debate a final set of included documents was retained. We retrieved what we judged to be 35 possible realist reviews and 11 possible meta-narrative reviews. For the possible realist reviews there was no disagreement within the project team as to inclusion (35 out of 35 were included for analysis). Of the 11 possible meta-narrative reviews, two were judged not to be meta-narrative reviews, leaving nine documents. Tables 2 and 3 show the characteristics of the documents (review title, type of document, year published and topic area) that we drew on for realist reviews and meta-narrative reviews respectively. We conducted a thematic analysis guided initially by the seven questions set out above (see Chapter 3, Details of literature search methods) to produce the briefing documents for the realist and meta-narrative Delphi panels (see Appendix 2). All the data we extracted were either entered into an Excel spreadsheet or written up directly into a draft of our briefing documents.
Review title | Type of document | Year published | Topic area |
---|---|---|---|
Vocational rehabilitation: what works and in what circumstances47 | Journal article | 2004 | Vocational rehabilitation |
A systematic review of controlled trials of interventions to prevent childhood obesity and overweight: a realistic synthesis of the evidence48 | Journal article | 2006 | Interventions to reduce childhood obesity and overweight |
A realist synthesis of evidence relating to practice development: Final report to NHS Education for Scotland and NHS Quality Improvement Scotland49 | Full report | 2006 | Practice development |
aRealist review to understand the efficacy of school feeding programmes22 | Journal article | 2007 | Efficacy of school feeding programmes |
Evaluating the impact of patient and public involvement initiatives on UK health services: a systematic review50 | Journal article | 2007 | Patient and public involvement |
Marketing mix standardization in multinational corporations: a review of the evidence51 | Journal article | 2007 | Marketing mix standardisation in multinational corporations |
Human resource management interventions to improve health workers’ performance in low and middle income countries: a realist review52 | Journal article | 2008 | Human resource management interventions to improve health workers’ performance |
Does moving from a high-poverty to lower-poverty neighborhood improve mental health? A realist review of ‘Moving to Opportunity’53 | Journal article | 2008 | US Moving to Opportunity programme |
Primary health care delivery models in rural and remote Australia – a systematic review54 | Journal article | 2008 | Primary health-care delivery models in rural and remote Australia |
Independent learning literature review (research report DCSF-RR051)55 | Report | 2008 | Independent learning in school children |
A realist synthesis of randomised control trials involving use of community health workers for delivering child health interventions in low and middle income countries56 | Journal article | 2009 | Community health workers |
Community-based services for homeless adults experiencing concurrent mental health and substance use disorders: a realist approach to synthesising evidence57 | Journal article | 2009 | Community-based services for homeless adults with concurrent mental health and substance use disorders |
Water, sanitation and hygiene interventions to combat childhood diarrhoea in developing countries58 | Report | 2009 | Water, sanitation and hygiene interventions in reducing childhood diarrhoea |
aInternet-based medical education: a realist review of what works, for whom and in what circumstances43 | Journal article | 2009 | Internet-based learning |
Interventions to promote social cohesion in sub-Saharan Africa59 | Full report | 2010 | Interventions to promote social cohesion in sub-Saharan Africa |
Implementation of antiretroviral therapy adherence interventions: a realist synthesis of evidence60 | Journal article | 2010 | Antiretroviral adherence interventions |
Lean thinking in healthcare: a realist review of the literature61 | Journal article | 2010 | Lean thinking |
District nurses’ role in palliative care provision: a realist review62 | Journal article | 2010 | Role of district nurses in palliative care provision |
A realist review of evidence to guide targeted approaches to HIV/AIDS prevention among immigrants living in high-income countries63 | PhD thesis | 2010 | Evidence to guide targeted approaches to HIV infection or AIDS prevention among immigrants living in high-income countries |
Effectiveness of telemedicine: a systematic review of reviews64 | Journal article | 2010 | Effectiveness of telemedicine |
How equitable are colorectal cancer screening programs which include FOBTs? A review of qualitative and quantitative studies65 | Journal article | 2010 | Equitability of colorectal screening programmes |
Evidence-based health policy: a preliminary systematic review66 | Journal article | 2010 | Evidence-based health policy |
Behavioral caregiving for adults with traumatic brain injury living in nursing homes: developing a practice model67 | Journal article | 2010 | Behavioural caregiving for adults with traumatic brain injury living in nursing homes |
Addressing locational disadvantage effectively68 | Journal article | 2010 | Addressing locational disadvantage |
Realist review and synthesis of retention studies for health workers in rural and remote areas69 | Report | 2011 | Access to health workers in rural and remote areas |
aPolicy guidance on threats to legislative interventions in public health: a realist synthesis70 | Journal article | 2011 | Public health legislation |
Implementing successful intimate partner violence screening programs in health care settings: evidence generated from a realist-informed systematic review71 | Journal article | 2011 | Intimate partner violence |
An evidence synthesis of qualitative and quantitative research on component intervention techniques, effectiveness, cost-effectiveness, equity and acceptability of different versions of health-related lifestyle advisor role in improving health72 | Report | 2011 | Health-related lifestyle advisors |
The gradient in health inequalities among families and children: a review of evaluation frameworks73 | Journal article | 2011 | Health inequalities among families and children |
Effectiveness of the geriatric day hospital – a realist review74 | Journal article | 2011 | Effectiveness of geriatric day hospital |
Are journal clubs effective in supporting evidence-based decision making? A systematic review. BEME Guide No.1675 | Journal article | 2011 | Effectiveness of journal club in supporting evidence-based decision-making |
Conducting a realist review of a complex concept in the pharmacy practice literature: methodological issues76 | Journal article | 2011 | Culture in community pharmacy organisations |
Getting inside acupuncture trials – exploring intervention theory and rationale77 | Journal article | 2011 | Acupuncture |
Unleashing their potential: a critical realist scoping review of the influence of dogs on physical activity for dog-owners and non-owners78 | Journal article | 2011 | Influence of dogs on physical activity for dog- and non-owners |
Social networks, social capital and chronic illness self-management: a realist review79 | Journal article | 2011 | Social networks, social capital and chronic illness self-management |
Review title | Type of document | Year published | Topic area |
---|---|---|---|
aDiffusion of innovations in service organisations: systematic literature review and recommendations for future research24 | Journal article | 2004 | Diffusion of innovations |
Environmental health and vulnerable populations in Canada: mapping an integrated equity-focused research agenda80 | Journal article | 2008 | Environmental health and vulnerable populations |
Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method81 | Journal article | 2009 | Electronic health record |
The health, social care and housing needs of lesbian, gay, bisexual and transgender older people: a review of the literature82 | Journal article | 2009 | Health, social care and housing needs of lesbian, gay, bisexual and transgender older people |
The role of urban municipal governments in reducing health inequities: a meta-narrative mapping analysis83 | Journal article | 2010 | Municipal urban governments in reducing health inequalities |
Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature84 | Journal article | 2010 | Knowledge exchange processes in organisational policy arenas |
aMeasuring quality in the therapeutic relationship – parts 1 and 285,86 | Journal article | 2010 | Measuring quality in therapeutic relationships |
Defining the fundamentals of care87 | Journal article | 2010 | Defining the fundamentals of nursing care |
How can we improve guideline use? A conceptual framework of implementability88 | Journal article | 2011 | Improving guideline use |
Our briefing documents were based on our thematic analysis which was guided by seven initial key areas (see Chapter 3, Details of literature search methods for a list of the key areas). We needed differing levels of immersion and analysis for each of the items we included in our briefing documents. Some were more straightforward to derive from our initial questions and our reading of the literature. We noted that three out of the initial seven questions [(1) What are the strengths and weaknesses of realist and meta-narrative review from both a theoretical and a practical perspective?; (2) How have these approaches actually been used? Are there areas where they appear to be particularly fit (or unfit) for purpose?; and (7) How should one choose between realist, meta-narrative and other theory-driven approaches when selecting a review methodology? How might (for example) the review question, purpose and intended audience(s) influence the choice of review method?] had overlaps and could be collapsed into one topic area for consideration by our Delphi panels. We judged that questions 1, 2 and 7 were related to matching the research question to the method. We noted that in our included reviews, most researchers had also considered this an important topic to address – often through an explanation of why they had deliberately chosen either a realist or meta-narrative review. We therefore included this issue as items 6 and 5 (for meta-narrative and realist reviews respectively) in our briefing document for our Delphi panel. These items asked the Delphi panel members to clarify what a research question would look like in a meta-narrative or realist review.
When doing realist and meta-narrative reviews ourselves, we had previously noted that such reviews often covered broad topics and needed to be progressively focused. Two of our initial questions related to these issues: (5) What is the link between realist and meta-narrative review and the policy-making process? How have published reviews been commissioned or sponsored? How have policy-makers been involved in shaping the review? How have they been involved in disseminating and applying its findings? Are there models of good practice (and of approaches to avoid) for academic-policy linkage in this area?; and (6) How have front-line staff and service users been involved in realist and meta-narrative reviews? If the answer to this is ‘usually, not much’, how might they have been involved and are there examples of potentially better practice which might be taken forward? We had asked questions 5 and 6 to ascertain if other researchers had noted this as an important process and, if they had, what approaches had they used. Within our included reviews, the breadth of the initial topic areas had been identified as a challenge and a range of different approaches had been used to focus reviews. The issue of the need to focus reviews thus seemed to us an important one to include in our briefing documents (as items 9 and 8 for meta-narrative and realist reviews respectively). Items 9 and 8 in our briefing documents asked the Delphi panel members to consider if it was important for researchers to explain how and why their review was shaped and contained.
Question 3 [What, broadly, are the characteristics of high- and low-quality reviews undertaken by realist or meta-narrative methods? What can we learn from the best (and worst) examples so far?] of our initial questions required the most immersion and analysis. With this question we had wanted to understand what processes a review team had to undertake to produce a high-quality review. As a project team we had our own ideas but wanted to explore if these were reflected in our reading of the included reviews. We first had to decide if we could agree among ourselves on which of the included reviews were high, mixed or low quality. To do this, each review was read in detail (GWo) and the characteristics of each review that determined its quality were extracted into an Excel spreadsheet. The headings on this spreadsheet were: study name; type of document; year submitted; topic area; purpose of review; understood method?; methodological comments; lessons for methods; methods for reporting; and challenges reported by reviewers and notes.
Once the spreadsheets were completed (one for realist reviews and another for meta-narrative reviews), they were circulated with the full-text documents to the rest of the project team and, through e-mail discussion and debate, a consensus was achieved. The next step was to reread each of the included reviews to determine which review processes were necessary to produce a high-quality review. Again, this was led by GWo, and each review process was added to a draft of the briefing documents. These drafts were circulated to the rest of the project team and a consensus achieved through discussion and debate. The briefing materials were the result of seven rounds of revision.
The contents of our briefing materials were as follows:
- an explanation of how we would like the Delphi panel members to contribute
- background to the review methods
- methodological issues we identified for each method
- a summary of the published examples
- our preliminary thoughts on what might be included as publication standards items.
The complete briefing document circulated to the Delphi panels for realist reviews and meta-narrative reviews can be found in Appendix 2.
Delphi panel
Realist review
We ran the realist review Delphi panels between September 2011 and March 2012. We recruited 37 individuals from 27 organisations in six countries. These comprised researchers in: public or population health (8); evidence synthesis (6); health services research (8); international development (2); and education (2). We also recruited experts in research methodology (6), publishing (1), nursing (2) and policy and decision-making (2). We started round 1 in mid-September 2011 and circulated the briefing document to the panel. We sent two chasing e-mails to all panel members, and within 4 weeks all panel members who indicated that they wanted to provide comments had done so. Twenty-two Delphi panel members provided suggestions of items that should be included in the publication standards. We used the suggestions from the panel members and the briefing document as the basis of the online survey for round 2. Round 2 started at the end of November 2011 and ran until early January 2012. Panel members were invited to complete our online survey and asked to rate each potential item for relevance and clarity. A copy of this survey can be found in Appendix 3. Two reminder e-mails were sent to the panel members. Once the panel had completed their survey we analysed their ratings for relevance and clarity (Table 4).
Item | Relevance: response rate (%) | Relevance: mode | Relevance: median | Relevance: interquartile range | Content: response rate (%) | Content: mode | Content: median | Content: interquartile range |
---|---|---|---|---|---|---|---|---|
Title | 33/37 (89) | 7 | 7 | 6–7 | 31/37 (84) | 7 | 7 | 6–7 |
Abstract | 34/37 (92) | 7 | 7 | 7–7 | 34/37 (92) | 7 | 6.5 | 5–7 |
Rationale for review | 37/37 (95) | 7 | 7 | 6–7 | 35/37 (95) | 7 | 7 | 5–7 |
Objective and focus of review | 33/37 (89) | 7 | 7 | 6–7 | 33/37 (89) | 7 | 7 | 6–7 |
Changes in review processa | 35/37 (95) | 7 | 6 | 5–7 | 34/37 (92) | 7 | 5.5 | 3–6.75 |
Rationale for using realist review | 34/37 (92) | 7 | 7 | 6–7 | 33/37 (89) | 7 | 6 | 5–7 |
Scoping the literature | 35/37 (95) | 7 | 7 | 5.5–7 | 37/37 (92) | 7 | 6 | 5–7 |
Searching process | 34/37 (92) | 7 | 7 | 6–7 | 34/37 (92) | 7 | 6 | 5–7 |
Selection and appraisal of documentsa | 35/37 (95) | 7 | 7 | 6–7 | 35/37 (95) | 7 | 6 | 4.5–7 |
Data extraction | 34/37 (92) | 7 | 7 | 6–7 | 33/37 (89) | 7 | 6 | 5–7 |
Analysis and synthesis processes | 35/37 (95) | 7 | 7 | 6–7 | 35/37 (95) | 7 | 6 | 5–7 |
Document flow diagram | 35/37 (95) | 7 | 6 | 6–7 | 35/37 (95) | 7 | 6 | 5–7 |
Document characteristicsa | 35/37 (95) | 7 | 6 | 5–7 | 35/37 (95) | 7 | 6 | 4.5–7 |
Main findings | 34/37 (92) | 7 | 7 | 6–7 | 31/37 (84) | 7 | 6.5 | 5–7 |
Summary of findings | 35/37 (95) | 7 | 7 | 7–7 | 34/37 (95) | 7 | 7 | 6–7 |
Strength, limitations and future research directions | 35/37 (95) | 7 | 7 | 6–7 | 35/37 (95) | 7 | 6 | 6–7 |
Comparison with existing literature | 35/37 (95) | 7 | 6 | 5–7 | 35/37 (95) | 7 | 6 | 5–7 |
Conclusion and Recommendations | 34/37 (92) | 7 | 7 | 6–7 | 34/37 (92) | 7 | 6.5 | 6–7 |
Funding | 35/37 (95) | 7 | 7 | 7–7 | 35/37 (95) | 7 | 7 | 6–7 |
From round 2, only three items did not achieve a consensus: items 5, 9 and 13 (see Table 4). Based on the suggestions made by the panel members, we refined the text for these items. We had asked panel members if they had a preference between the terms realist review and RS. Fourteen (39%) preferred RS, 10 (28%) preferred realist review and 12 (33%) had no preference. Our conclusion was that the terms RS and realist review are synonymous. We also produced a post-round briefing document from round 2, which detailed for each item:
- the response rate
- mode
- median
- interquartile range
- the action we took for each item based on the panel’s ratings
- an anonymised list of all the free-text comments made.
For round 3, we asked the panel to consider again only the items for which a consensus had not been reached in round 2, namely items 5, 9 and 13. We produced an online survey for round 3 and again asked the panel to rate items 5, 9 and 13 for relevance and clarity. To keep the panel updated we provided them with our post-round briefing document from round 2 (available on request from the authors). Round 3 ran from mid-February to mid-March 2012. A copy of this survey can be found in Appendix 4. Two reminder e-mails were sent to the panel members. Once the panel had completed the survey we analysed their ratings for relevance and clarity (Table 5).
Item | Relevance: response rate (%) | Relevance: mode | Relevance: median | Relevance: interquartile range | Content: response rate (%) | Content: mode | Content: median | Content: interquartile range |
---|---|---|---|---|---|---|---|---|
5. Changes in review process | 34/37 (92) | 7 | 7 | 6–7 | 34/37 (92) | 7 | 6 | 6–7 |
9. Selection and appraisal of documents | 33/37 (89) | 7 | 7 | 6–7 | 33/37 (89) | 7 | 7 | 6–7 |
13. Document characteristics | 33/37 (89) | 7 | 7 | 6–7 | 33/37 (89) | 7 | 6 | 6–7 |
By the end of round 3 a consensus was reached on all items. We produced a post-round briefing document from round 3 and circulated this to all our panel members for the sake of completeness (available on request from authors).
Using the data we gathered from the three rounds of the Delphi panel for realist reviews, we produced a final set of items to be included in the publication standards for realist reviews. These were published simultaneously in January 2013 in BMC Medicine89 and the Journal of Advanced Nursing. 90 Our publication standards have also been accepted and listed on the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) network (a resource centre for good reporting of health research studies; www.equator-network.org).
Meta-narrative review
We ran the meta-narrative review Delphi panels between September 2011 and March 2012. We recruited 33 individuals from 25 organisations in six countries. These comprised researchers in public or population health (five), evidence synthesis (five), health services research (eight), international development (two) and education (two), as well as experts in research methodology (six), publishing (one), nursing (two) and policy and decision-making (two). We started round 1 in mid-September 2011 and circulated the briefing document to the panel. We sent two chasing e-mails to all panel members and within 4 weeks all panel members who indicated that they wanted to provide comments had done so. Twenty-two Delphi panel members provided suggestions of items that should be included in the publication standards. One of these items, on whether or not the concept of epistemic tradition should be included in a meta-narrative review, caused a degree of disagreement within the project team. As a result, we specifically put this issue to the Delphi panel. We used the suggestions from the panel members and the briefing document as the basis of the online survey for round 2. Round 2 started at the end of November 2011 and ran until early January 2012. Panel members were invited to complete our online survey and asked to rate each potential item for relevance and clarity. A copy of this survey can be found in Appendix 5. Two reminder e-mails were sent to the panel members. Once the panel had completed their survey we analysed their ratings for relevance and clarity (Table 6).
Item | Relevance: response rate (%) | Relevance: mode | Relevance: median | Relevance: interquartile range | Content: response rate (%) | Content: mode | Content: median | Content: interquartile range |
---|---|---|---|---|---|---|---|---|
Title | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 7 | 6–7 |
Abstract | 32/33 (97) | 7 | 7 | 6–7 | 32/33 (97) | 7 | 6.5 | 6–7 |
Rationale for review | 32/33 (97) | 7 | 7 | 6–7 | 32/33 (97) | 7 | 7 | 6–7 |
Objectives and focus of review | 32/33 (97) | 7 | 7 | 6–7 | 32/33 (97) | 7 | 7 | 6–7 |
Changes in the review processa | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 6 | 6–7 |
Rationale for using the meta-narrative approacha | 27/33 (82) | 7 | 7 | 6–7 | 27/33 (82) | 7 | 6 | 5–7 |
Evidence of adherence to guiding principles of meta-narrative review | 31/33 (94) | 7 | 6 | 5–7 | 31/33 (94) | 7 | 6 | 5–7 |
Scoping the literature | 30/33 (91) | 7 | 7 | 6–7 | 30/33 (91) | 7 | 7 | 6–7 |
Searching processes | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 7 | 6–7 |
Selection and appraisal of documents | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 6 | 6–7 |
Data extraction | 30/33 (91) | 7 | 6 | 6–7 | 30/33 (91) | 7 | 6 | 5–7 |
Analysis and synthesis processes | 31/33 (94) | 7 | 6 | 6–7 | 31/33 (94) | 6 | 6 | 5.5–7 |
Document flow diagram | 31/33 (94) | 7 | 7 | 5–7 | 31/33 (94) | 7 | 6 | 4.5–7 |
Document characteristics | 31/33 (94) | 7 | 6 | 5–7 | 30/33 (91) | 7 | 6 | 5–7 |
Main findings | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 7 | 6–7 |
Summary of findings | 31/33 (94) | 7 | 7 | 6–7 | 30/33 (91) | 7 | 7 | 6–7 |
Strengths, limitations and future research directions | 30/33 (91) | 7 | 7 | 6–7 | 30/33 (91) | 7 | 7 | 6–7 |
Comparison with existing literature | 30/33 (91) | 7 | 6 | 5–7 | 30/33 (91) | 7 | 6 | 5–7 |
Conclusion and recommendations | 31/33 (94) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 7 | 6–7 |
Funding | 29/33 (88) | 7 | 7 | 6–7 | 29/33 (88) | 7 | 7 | 6–7 |
After round 2, two items did not achieve a consensus: items 6 and 13. Item 5 had reached a consensus on relevance and content on the numerical scores, but sufficient concerns were raised in the free text that we felt it needed to be amended and returned to the panel (see Table 6). Based on the suggestions made by the panel members, we refined the text for these items. We had asked panel members if they had a preference between the terms meta-narrative review and meta-narrative synthesis. Thirteen (41%) preferred meta-narrative synthesis, six (18%) meta-narrative review and 13 (41%) had no preference. We concluded that the terms meta-narrative synthesis and meta-narrative review are synonymous. In response to the question of whether or not epistemic tradition should be included in a meta-narrative review, 16 (60%) agreed that it should be, four (15%) disagreed and seven (26%) did not know. As a result, we decided that epistemic tradition should be included in meta-narrative reviews and incorporated this into item 6. We also produced a post-round briefing document from round 2, which detailed for each item:
- the response rate
- mode
- median
- interquartile range
- the action we took for each item based on the panel’s ratings
- an anonymised list of all the free-text comments made.
For round 3, we asked the panel to consider again only the items that had been returned after round 2, namely items 5, 6 and 13. Two additional individuals who had initially decided not to respond to round 2 agreed to provide ratings. To ensure consistency, they were briefed on the process and results from round 2. We produced an online survey for round 3 and again asked panel members to rate items 5, 6 and 13 for relevance and clarity. To keep the panel updated, we provided them with our post-round briefing document from round 2 (available on request from authors). Round 3 ran from mid-February to mid-March 2012. A copy of this survey can be found in Appendix 6. Two reminder e-mails were sent to panel members. Once the panel had completed the survey, we analysed their ratings for relevance and clarity (Table 7).
Item | Relevance: response rate (%) | Relevance: mode | Relevance: median | Relevance: interquartile range | Content: response rate (%) | Content: mode | Content: median | Content: interquartile range |
---|---|---|---|---|---|---|---|---|
5. Changes in the review process | 29/35 (83) | 7 | 7 | 6–7 | 29/35 (83) | 7 | 7 | 6–7 |
6. Rationale for using the meta-narrative approach | 31/35 (89) | 7 | 7 | 6–7 | 31/35 (89) | 7 | 7 | 6–7 |
13. Document flow diagram | 32/35 (91) | 7 | 7 | 6–7 | 31/33 (94) | 7 | 6 | 6–7 |
By the end of round 3 a consensus was reached on all items. We produced a post-round briefing document from round 3 and circulated this to all our panel members for the sake of completeness (available on request from authors).
Using the data we gathered from the three rounds of the Delphi panel for meta-narrative reviews, we produced a final set of items to be included in the publication standards for meta-narrative reviews. These were published simultaneously in January 2013 in BMC Medicine91 and the Journal of Advanced Nursing. 92 Our publication standards have also been accepted and listed on the EQUATOR network.
Developing quality standards, teaching and learning resources using real-time refinement
We used a range of sources in real time to help us develop and refine our quality standards and teaching and learning resources. The data came from the following sources:
- JISCMail (www.jiscmail.ac.uk/RAMESES). At the start of the project we set up an e-mail list and membership of this list grew rapidly. As of June 2014, the list has 326 members and there are regular discussions on a range of topics relating to realist and meta-narrative reviews.
- Methodological support to review teams. Over the course of this project the project team members provided differing levels of methodological support to reviewers undertaking realist and meta-narrative reviews. This ranged from answering questions raised by e-mail or on JISCMail to regular face-to-face meetings. The level of support needed differed considerably depending on each team’s initial level of expertise. Table 8 provides an overview of the projects to which we provided more in-depth methodological support, along with brief details of the nature of each type of support. When providing methodological support we contemporaneously made notes on issues and topics that might be relevant to this part of the project. An example of the records we made of our discussions with one such review team can be found in Appendix 7.
Review title | Research question(s) | Funder | Review type | Type of support provided |
---|---|---|---|---|
Risk models and scores for type 2 diabetes: systematic review | What are the different risk scores for identifying adults at risk of type 2 diabetes and which scores work for whom in what circumstances? | Local primary care trusts/London Deanery, UK | Realist synthesis linked to systematic review | One of us (TG), as the lead member of this team, provided the following: Training for all other team members on realist review principles Lead researcher on the realist review, undertaking all data extraction, tabulation, analysis and preparation of draft realist section of a mixed Cochrane-style and realist review. One other team member cross-checked this work Writing up the mixed-method review for British Medical Journal |
Uncovering the benefits of participatory research: implications of a realist review for health research and practice |
|
Canadian Institutes of Health Research, Canada | Realist synthesis | This novice realist review team was supported in the following ways:
|
Realist review of multicomponent interventions to reduce harms of college binge drinking | What were the underlying theories and assumptions about why these programmes work, and what appear to be the mechanisms and associated contextual influences that led to their intended outcomes? | Dartmouth College, USA | Realist synthesis | This novice realist review team was supported in the following ways:
|
How design of places promotes or inhibits mobility of older adults: realist synthesis of 30 years of research | How do characteristics of the environment (place) support mobility and what circumstances appear to facilitate or hinder mobility in older adults? | Centers for Disease Control and Prevention, USA | Realist synthesis | This novice realist review team was supported in the following ways:
|
Systematically synthesizing IMCI implementation: what works for whom and in what circumstances? |
|
The Alliance for Health Policy and Systems Research, Switzerland | Realist synthesis | This moderately experienced realist review team was supported in the following ways:
|
Evidence synthesis on the occurrence, causes, consequences, prevention and management of bullying and harassing behaviours to inform decision-making in the NHS | What is known about the occurrence, causes, consequences and management of bullying and inappropriate behaviour in the workplace? | National Institute for Health Research Health Services and Delivery Research programme | Realist synthesis | This novice realist review team was supported in the following ways:
|
The effective and cost-effective use of intermediate, step-down, hospital at home and other forms of community care as an alternative to acute inpatient care: a realist review | Produce a conceptual framework and summary of the evidence of initiatives that have been designed to provide care closer to home in order to reduce reliance on acute care hospital beds | National Institute for Health Research Health Services and Delivery Research programme | Realist synthesis | This experienced realist review team took part in a 2-day roundtable discussion covering:
|
What are the impacts of preschool feeding programmes for disadvantaged young children? | What is the impact of school feeding on growth and educational attainment in preschool children and what explains the successes, failures and partial successes of such programmes | International Initiative for Impact Evaluation (3ie), USA | Realist synthesis | One of us (TG), as the lead on realist review elements for this review, provided the following:
|
Hospital patient safety: a realist analysis | Examine the introduction of three safety interventions (improving leadership, reducing infection rates, and implementing surgical checklists) in seven hospitals | National Institute for Health Research Health Services and Delivery Research programme | Realist synthesis | E-mail advice for team members on practical application of realist review principles |
The relevance of complexity concepts and systems thinking to public and population health intervention research: a meta-narrative synthesis | Examine a variety of theoretical frameworks that use the concept of complexity science to help understand the social processes and systems of a constantly evolving environment within which public health interventions have to adapt | Canadian Institutes of Health Research, Canada | Meta-narrative review | One of us (TG) was a co-applicant on this study and provided:
|
Mining and Aboriginal community health: impacts and interventions | Address the knowledge gap regarding mining impacts on Aboriginal health through a multidisciplinary knowledge synthesis of material from both academic and professional realms as held by Aboriginal communities, mining firms, governments, consultancies and civil society | Canadian Institutes of Health Research and Social Sciences and Humanities Research Council, Canada | Meta-narrative review | E-mail advice for team members on practical application of meta-narrative review principles |
Workshops
We ran a number of methods training workshops during this project; these are listed in Table 9. Once again we made contemporaneous notes during these workshops; an example of these notes can be found in Appendix 8.
Date | Purpose | Venue |
---|---|---|
2011 March | Realist review training | Queen Mary, University of London, UK |
2011 October | Realist evaluation and review webinar | National Institute for Health Research Health Services and Delivery Research webinar, UK |
2011 October | Meta-narrative review training | Queen Mary, University of London, UK |
2011 November | Meta-narrative review training | McGill University, Montreal, QC, Canada |
2012 March | Realist review training | Karolinska Institute, Stockholm, Sweden |
2012 April | Realist review training | University of Leeds, UK |
2012 April | Realist review training | University of Sheffield, UK |
2012 October | Realist review training | Keele University, UK |
2012 October | Plenary: realist synthesis | University of Southern Denmark, Copenhagen, Denmark |
2012 November | Realist review training | Queen’s University Belfast, UK |
2012 November | Introduction to realism | Global Health Symposium on Health Systems Research, Beijing, China |
2013 March | Realist synthesis and evaluation | Erasmus University, Rotterdam, the Netherlands |
2013 April | Realist review training | University of East Anglia, UK |
2013 June | Realist review training | University of Leeds, UK |
Quality standards
The data from the sources above were channelled and collated contemporaneously by GWo and used to develop initial drafts of the quality standards for researchers using the realist or meta-narrative method. These drafts were circulated within the project team and iteratively refined for content and clarity. Box 1 provides an illustration of how we drew on the data sources to produce the quality standards, using an example for realist syntheses.
As researchers and trainers in RS, we had noted some confusion among researchers about the nature, need and role of realist programme theory (or theories) in realist syntheses. To develop the briefing materials and initial drafts of the reporting standards for realist syntheses, we searched for and analysed a number of published syntheses and found that our impressions were well founded.
When we ran a 1-day conference in March 2011, the issue of the nature, need and role of realist programme theory (theories) in realist syntheses emerged again. In our Delphi process we encouraged participants to provide free-text comments. These closely reflected the comments we received from our 1-day conference.
Development of the quality criteria
We drew on our content expertise in the topic area and the published methodological literature to develop the quality criteria. In addition, some of our Delphi panel participants provided clear indications supporting the criteria we set. For example, we suggested that a RS should develop a programme theory and that one that did not was inadequate. Delphi panel participants’ free-text comments echoed our suggestion:
How could identification of programme theories not be appropriate . . .
. . . it cannot be an RS without candidate [programme] theories.
We were also able to draw on the discussions that took place on JISCMail to find support for some of our criteria. For example, under excellent in our suggested quality standards for programme theory, we suggested that: ‘The relationship between the programme theory and relevant substantive theory is identified.’
As an illustration, a comment from JISCMail that we drew upon to support this criterion was:
In a review, one focus[es] first on what is reported but one can – and probably should, in order to produce some added value – reflect the findings and outcomes of the study under review against the theories and/or best practice that already exist.
For both realist syntheses and meta-narrative reviews, we developed two sets of quality standards, aimed at the following user groups:
- researchers and peer reviewers using these methods
- funders/commissioners of research.
Although the core components of the quality standards we have developed are the same for each of the two ‘versions’ listed above, we have adapted each to make them more focused and useful for the intended users. All the quality standards for realist syntheses and meta-narrative reviews are freely available online. 93
Quality standards for researchers using the methods and peer reviewers
The quality standards for these user groups are set out using rubrics. By peer reviewers here, we specifically refer to individuals who have been asked to appraise the quality of completed reviews. For each review process that requires a judgement about its quality, we have provided a brief description of why the process is important and also descriptors of criteria by which a decision about quality might be reached. The quality standards for realist syntheses for researchers are set out in Table 10, while the quality standards for meta-narrative reviews are presented in Table 11.
Quality standards for RS (for researchers and peer reviewers) | ||||
---|---|---|---|---|
1. The research problem | ||||
Realist synthesis is a theory-driven method that is firmly rooted in a realist philosophy of science and places particular emphasis on understanding causation and how causal mechanisms are shaped and constrained by social context. This makes it particularly suitable for reviews of certain topics and questions, for example complex social programmes that involve human decisions and actions. A realist research question contains some or all of the elements of what works, how, why, for whom, to what extent, in what circumstances, in what respect and over what duration, and applies realist logic to address the question. Above all, realist research seeks to answer the why question. Realist synthesis always has explanatory ambitions. It assumes that programme effectiveness will always be partial and conditional, and seeks to improve understanding of the key contributions and caveats
Criterion | Inadequate | Adequate | Good | Excellent |
The research topic is appropriate for a realist approach | The research topic is:
|
The research topic is appropriate for secondary research. It requires understanding of how and why outcomes are generated and why they vary across contexts | Adequate plus: framing of the research topic reflects a thorough understanding of a realist philosophy of science (generative causation in contexts; mechanisms operating at other levels of reality than the outcomes they generate) |
Good plus: there is a coherent argument as to why a realist approach is more appropriate for the topic than other approaches, including other theory-based approaches |
The research question is constructed in such a way as to be suitable for a RS | The research question is not structured to reflect the elements of realist explanation. For example, it:
|
The research question includes a focus on how and why the intervention, or programme (or similar classes of interventions or programmes – where relevant) generates its outcomes, and contains at least some of the additional elements, ‘for whom, in what contexts, in what respects, to what extent and over what durations’ | Adequate plus: the rationale for excluding any elements of ‘the realist question’ from the research question is explicit the question has a narrow enough focus to be managed within a realist review |
Good plus: the research question is a model of clarity and as simple as possible |
2. Understanding and applying the underpinning principles of realist reviews | ||||
Realist syntheses apply realist philosophy and a realist logic of enquiry. This influences everything from the type of research question to a review’s processes (e.g. the construction of a realist programme theory, searching, data extraction, analysis and synthesis, and recommendations). The key analytic process in realist reviews involves iterative testing and refinement of theoretically based explanations using empirical findings in data sources. The pertinence and effectiveness of each constituent idea is then tested using relevant evidence (qualitative, quantitative, comparative, administrative, and so on) from the primary literature on that class of programmes. In this testing, the ideas within a programme theory are recast and conceptualised in realist terms. Reviewers may draw on any appropriate analytic techniques to undertake this testing
||||
Criterion | Inadequate | Adequate | Good | Excellent |
The review demonstrates understanding and application of realist philosophy and realist logic which underpins a realist analysis | Significant misunderstandings of realist philosophy and/or logic of analysis are evident. Common examples include:
|
Some misunderstandings of realist philosophy and/or logic of analysis exist, but the overall approach is consistent enough that a recognisably realist analysis results from the process | The review’s assumptions and analytic approach are consistent with a realist philosophy at all stages of the review Where necessary, a realist programme theory is developed and tested |
Good plus: review methods, strategies or innovations used to address problems or difficulties within the review are consistent with a realist philosophy of science |
3. Focusing the review | ||||
Because a realist review may generate a large number of avenues that might be explored and explained, and because resources and timescale are invariably finite, it may be necessary to contain a review by progressively focusing both its breadth (how wide an area?) and depth (how much detail?). This important process needs to be considered from the start and may involve iterative rounds of discussion and negotiation with (for example) content experts, funders and/or users. It is typical and legitimate for the review’s objectives, question and/or the breadth and depth of the review to evolve as the review progresses | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The review question is sufficiently and appropriately focused | The review question is too broad to be answerable within the time and resources allocated There is no evidence that progressive focusing occurred as the review was undertaken |
Attempts are made by the review team to progressively focus the review topic in a way that takes account of the priorities of the review and the realities of time and resource constraints Attempts are documented so that they can be described in publications as appropriate |
Adequate plus: the focusing process is iterative. Commissioners of the review are involved in decision-making about focusing. Decisions made about which avenues are pursued and which are left open for further inquiry are recorded and made available to users of the review |
Good plus: the review team draws on external stakeholder expertise to drive the focusing process in order to achieve maximal end-user relevance |
4. Constructing and refining a realist programme theory | ||||
Early in the review, the main ideas that went into the making of a class of interventions (the programme theory – which may or may not be realist in nature) are elicited. This initial programme theory sets out how and why a class of intervention is thought to work to generate the outcome(s) of interest. This initial programme theory then needs to be recast in realist terms (a rough outline of the contexts in which, populations for which, and main mechanisms by which, particular outcomes are expected to be achieved). This initial tentative theory will be progressively refined over the course of the review | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
An initial realist programme theory is identified and developed | A realist programme theory is not offered or; a programme theory is offered, but is not converted to a realist programme theory at any stage of the review | An initial programme theory is identified and described in realist terms (that is, in terms of the relationship between contexts, mechanisms and outcomes) The refined theory is consistent with the evidence provided |
Adequate plus: the initial realist programme theory is set out at the start and will be refined iteratively as the review team’s understanding of the topic grows |
Good plus: the relationship between the programme theory and relevant substantive theory is identified; implications of the final theory for practice, and for refinements to substantive theory where appropriate, are described; the final realist programme theory comprises multiple context–mechanism–outcome (CMO) configurations (describing the ways different mechanisms fire in different contexts to generate different outcomes) and an explanation of the pattern of CMOs |
5. Developing a search strategy | ||||
Searching in a realist review is guided by the objectives and focus of the review, and revised iteratively in the light of emerging data. Searching is directed at finding data that can be used to test theory, and may lie in a broad range of sources that may cross traditional disciplinary, programme and sector boundaries. The search phase is thus likely to involve searching for different sorts of data, or studies from different domains, with which to test different aspects of any provisional theory | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The search process is such that it would identify data to enable the review team to develop, refine and test programme theory or theories | The search is incapable of supporting a rigorous realist review. Common errors include:
|
Searches are driven by the objectives and focus of the review. The search strategy is piloted and refined to check that it is fit for purpose. Documents are sought from a wide range of sources which are likely to contain relevant data for theory development, refinement and testing. There is no restriction on the study or documentation type that is searched for |
Adequate plus: further searches are undertaken in light of greater understanding of the topic area. These searches are designed to find additional data that would enable further theory development, refinement or testing |
Good plus: the searching deliberately seeks out data from situations outside the programme under study where it can reasonably be inferred that the same mechanism(s) might be in operation |
6. Selection and appraisal of documents | ||||
Realist review requires a series of judgements about the relevance and robustness of particular data for the purposes of answering specific questions within the overall review question. An appraisal of the contribution of any section of data (within a document) should be made on two criteria: relevance (whether it can contribute to theory building and/or testing) and rigour (whether or not the methods used to generate the relevant data are credible and trustworthy)
The selection and appraisal stage may need to run in parallel with the analysis stage |
||||
Criterion | Inadequate | Adequate | Good | Excellent |
The selection and appraisal process ensures that sources relevant to the review containing material of sufficient rigour to be included are identified. In particular, the sources identified allow the reviewers to make sense of the topic area; to develop, refine and test theories; and to support inferences about mechanisms | The selection and appraisal process does not support a rigorous and complete realist review. For example:
|
Selection of a document for inclusion into the review is based on what it can contribute to the process of theory development, refinement and/or testing (i.e. relevance). Appraisals of rigour judge the plausibility and coherence of the method used to generate data |
Adequate plus: during the appraisal process, limitations of the method used to generate data are identified and taken into consideration during analysis and synthesis |
Good plus: selection and appraisal demonstrate sophisticated judgements of relevance and rigour within the domain |
7. Data extraction | ||||
In a review, data extraction assists analysis and synthesis. Of particular interest to the realist reviewer are data that support the use of realist logic to answer the review’s question(s), e.g. data on CMO configurations, demi-regularities and middle-range and/or programme theories
Criterion | Inadequate | Adequate | Good | Excellent |
The data extraction process captures the necessary data to enable a realist review | The data extraction process does not capture the necessary data to enable a realist review. For example:
|
Data extraction focuses on identification and elucidation of CMO configurations and refinement of programme theory. Piloting and refinement of the data extraction process has been undertaken where appropriate. Quality control processes are in place to check that all review team members apply common processes and standards in data extraction |
Adequate plus: data extraction processes support later processes of analysis (e.g. by organising data into sets relevant for later analysis). The data extracted are comprehensive enough to identify main CMO patterns |
Good plus: the data extraction process is continually refined as the review progresses, so as to capture relevant data as the review question is focused and/or programme theory is refined |
8. Reporting | ||||
Realist reviews may be reported in multiple formats: lengthy reports, summary reports, articles, websites and so on. Reports should be consistent with the publication standards for RS (see RAMESES publication standards: realist syntheses at http://onlinelibrary.wiley.com/doi/10.1111/jan.12095/full or www.biomedcentral.com/1741-7015/11/21)89,90
Criterion | Inadequate | Adequate | Good | Excellent |
The RS is reported using the items listed in the RAMESES reporting standard for realist syntheses | Key items are missing. For example:
|
Most items reported. In particular the following items should be reported: | All items are reported clearly and in sufficient detail for an external reader to understand and to judge the methods used and the plausibility and coherence of the findings | Good plus: the report is well written and easy to understand. Additional materials are made available for external readers to investigate aspects of the review in more detail |
Quality standards for meta-narrative reviews (for researchers and peer reviewers) | ||||
---|---|---|---|---|
1. The research problem | ||||
Meta-narrative review is a relatively new method of systematic review, designed for topics that have been differently conceptualised and studied by different groups of researchers. To understand the many approaches, reviewers have to consciously and reflexively step out of their own world view, learn some new vocabulary and methods, and try to view a topic through multiple different sets of eyes. An overarching narrative of the different perspectives, based on an increased understanding of them, is produced which highlights what different research teams might learn from one another’s approaches | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The research topic is appropriate for a meta-narrative approach | The research topic is:
|
The research topic is appropriate for secondary research. It would benefit from illumination of how a topic has been conceptualised and studied differently by different groups | Adequate plus:
|
Good plus:
|
The research question is constructed in such a way as to be suitable for a meta-narrative review | The research question is not structured to reflect the elements of meta-narrative explanation. For example, it:
|
The research question includes a focus on how a topic has been conceptualised and studied differently by different groups | Adequate plus:
|
Good plus:
|
2. Understanding and applying the purpose and underpinning principles of meta-narrative reviews | ||||
Meta-narrative review (which is rooted in a constructivist philosophy of science) is inspired by the work of Thomas Kuhn, who observed that science progresses in paradigms. Meta-narrative reviews often look historically at how particular research traditions or epistemic traditions have unfolded over time and shaped the normal science of a topic area. The review seeks first to identify and understand as many as possible of the potentially important different research traditions which have a bearing on the topic. In the synthesis phase, by means of an overarching narrative, the findings from these different traditions are compared and contrasted to build a rich picture of the topic area from multiple perspectives. The goal of meta-narrative review is sensemaking of a complex (and perhaps contested) topic area. During analysis and synthesis, six guiding principles (pragmatism, pluralism, historicity, contestation, reflexivity and peer review) should be used
||||
Criterion | Inadequate | Adequate | Good | Excellent |
The review demonstrates understanding and application of the purpose and principles underpinning a meta-narrative review | Significant misunderstandings of purpose and principles underpinning a meta-narrative review. Common examples include:
|
Some misunderstandings of purpose and principles underpinning a meta-narrative review, but the overall approach is consistent enough that a recognisable set of distinct meta-narratives together with a higher-order synthesis of these results from the process | The review’s assumptions and analytic approach are consistent with the purpose and underpinning principles of a meta-narrative review. In particular, the philosophical position is explicitly constructivist. A sufficient range of paradigms/epistemic traditions has been included to make sense of an unfolding and complex topic area from multiple perspectives and to use contrasts between these as higher-order data |
Good plus:
|
3. Focusing the review | ||||
A meta-narrative review asks some or all of the following questions:
|
||||
Criterion | Inadequate | Adequate | Good | Excellent |
The review question is sufficiently and appropriately focused | The review question is too broad to be answerable within the time and resources allocated There is no evidence that progressive focusing occurred as the review was undertaken |
Attempts were made by the review team to progressively focus the review topic in a way that takes account of the priorities of the review and the realities of time and resource constraints | Adequate plus:
|
Good plus:
|
4. Scoping the literature | ||||
An important process in a meta-narrative review is to identify a sufficiently broad range of sources to be able to build as comprehensive a map as possible of research undertaken on the topic. This scoping step is used to identify in broad terms the different research traditions, situated in different literatures, which have addressed the topic of interest. Initial attempts to make sense of a topic area may involve not just informal browsing of the literature but also consulting with experts and stakeholders | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The scoping of the literature has been sufficiently and appropriately undertaken | The scoping of the literature has been limited and cursory (e.g. only a single source is used – perhaps the MEDLINE database – and/or the review has inappropriately concentrated on a single research tradition, for example evidence-based medicine) | Attempts made to utilise a broad range of relevant sources and to build as comprehensive a map as possible of the research traditions on the topic | Adequate plus:
|
Good plus:
|
5. Developing a search strategy | ||||
Searching in a meta-narrative review is guided by the objectives and focus of the review, and revised iteratively in the light of emerging data. Searching is directed at finding sufficient data to develop and make sense of the relevant research traditions that have been identified, and may lie in a broad range of sources that may cross traditional disciplinary, programme and sector boundaries. This stage is likely to involve searching for different kinds of data in different ways | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The search process is such that it would identify data to enable the review team to develop and refine the map of seminal papers and primary research studies | The search is incapable of supporting the development of a rigorous meta-narrative review. Errors may include:
|
Searches are driven by the objectives and focus of the review and are piloted and refined to check that they are fit for purpose. Documents are sought from a wide range of sources which are likely to contain relevant data on research traditions. There is no predefined restriction on the study or documentation type that is searched for |
Adequate plus:
|
Good plus:
|
6. Selection and appraisal of documents | ||||
Meta-narrative review is not a technical process; rather, it is a process of sensemaking of the literature, selecting and combining data from primary sources to produce an account of how a research tradition unfolded and why, and then (in the synthesis phase) comparing and contrasting findings from these different traditions to build a rich picture of the topic area from multiple perspectives. This process requires a series of judgements about the unfolding of research in particular traditions, and about the relevance and robustness of particular data within those traditions. Meta-narrative review takes its quality criteria from the traditions included in the review. Studies in these separate traditions should be appraised using the quality criteria that a competent peer reviewer in that tradition would choose to use. The description of the selection and appraisal process should be sufficiently detailed to enable a reader to judge how likely it is that researchers inadvertently excluded data that may have significantly altered the findings of the review
||||
Criterion | Inadequate | Adequate | Good | Excellent |
The selection and appraisal process ensures that sources relevant to the review containing material likely to help identify, develop and refine understanding of research traditions are included | The selection and appraisal process does not support a rigorous and complete meta-narrative review. For example:
|
Selection of a document for inclusion into the review is based on what it can contribute to making sense of research traditions. All the key high-quality sources are identified and included in the review and the poor-quality ones accurately excluded |
Adequate plus:
|
Good plus:
|
7. Data extraction | ||||
In a review, data extraction assists analysis and synthesis. Of particular interest to the meta-narrative reviewer are data elements that would contribute to constructing a story of how research on a topic unfolded over time in a particular tradition | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The data extraction process captures the necessary data to enable a meta-narrative review | The data extraction process does not capture the necessary data to enable a meta-narrative review. For example:
|
Data extraction focuses on identification and elucidation of data that inform how research on a topic unfolded over time in a particular tradition. Piloting and refinement of the data extraction process is undertaken where appropriate. Quality control processes are in place to check that all review team members apply common processes and standards in data extraction |
Adequate plus:
|
Good plus:
|
8. Synthesis phase | ||||
Having identified the individual meta-narratives, the next phase in a meta-narrative review is to compare and contrast these to generate higher-order data (e.g. to identify and explain conflicting findings) | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
The meta-narrative should include a synthesis phase where philosophical, conceptual, methodological and empirical differences between traditions are discussed and explained | The synthesis phase is missing or fails to engage with the underlying philosophical, conceptual or theoretical contrasts between traditions | Some attempt is made to show how different groups of researchers produced different findings as a result of different philosophical assumptions, different ways of conceptualising the topic, different theoretical explanations or different study designs and methods | Adequate plus:
|
Good plus:
|
9. Reporting | ||||
Meta-narrative reviews may be reported in multiple formats – lengthy reports, summary reports, articles, websites and so on. Reports should be consistent with the publication standards for meta-narrative reviews (see RAMESES publication standards: meta-narrative reviews91,92) |
Criterion | Inadequate | Adequate | Good | Excellent |
The meta-narrative review is reported using the items listed in the relevant RAMESES reporting standard | Key items are missing. For example:
|
Most items reported. In particular the following items should be reported:
|
All items are reported clearly and in sufficient detail for an external reader to understand and to judge the methods used and the plausibility and coherence of the findings | Good plus:
|
As an illustrative example to explain the layout of these quality standards: in the quality standard for focusing the review (see Table 10, item 3) for realist syntheses, this aspect of the review would be judged adequate if attempts are made by the review team to progressively focus the review topic in a way that takes account of the priorities of the review and the realities of time and resource constraints. For this aspect of a review to be judged good, we recommend that, as well as fulfilling the criteria for adequate (hence our use of the term adequate plus), reviews would need to ensure (among other things) that the focusing process is iterative.
Quality standards for funders/commissioners of research
As more and more realist syntheses and meta-narrative reviews are funded or commissioned, decision-makers and peer reviewers at this stage need to make judgements in two broad areas: proposed review processes and methodological expertise. We appreciate that many funding bodies and commissioners will already have processes in place to guide the peer reviewers they appoint. As such, we see the guidance we have produced not as a replacement for, but as a supplement to, any existing organisational peer-review processes and guidance.
The quality standards for realist syntheses for funders/commissioners of research are set out in Table 12. Those for meta-narrative reviews are in Table 13. These have been abridged and adapted from their respective counterparts in Tables 10 and 11 to better suit the needs of this user group.
Quality standards for RS (for funders/commissioners of research) | ||||
---|---|---|---|---|
1. The research problem | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is the research topic appropriate for a realist approach? | Research topic:
|
Research topic:
|
Adequate plus:
|
Good plus:
|
Is the research question constructed in such a way as to be suitable for a RS? | The research question is not structured to reflect the elements of realist explanation | The research question includes a focus on how and why the intervention, or programme, generates its outcomes and contains at least some of the additional elements, for whom, in what contexts, in what respects, to what extent and over what durations | Adequate plus:
|
Good plus:
|
2. Understanding and applying the underpinning principles of realist reviews | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Does the review team demonstrate understanding and application of realist philosophy and realist logic which underpins a realist analysis? | Significant misunderstandings of realist philosophy and/or logic of analysis are evident | Some misunderstandings of realist philosophy and/or logic of analysis exist, but the overall approach is consistent enough that a recognisably realist analysis results from the process |
|
Good plus:
|
3. Focusing the review | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is, or will, the review question be sufficiently and appropriately focused? |
|
Process proposed enables the review team to progressively focus the review topic in a way that takes account of the priorities of the review and the realities of time and resource constraints | Adequate plus:
|
Good plus:
|
4. Constructing and refining a realist programme theory | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Does the review team plan to identify, develop and refine their initial realist programme theory? | There are no plans to identify, develop and refine a realist programme theory | There are plans to identify, develop and refine a realist programme theory | Adequate plus:
|
Good plus – there are plans to:
|
5. Developing a search strategy | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is the proposed search process such that it would identify data to enable the review team to develop, refine and test programme theory or theories? | The search is incapable of supporting a rigorous realist review | The proposed searches will:
|
Adequate plus:
|
Good plus:
|
6. Selection and appraisal of documents | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the selection and appraisal process ensure that documents of relevance to the review containing material of sufficient rigour to be included are identified? | The proposed selection and appraisal process does not support a rigorous and complete realist review | Selection of a document for inclusion will be based on:
|
Adequate plus:
|
As for Good |
7. Data extraction | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the data extraction process capture the necessary data to enable a realist review? |
|
The data extraction processes will:
|
Adequate plus:
|
Good plus:
|
8. Reporting | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the review team use the items listed in the RAMESES reporting standard for realist syntheses when reporting their RS? | No information provided | RAMESES reporting standard for realist syntheses will be used for reporting | Adequate plus:
|
As for Good |
Quality standards for meta-narrative reviews (for funders/commissioners of research) | ||||
---|---|---|---|---|
1. The research problem | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is the research topic appropriate for a meta-narrative approach? | Research topic:
|
Research topic:
|
Adequate plus:
|
Good plus:
|
Is the research question constructed in such a way as to be suitable for a meta-narrative review? | The research question is not structured to reflect the elements of meta-narrative explanation | The research question includes a focus on how a topic has been conceptualised and studied differently by different groups | Adequate plus:
|
Good plus:
|
2. Understanding and applying the purpose and underpinning principles of meta-narrative reviews | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Does the review team demonstrate an understanding and application of the purpose and principles underpinning a meta-narrative review? | Significant misunderstandings of the purpose and principles underpinning a meta-narrative review | Some misunderstandings of the purpose and principles underpinning a meta-narrative review, but the overall planned approach is consistent enough that a recognisable set of distinct meta-narratives together with a higher-order synthesis of these is likely to result from the process |
|
Good plus:
|
3. Focusing the review | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is, or will, the review question be sufficiently and appropriately focused? |
|
Attempts will be made by the review team to progressively focus the review topic in a way that takes account of the priorities of the review and the realities of time and resource constraints | Adequate plus:
|
Good plus:
|
4. Scoping the literature | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Has sufficient and appropriate scoping of the literature been planned? | The planned scoping of the literature appears to be limited and cursory | Attempts will be made to utilise a broad range of relevant sources and to build as comprehensive a map as possible of the research traditions on the topic | Adequate plus:
|
Good plus:
|
5. Developing a search strategy | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Is the proposed search process such that it would identify data to enable the review team to develop and refine the map of seminal papers and primary research studies? | The planned search is incapable of supporting the development of a rigorous meta-narrative review | The proposed searches will:
|
Adequate plus:
|
As for Good |
6. Selection and appraisal of documents | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the selection and appraisal process ensure that sources relevant to the review containing material likely to help identify, develop and refine understanding of research traditions are included? | The selection and appraisal process will not support a rigorous and complete meta-narrative review | Selection of a document for inclusion into the review will:
|
Adequate plus:
|
As for Good |
7. Data extraction | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the data extraction process capture the necessary data to enable a meta-narrative review? | The data extraction process will not capture the necessary data to enable a meta-narrative review | Data extraction processes will:
|
Adequate plus:
|
Good plus:
|
8. Synthesis phase | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will synthesis of the meta-narratives include discussion and explanation of the philosophical, conceptual, methodological and empirical differences between traditions? | A synthesis phase:
|
The planned synthesis phase will attempt to show how different groups of researchers produced different findings as a result of different philosophical assumptions, ways of conceptualising the topic, theoretical explanations or study designs and methods | Adequate plus:
|
As for Good |
9. Reporting | ||||
Criterion | Inadequate | Adequate | Good | Excellent |
Will the review team use the items listed in the RAMESES reporting standard for meta-narrative reviews when reporting their meta-narrative review? | No information provided | RAMESES reporting standard for meta-narrative reviews will be used for reporting | Adequate plus:
|
As for Good |
Teaching and learning resources
We developed teaching resources for both realist and meta-narrative reviews. The challenge we faced when tackling this task was that both methods were relatively new and, as yet, only limited methodological development had taken place. As a project team, we discussed at length, and repeatedly, what our fellow researchers might find helpful, aware that time and resources were limited. We took our inspiration for what kind of teaching and learning resources to produce from the feedback we had obtained from fellow realist researchers on a paper a member of our project team (RP) had co-authored on realist evaluation. 94 The feedback was that focusing on areas that researchers found challenging, and teaching through examples from the literature, was helpful. We decided to adopt this format for both sets of our training materials – namely, to focus on the aspects of each review method that researchers found the most challenging and to illustrate both good and bad practice with examples from the published literature. From our analysis of the data we gathered from our various sources (see Chapter 4, Literature search and Delphi panel and Developing quality standards, teaching and learning resources using real-time refinement) we identified the specific review method issues that fellow researchers found the most challenging and focused our teaching and learning materials on these.
For realist reviews, the challenging issues we covered were:
- Focusing reviews. Because a RS will generate a large number of avenues that might be explored and explained, and because resources and timescale are invariably finite, it may be necessary to contain a review. Many different aspects of a realist review might need to be focused. Focusing may also take place at different time points in the review process.
- Programme theory. Realist synthesis has most often been used to make sense of complex interventions. These interventions or programmes often have multiple components (which interact in non-linear ways), outcomes (some intended and some not) and long pathways to the desired outcome(s). The term programme theory refers to an abstracted description and/or diagram that lays out what a programme (or family of programmes or intervention) comprises and how it is expected to work. Programme theory serves two main functions in a RS. The first is to sketch the terrain that will be investigated and, in the process, to assist in refining the elements and scope for the review. The second is to provide a structure for review findings.
- Developing a search strategy. What constitutes the right evidence is different in a RS from other forms of review. Data that may usefully contribute to a RS are:
  - decided not by research type (e.g. randomised controlled trial) but by relevance to the review question
  - not restricted to research into or evaluations of programmes per se, but related to the programme theory that underpins the programme
  - not necessarily about the whole research question, but relevant to a subsection of it
  - drawn not necessarily from a whole text/document, but from a subsection of it relevant to a particular aspect of the review question
  - able to shed light on any aspect of context (C), mechanism (M) or outcome (O) for any element of the theory
  - different for theory building (which does not need to be as rigorous) as opposed to theory testing (which needs to be sufficiently rigorous to support the conclusions being drawn for the review).
- Selection and appraisal of documents. Realist synthesis requires a series of judgements about the relevance and robustness of particular data items for the purposes of answering a specific question. A wide range of documents may contain data that contribute to a RS. Hence, rejecting a document on a global assessment of its methodological quality is illogical. Instead, inclusion and exclusion decisions are based on two criteria:
  - relevance – whether it can contribute to theory building and/or testing
  - rigour – whether or not the methods used to generate the relevant data are credible and trustworthy.
- Applying realist principles in analysis. The basic analytic task in a realist review is to find and align the evidence to demonstrate that particular mechanisms generate particular outcomes and to demonstrate which aspects of context matter. Working from the basic analytic structure described above, it follows that relevant mechanisms cannot be identified without reference to outcomes (mechanisms are what cause outcomes) and that relevant aspects of context cannot be identified without reference to mechanisms. An ideal RS provides evidence for outcomes, evidence to support the existence of the hypothesised mechanisms, evidence that those mechanisms cause those outcomes, evidence that features of context exist and evidence that those features of context affect whether and which mechanisms fire. (A minimal illustrative sketch of this evidence-alignment structure follows this list.)
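To make the evidential structure described in the final point concrete, the sketch below represents a single CMO configuration as a record that ties each causal claim to its supporting evidence. This is one possible illustration in Python, not a data structure prescribed by realist synthesis or used in this project; all field names and the example programme are hypothetical.

```python
# Illustrative sketch only: one possible representation of a
# context-mechanism-outcome (CMO) configuration and the evidence behind it.
from dataclasses import dataclass, field

@dataclass
class CMOConfiguration:
    context: str      # circumstances in which the mechanism is expected to fire
    mechanism: str    # hypothesised generative mechanism
    outcome: str      # outcome the mechanism is claimed to produce
    evidence_for_outcome: list = field(default_factory=list)
    evidence_for_mechanism: list = field(default_factory=list)
    evidence_for_context: list = field(default_factory=list)

    def is_fully_evidenced(self) -> bool:
        """A configuration is only as strong as its weakest evidential link."""
        return all([self.evidence_for_outcome,
                    self.evidence_for_mechanism,
                    self.evidence_for_context])

# Hypothetical example for a reminder-based medication adherence programme
cmo = CMOConfiguration(
    context="patients with stable long-term conditions",
    mechanism="reminders reduce unintentional forgetting",
    outcome="improved medication adherence",
    evidence_for_outcome=["trial report (placeholder citation)"],
    evidence_for_mechanism=["qualitative study (placeholder citation)"],
    evidence_for_context=[],  # gap: no evidence yet that this context matters
)
print("fully evidenced:", cmo.is_fully_evidenced())  # False - context link unsupported
```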
The challenging issues faced by meta-narrative reviewers were very similar and concerned:
- Understanding and applying the underpinning principles of meta-narrative reviews. Meta-narrative review (which is rooted in a constructivist philosophy of science) was inspired by the work of Thomas Kuhn, who observed that science progresses in paradigms. 15 Meta-narrative reviews often look historically at how particular research traditions or epistemic traditions have unfolded over time and shaped the normal science of a topic area. The review seeks first to identify and understand as many as possible of the potentially important different research traditions that have a bearing on the topic. In the synthesis phase, by means of an overarching narrative, the findings from these different traditions are compared and contrasted to build a rich picture of the topic area from multiple perspectives. The goal of meta-narrative review is sensemaking of a complex (and perhaps contested) topic area.
- Focusing reviews. Because a meta-narrative review will generate a large number of avenues that might be explored and explained, and because resources and timescale are invariably finite, it may be necessary to contain a review. Many different aspects of a meta-narrative review might need to be focused. Focusing may also take place at different time points in the review process.
- Finding the most relevant evidence. Three specific processes will help the meta-narrative reviewer find the most relevant evidence:
  - scoping the literature
  - developing and pursuing a search strategy
  - selecting and appraising the documents.
Through an iterative cycle of drafts, feedback on drafts and revisions, we developed the final structure of our teaching materials. We drew on our collective experiences in teaching and learning, as well as our knowledge of the educational literature, to develop these materials. A particular challenge we faced when developing these materials was deciding who our exact audience would be, i.e. the novice, intermediate or advanced reviewer. We finally decided to focus on providing materials for the more novice end of this spectrum, as many of the enquiries for help we received (as a project team) came from novice review teams, and we and our fellow realist and meta-narrative reviewers had identified that capacity building was a real and significant issue. Each of the teaching and learning resources has a similar structure and covers:
- objectives
- an explanation of why the topic area is important to get right
- what would constitute high quality for this topic area
- one or more worked examples (drawn from the published literature) of how the topic area in a review might be improved
- example(s) from the published literature of how the topic area has been tackled successfully
- learning activities (realist review only)
- reflection activities.
A list of suggested further reading and resources is provided within each of the teaching and learning materials documents. The teaching and learning materials for RS and meta-narrative reviews are in Appendices 9 and 10, respectively, and are freely available online. 93
In addition to these teaching and training materials, one of our project team (Professor Ray Pawson) has written a book on realist research methods that also provides more in-depth discussions on various aspects of realist review. 95
Chapter 5 Discussion
In this project we have developed publication standards, quality standards, and teaching and learning resources for realist and meta-narrative reviews. Both are relatively new systematic review methods in health services research. Realist and meta-narrative reviews offer great promise in unpacking the ‘black box’ of the many complex interventions increasingly being used to improve health and patient outcomes. We see this project as the start of a long journey towards advancing the rigour with which realist and meta-narrative reviews are carried out and reported.
Both realist and meta-narrative reviews grew out of the increasing call for secondary research methods to address issues around the implementation of interventions. 96 They are not the only review methods that try to address this challenge; other examples include meta-ethnography, grounded theory, thematic synthesis, textual narrative synthesis, meta-study, critical interpretive synthesis, ecological triangulation and framework synthesis. One unintended consequence of this growth in possible review methods is that there may now be too much choice, and it is not immediately apparent which method should be used and when. 5 A detailed discussion of this issue is beyond the scope of this report, but excellent resources exist that may help in the choice of review methods. 97,98
As relatively experienced users of these methods, we had noted a number of common and recurrent challenges facing grant-awarding bodies, peer reviewers, reviewers and users. These centred on two closely related questions: how to judge whether a realist or meta-narrative review, or a proposal for such a review, is of high quality (including, for completed reviews, how credible and robust the findings are), and how to undertake such reviews. Our experience suggests that we can go a long way towards answering these questions by developing resources that help fellow reviewers give due consideration to the theoretical and conceptual underpinnings of realist and meta-narrative reviews, outlined briefly below.
Realist review is based on a realist philosophy of science, which permeates and informs its underlying epistemological assumptions, methodology and quality considerations. Meta-narrative review takes a more constructivist philosophical position, though it is compatible with approaches which propose the existence of a social reality independent of our constructions of it. The meta-narrative approach seeks to tease out and explore the full range of philosophical positions represented in the primary literature.
One of the most common misapplications we have noted is that reviewers have not always appreciated the underlying philosophical basis of these review methods (and the implications of this for how the review should be conducted). Instead, they have based their reviews explicitly or implicitly on fundamentally different philosophical assumptions – most commonly the positivist notion that generalisable truths are best generated from controlled experiments, especially randomised trials.
Even when a realist philosophy of science has been adhered to in a realist review, reviewers – ourselves included – often struggle with recurring conceptual and methodological issues. Mechanisms present a particular challenge in realist review – how to define them, where to locate them, how to identify them and how to test and refine them. Both review methods trade on the use of theoretical explanations to make sense of the observed data. Realist reviewers commonly grapple with how to define a theory (what, for example, is the difference between a programme theory and a middle-range theory?) and what level of abstraction is appropriate in different circumstances. On a more pragmatic level, those who seek to produce theory-driven reviews of heterogeneous topic areas wrestle with a broad range of ‘how-to’ issues: how to define the scope of the review; how and to what extent to refine this scope as the review unfolds; what literature(s) to search and how; how to critically appraise what is often a very diverse sample of primary studies; how to collate, analyse and synthesise findings; and how to make recommendations that are academically defensible and useful to policy-makers. We believe that the resources we have produced from this project will go some way towards addressing these challenges.
In undertaking this project we faced one main dilemma: how best to allocate time and resources across the multiple work packages. For example, we could easily have spent more time on our narrative review, but this would potentially have been at the expense of our Delphi panels, our support to review teams or the development of teaching materials. In retrospect our project was very ambitious in its aims and, as such, we had to prioritise some aspects of the project above others. For example, we felt it was more important to devote time to getting our Delphi process right, so that we had a solid consensus on which to build our quality and publication standards and (to a lesser extent) our teaching materials. This meant that our narrative review had to be rapid and truncated (see Chapter 3, Details of literature search methods and Chapter 4, Literature search for more details). Another example of prioritisation was in the breadth and depth of our training materials. Entire textbooks could be written on these topics, but we chose instead to focus on common challenges. Our hope is that we have started the journey of addressing the issues that surround all new methods – namely, how do you judge quality, how do you report it and how do you do X, Y or Z. We do, however, fully accept that more is needed and so have provided recommendations in Research recommendations and implications for practice.
Changes to protocol
Near the start of this project we published our project protocol. 99 During the course of the project we varied two aspects of our protocol, as described in the sections below.
Real-time piloting of the provisional standards, guidance and training materials
Our intention had been that, over the 27-month duration of this study, we would recruit two cohorts of review teams. With the first cohort, we would use provisional standards, guidance and training materials developed from our initial review of the literature. With the second cohort, we would pilot the standards, guidance and training materials that had been produced and refined via the Delphi process. After following the two cohorts of review teams through their reviews, we would then further revise the outputs as a master document before considering how to modify these for different audiences.
However, a number of issues made our plans impractical and potentially misleading. Firstly, it was not immediately apparent from our literature review what the main methodological and training challenges were. Secondly, we had no control over when review teams wanted us to provide them with methodological support. It was, therefore, difficult for us to assemble the necessary cohorts and have our initial drafts ready. We found ourselves providing methodological support almost continuously, but to different teams at different times and on a wide range of different methodological aspects. Fixing clear starting and finishing points for any cohort we could assemble was impossible if we wanted to be responsive to the needs of review teams. Finally, we noted early in our project that, while the literature was useful in helping us to identify methodological and training challenges, our fellow reviewers were a richer source: review teams possessed an invaluable store of knowledge about the challenges they faced. As we supported review teams, communicated by e-mail and met them at conferences and workshops, we were able to gather more and more information about what they found really challenging. We therefore decided that iterative refinement (building on the gradually accumulating experiences of fellow researchers) might prove a more fruitful way of developing our resources than our original plan.
Fishbowl exercise
Approximately halfway through the study period, we had planned to present our emerging findings formally to a panel of external researchers in order to gather additional feedback. We had planned this event as a precaution against any groupthink within our project team. We discussed the need for such an event with our project steering group, especially given that we had been able to recruit what we considered to be a very diverse range of individuals to our Delphi panels. With the agreement of our project steering group, we decided that there was little merit in holding such an event.
Limitations
To develop the briefing materials for our Delphi panels we undertook a narrative review. This review has limitations that are likely to have introduced a number of biases and so – potentially at least – limit the inferences that can be made from the included reviews. For example, the search process, despite being developed by an expert librarian, was not exhaustive. All the screening for inclusion and exclusion was undertaken by one screener and no quality checks were undertaken; we are therefore likely to have missed some reviews. Once reviews had been included, data extraction was undertaken by one researcher, and omissions in data extraction are likely to have occurred. However, all the included reviews and the data extraction spreadsheet were circulated to all project team members, so a degree of informal quality checking did occur. Deciding what should be included in the Delphi panels’ briefing materials was undertaken by the entire project team. We are aware that items and topics were included in the briefing materials as a result of our subjective interpretations, raising questions about reproducibility. However, the briefing documents we produced were not an end product in themselves, but the starting point for the Delphi panels to build a consensus. As such, we expected changes to occur as we ran each round of the Delphi process and so were not unduly concerned that omissions arising from the review’s limitations would have a large impact on the final publication and quality standards.
We recognise that there is much more to cover in terms of the breadth and depth of the teaching and learning resources we have produced. Because realist and meta-narrative reviews are both relatively new review methods, the wish list we were able to elicit from fellow reviewers using these methods is quite long. Given the time and resources allocated to this project, we elected to provide depth rather than breadth, focusing on the issues that were most challenging. With time, we hope to use the community of practice we have developed to address more and more methodological challenges.
As experience with the use of these methods grows, it is very likely that the resources we have produced will need updating. We welcome and invite methodological development in realist and meta-narrative reviews, and we expect what we have produced to be gradually refined as such developments take place. We therefore view the publication and quality standards and the teaching and learning resources as a starting point rather than as definitive resources that must not be altered in any way.
We are aware that both realist and meta-narrative reviews are used for secondary research on a wide range of topics and by reviewers from a broad range of disciplines. The level of expertise of the users of our resources will also vary considerably, from novice to seasoned reviewer. These two aspects mean that some latitude is needed in the use of the resources we have produced. For example, not all of the publication standards’ items will be applicable when reporting every review; similarly, when assessing the quality of a review, there may be justifiable reasons for a review not to meet some quality criteria. We have tried to anticipate the varied uses to which realist and meta-narrative reviews might be put by providing a degree of flexibility in our standards. For example, in our publication standards, if adaptations are made to the review method (as originally described), then reviewers are invited to provide an explanation for any such adaptations.
Research recommendations and implications for practice
As with many quality and reporting standards, there is a dearth of research demonstrating that such standards change practice and improve the quality of research. 18 This is also true of the standards we have produced; research demonstrating a change in practice and an improvement in the quality of realist and meta-narrative reviews is therefore needed.
As experience with realist and meta-narrative reviews grows and more are undertaken, new methodological insights are likely to emerge. These need to be captured and analysed to determine whether the quality and publication standards we have produced remain fit for purpose or need to be updated. Ideally, further funding might enable a project similar to this one (i.e. RAMESES II) to update the standards, though, as much groundwork has already been done, a more truncated project may suffice.
Our training materials focus on the processes that we were able to identify as those fellow reviewers found most challenging to execute. There are additional processes that we have not covered, and further work is needed to identify these. We chose to produce learning materials that teach through examples from published reviews and through learning and reflection activities. These materials have not been formally evaluated and are likely to benefit from iterative cycles of evaluation and updating based on the findings.
Finally, both realist and meta-narrative reviews are relatively new approaches and, as with any new approach, capacity building is an issue. This project has enabled the project team to support and build capacity among a number of researchers and to set up an e-mail mailing list to bring researchers together. A pressing need for the future is to maintain the momentum generated by this project. To this end, the JISCMail e-mail list continues to run, and we invite any researchers interested in either method to join us in helping to build capacity.
Chapter 6 Conclusion
In conclusion, while realist and meta-narrative reviews hold much promise for developing theory and informing policy on some of the health sector’s most pressing questions, misunderstandings and misapplications of these methods are common. To address these problems we have produced publication and quality standards, and teaching and learning materials. We hope that our resources will be the start of an iterative journey of refinement and development of better resources for realist and meta-narrative reviews. Acknowledging that research should never be static, the RAMESES project does not seek to produce the last word on this topic, but to capture current expertise and establish an agreed state of the science on which future researchers will no doubt build.
Acknowledgements
We wish to thank the following:
- Project steering group – Drs C. Mead, J. Lancaster and D. Noble.
- Literature searching – Jeanette Buckingham.
For their feedback, comments and time:
- All contributors to the RAMESES JISCMail.
- All Delphi panel members (see Appendix 11).
- All the review teams we supported.
Contributions of authors
Dr Geoff Wong is a co-investigator in the RAMESES project and is a senior lecturer in primary health care at Queen Mary University of London, London, UK.
Professor Trish Greenhalgh is the principal investigator on the RAMESES project and is Professor of Primary Health Care and Dean for Research Impact at Queen Mary University of London, London, UK.
Dr Gill Westhorp is an evaluation expert and has used both the realist evaluation and synthesis methods on a range of projects internationally. She is the Director of Community Matters, a research consultancy in Adelaide, SA, Australia.
Professor Ray Pawson is a co-investigator in the RAMESES project and is Professor of Social Research Methodology at the University of Leeds, Leeds, UK.
Geoff Wong carried out the literature review with the help of Jeanette Buckingham (Librarian, John W. Scott Health Science Library, University of Alberta, AB, Canada). Geoff Wong, Trish Greenhalgh, Gill Westhorp and Ray Pawson analysed the findings from the review and produced the materials for the Delphi panel. They also analysed the results of the Delphi panel and real-time data to produce the publication standards, the methodological guidance and teaching and learning materials. Geoff Wong, Trish Greenhalgh, Gill Westhorp and Ray Pawson conceived of the study and participated in its design. Geoff Wong co-ordinated the study and ran the Delphi panels. All authors read and approved the final manuscript.
Disclaimers
This report presents independent research funded by the National Institute for Health Research (NIHR). The views and opinions expressed by authors in this publication are those of the authors and do not necessarily reflect those of the NHS, the NIHR, NETSCC, the HS&DR programme or the Department of Health. If there are verbatim quotations included in this publication the views and opinions expressed by the interviewees are those of the interviewees and do not necessarily reflect those of the authors, those of the NHS, the NIHR, NETSCC, the HS&DR programme or the Department of Health.
Publications
Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: meta-narrative reviews. BMC Med 2013;11:20.
Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: realist syntheses. BMC Med 2013;11:21.
Jagosh J, Macaulay AC, Salsberg J, Bush PL, Henderson J, Sirett E, et al. Uncovering the benefits of participatory research: Implications of a realist review for health research and practice. Milbank Q 2012;90:311–46.
Noble D, Mathur R, Dent T, Meads C, Greenhalgh T. Risk models and scores for type 2 diabetes: systematic review. BMJ 2011;343:d7163.
References
- Berwick DM. The science of improvement. JAMA 2008;299:1182-4. http://dx.doi.org/10.1001/jama.299.10.1182.
- Lavis JN. How can we support the use of systematic reviews in policymaking? PLOS Med 2009;6. http://dx.doi.org/10.1371/journal.pmed.1000141.
- Mays N, Pope C, Popay J. Systematically reviewing qualitative and quantitative evidence to inform management and policy-making in the health field. J Health Serv Res Policy 2005;10:6-20. http://dx.doi.org/10.1258/1355819054308576.
- Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review – a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy 2005;10:21-34. http://dx.doi.org/10.1258/1355819054308530.
- Barnett-Page E, Thomas J. Methods for the synthesis of qualitative research: a critical review. BMC Med Res Methodol 2009;9. http://dx.doi.org/10.1186/1471-2288-9-59.
- Dixon-Woods M, Agarwal S, Jones D, Young B, Sutton A. Synthesising qualitative and quantitative evidence: a review of possible methods. J Health Serv Res Policy 2005;10:45-53. http://dx.doi.org/10.1258/1355819052801804.
- Lucas PJ, Baird J, Arai L, Law C, Roberts HM. Worked examples of alternative methods for the synthesis of qualitative and quantitative research in systematic reviews. BMC Med Res Methodol 2007;7. http://dx.doi.org/10.1186/1471-2288-7-4.
- Oxman AD, Schunemann HJ, Fretheim A. Improving the use of research evidence in guideline development: 8. Synthesis and presentation of evidence. Health Res Policy Syst 2006;4. http://dx.doi.org/10.1186/1478-4505-4-20.
- Popay J, Rogers A, Wiliams G. Rationale and standards for the systematic review of qualitative literature in health services research. Qual Health Res 1998;8:341-51. http://dx.doi.org/10.1177/104973239800800305.
- Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol 2008;8. http://dx.doi.org/10.1186/1471-2288-8-45.
- Tricco AC, Tetzlaff J, Moher D. The art and science of knowledge synthesis. J Clin Epidemiol 2011;64:11-20. http://dx.doi.org/10.1016/j.jclinepi.2009.11.007.
- Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, et al. Rapid reviews versus full systematic reviews: an inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care 2008;24:133-9. http://dx.doi.org/10.1017/S0266462308080185.
- Pawson R. Evidence-based policy: the promise of ‘realist synthesis’. Evaluation 2002;8:340-58. http://dx.doi.org/10.1177/135638902401462448.
- Pawson R. Evidence-based Policy: a Realist Perspective. London: Sage; 2006.
- Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O, Peacock R. Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Soc Sci Med 2005;61:417-30. http://dx.doi.org/10.1016/j.socscimed.2004.12.001.
- Moher D, Hopewell S, Schulz K, Montori V, Gotzsche P, Devereaux P, et al. CONSORT 2010 Explanation and Elaboration: updated guidelines for reporting parallel group randomised trials. BMJ 2010;340. http://dx.doi.org/10.1136/bmj.c869.
- AGREE Collaboration. Development and validation of an international appraisal instrument for assessing the quality of clinical practice guidelines: the AGREE project. Qual Saf Health Care 2003;12:18-23. http://dx.doi.org/10.1136/qhc.12.1.18.
- Liberati A, Altman D, Tetzlaff J, Mulrow C, Gotzsche P, Ioannidis J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ 2009;339. http://dx.doi.org/10.1136/bmj.b2700.
- Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S, SQUIRE Development Group. Publication guidelines for improvement studies in health care: evolution of the SQUIRE project. Ann Intern Med 2008;149:670-6. http://dx.doi.org/10.7326/0003-4819-149-9-200811040-00009.
- Weiss C, Connell JP, Kubisch AC, Schorr LB, Weiss CH. New Approaches to Evaluating Community Initiatives: Concepts, Methods, and Contexts. Washington, DC: Aspen Institute; 1995.
- Astbury B, Leeuw F. Unpacking black boxes: mechanisms and theory building in evaluation. Am J Eval 2010;31:363-81. http://dx.doi.org/10.1177/1098214010371972.
- Greenhalgh T, Kristjansson E, Robinson V. Realist review to understand the efficacy of school feeding programmes. BMJ 2007;335:858-61. http://dx.doi.org/10.1136/bmj.39359.525174.AD.
- Kristjansson E, Robinson V, Petticrew M, Macdonald B, Krasevec J, Janzen L, et al. School feeding for improving the physical and psychosocial health of disadvantaged elementary school children. Cochrane Database Syst Rev 2007;1.
- Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organisations: systematic literature review and recommendations for future research. Milbank Q 2004;82:581-629. http://dx.doi.org/10.1111/j.0887-378X.2004.00325.x.
- Best A, Terpstra J, Moor G, Riley B, Norman C, Glasgow R. Building knowledge integration systems for evidence-informed decisions. J Health Organ Manag 2009;23:627-41. http://dx.doi.org/10.1108/14777260911001644.
- Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist synthesis – an introduction. London: ESRC; 2004.
- MacLure M. ‘Clarity bordering on stupidity’: where’s the quality in systematic review? J Educ Policy 2005;20:393-416. http://dx.doi.org/10.1080/02680930500131801.
- Popper K. The Logic of Scientific Discovery. Vienna: Verlag von Julius Springer; 1935.
- Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs 2000;32:1008-15.
- Campbell S, Roland M, Shekelle P, Cantrill J, Buetow S, Cragg D. Development of review criteria for assessing the quality of management of stable angina, adult asthma, and non-insulin dependent diabetes mellitus in general practice. Qual Health Care 1999;8:6-15. http://dx.doi.org/10.1136/qshc.8.1.6.
- Hsu C-C, Sandford B. The Delphi Technique: Making Sense of Consensus. Pract Assess Res Eval 2007;12.
- Keeney S, Hasson F, McKenna HP. A critical review of the Delphi technique as a research methodology for nursing. Int J Nurs Stud 2001;38:195-200. http://dx.doi.org/10.1016/S0020-7489(00)00044-4.
- Washington D, Bernstein S, Kahan J, Leape L, Kamberg C, Shekelle P. Reliability of clinical guideline development using mail-only versus in-person expert panels. Med Care 2003;41:1374-81. http://dx.doi.org/10.1097/01.MLR.0000100583.76137.3E.
- Russell J, Elton L, Swinglehurst D, Greenhalgh T. Using the online environment in assessment for learning: a case-based study of a web-based course in primary care. Assess Eval Higher Educ 2006;31:465-78. http://dx.doi.org/10.1080/02602930600679209.
- Elwyn G, O’Connor A, Stacey D, Volk R, Edwards A, Coulter A, et al. Developing a quality criteria framework for patient decision aids: online international Delphi consensus process. BMJ 2006;333. http://dx.doi.org/10.1136/bmj.38926.629329.AE.
- Greenhalgh T, Wengraf T. Collecting stories: is it research? Is it good research? Preliminary guidance based on a Delphi study. Med Educ 2008;42:242-7. http://dx.doi.org/10.1111/j.1365-2923.2007.02956.x.
- Hart L, Bourchier S, Jorm A, Kanowski L, Kingston A, Stanley D, et al. Development of mental health first aid guidelines for Aboriginal and Torres Strait Islander people experiencing problems with substance use: a Delphi study. BMC Psychiatry 2010;10. http://dx.doi.org/10.1186/1471-244X-10-78.
- Holliday C, Robotin M. The Delphi process: a solution for reviewing novel grant applications. Int J Gen Med 2010;3:225-30. http://dx.doi.org/10.2147/IJGM.S11117.
- Pye J, Greenhalgh T. First aid kits for recreational dive boats: a Delphi study. Travel Med Infect Dis 2010;8:311-17. http://dx.doi.org/10.1016/j.tmaid.2010.07.001.
- Keeney S, Hasson F, McKenna H. Consulting the oracle: ten lessons from using the Delphi technique in nursing research. J Adv Nurs 2006;53:205-12. http://dx.doi.org/10.1111/j.1365-2648.2006.03716.x.
- Greenhalgh T, Peacock R. Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. BMJ 2005;331:1064-5. http://dx.doi.org/10.1136/bmj.38636.593461.68.
- MacDermid J, Brooks D, Solway S, Switzer-McIntyre S, Brosseau L, Graham I. Reliability and validity of the AGREE instrument used by physical therapists in assessment of clinical practice guidelines. BMC Health Serv Res 2005;5. http://dx.doi.org/10.1186/1472-6963-5-18.
- Wong G, Greenhalgh T, Pawson R. Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ 2010;10. http://dx.doi.org/10.1186/1472-6920-10-12.
- Greenhalgh T, Toon P, Russell J, Wong G, Plumb L, Macfarlane F. Transferability of principles of evidence based medicine to improve educational quality: systematic review and case study of an online course in primary health care. BMJ 2003;326:142-5. http://dx.doi.org/10.1136/bmj.326.7381.142.
- Wong G, Greenhalgh T, Russell J, Boynton P, Toon P. Putting your course on the web: lessons from a case study and systematic literature review. Med Educ 2003;37:1020-3. http://dx.doi.org/10.1046/j.1365-2923.2003.01673.x.
- Russell J, Greenhalgh T, Boynton P, Rigby M. Soft networks for bridging the gap between research and practice: illuminative evaluation of CHAIN. BMJ 2004;328. http://dx.doi.org/10.1136/bmj.328.7449.1174.
- Floyd M, Pilling D, Garner K, Barrett P. Vocational rehabilitation: what works and in what circumstances. Int J Rehabil Res 2004;27:99-103. http://dx.doi.org/10.1097/01.mrr.0000127638.09376.cd.
- Connelly J, Duaso M, Butler G. A systematic review of controlled trials of interventions to prevent childhood obesity and overweight: a realistic synthesis of the evidence. Public Health 2007;121:510-17. http://dx.doi.org/10.1016/j.puhe.2006.11.015.
- McCormack B, Dewar B, Wright J, Garbett R, Harvey G, Ballantine K. A Realist Synthesis of Evidence Relating to Practice Development: Final Report to NHS Education for Scotland and NHS Quality Improvement Scotland. Edinburgh: NHS Education for Scotland; 2006.
- Daykin N, Evans D, Petsoulas C, Sayers A. Evaluating the impact of patient and public involvement initiatives on UK health services: a systematic review. Evid Policy 2007;3:47-65. http://dx.doi.org/10.1332/174426407779702201.
- Birnik A, Bowman C. Marketing mix standardization in multinational corporations: a review of the evidence. Int J Manag Rev 2007;9:303-24. http://dx.doi.org/10.1111/j.1468-2370.2007.00213.x.
- Dieleman M, Gerretsen B, van der Wilt G. Human resource management interventions to improve health workers’ performance in low and middle income countries: a realist review. Health Res Policy Syst 2009;7. http://dx.doi.org/10.1186/1478-4505-7-7.
- Jackson L, Langille L, Lyons R, Hughes J, Martin D, Winstanley V. Does moving from a high-poverty to lower-poverty neighborhood improve mental health? A realist review of ‘Moving to Opportunity’. Health Place 2009;15:961-70. http://dx.doi.org/10.1016/j.healthplace.2009.03.003.
- Wakerman J, Humphreys J, Wells R, Kuipers P, Entwistle P, Jones J. Primary health care delivery models in rural and remote Australia – a systematic review. BMC Health Serv Res 2008;8. http://dx.doi.org/10.1186/1472-6963-8-276.
- Meyer B, Haywood N, Sachdev D, Faraday S. Independent Learning Literature Review (Research Report DCSF-RR051). Nottingham: DCSF Publications; 2008.
- Kane S, Gerretsen B, Scherpbier R, Dal Poz M, Dieleman M. A realist synthesis of randomised control trials involving use of community health workers for delivering child health interventions in low and middle income countries. BMC Health Serv Res 2010;10. http://dx.doi.org/10.1186/1472-6963-10-286.
- O’Campo P, Kirst M, Schaefer-McDaniel N, Firestone M, Scott A, McShane K. Community-based services for homeless adults experiencing concurrent mental health and substance use disorders: a realist approach to synthesizing evidence. J Urban Health 2009;86:965-89. http://dx.doi.org/10.1007/s11524-009-9392-1.
- Waddington H, Snilsveit B, White H, Fewtrell L. Water, Sanitation and Hygiene Interventions to Combat Childhood Diarrhoea in Developing Countries. New Delhi: International Initiative for Impact Evaluation; 2009.
- King E, Samii C, Snilstveit B. Interventions to Promote Social Cohesion in Sub-Saharan Africa. New Delhi: International Initiative for Impact Evaluation; 2010.
- Leeman J, Chang Y, Lee E, Voils C, Crandell J, Sandelowski M. Implementation of antiretroviral therapy adherence interventions: a realist synthesis of evidence. J Adv Nurs 2010;66:1915-30. http://dx.doi.org/10.1111/j.1365-2648.2010.05360.x.
- Mazzocato P, Savage C, Brommels M, Aronsson H, Thor J. Lean thinking in healthcare: a realist review of the literature. Qual Saf Health Care 2010;19:376-82. http://dx.doi.org/10.1136/qshc.2009.037986.
- Walshe C, Luker K. District nurses’ role in palliative care provision: a realist review. Int J Nurs Stud 2010;47:1167-83. http://dx.doi.org/10.1016/j.ijnurstu.2010.04.006.
- McMahon T. A Realist Review of Evidence to Guide Targeted Approaches to HIV AIDS Prevention Among Immigrants Living in High-Income Countries; 2010.
- Ekeland AG, Bowes A, Flottorp S. Effectiveness of telemedicine: a systematic review of reviews. Int J Med Inform 2010;79:736-71. http://dx.doi.org/10.1016/j.ijmedinf.2010.08.006.
- Javanparast S, Ward P, Young G, Wilson C, Carter S, Misan G, et al. How equitable are colorectal cancer screening programs which include FOBTs? A review of qualitative and quantitative studies. Prev Med 2010;50:165-72. http://dx.doi.org/10.1016/j.ypmed.2010.02.003.
- Morgan G. Evidence-based health policy: a preliminary systematic review. Health Educ J 2010;69:43-7. http://dx.doi.org/10.1177/0017896910363328.
- McLean AM, Koppang J. Behavioral caregiving for adults with traumatic brain injury living in nursing homes: developing a practice model. J Theory Constr Test 2010;14:17-22.
- Ware V-A, Gronda H, Vitis L. Addressing Locational Disadvantage Effectively. Melbourne, Australia: Australian Housing and Urban Research Institute Research Synthesis Service; 2010.
- Dieleman M, Kane S, Zwanikken P, Gerretsen B. Realist Review and Synthesis of Retention Studies for Health Workers in Rural and Remote Areas. Geneva: World Health Organization; 2011.
- Wong G, Pawson R, Owen L. Policy guidance on threats to legislative interventions in public health: a realist synthesis. BMC Public Health 2011;11. http://dx.doi.org/10.1186/1471-2458-11-222.
- O’Campo P, Kirst M, Tsamis C, Chambers C, Ahmad F. Implementing successful intimate partner violence screening programs in health care settings: evidence generated from a realist-informed systematic review. Soc Sci Med 2011;72:855-66. http://dx.doi.org/10.1016/j.socscimed.2010.12.019.
- Carr S, Lhussier M, Forster N, Geddes L, Deane K, Pennington M, et al. An evidence synthesis of qualitative and quantitative research on component intervention techniques, effectiveness, cost-effectiveness, equity and acceptability of different versions of health-related lifestyle advisor role in improving health. Health Technol Assess 2011;15. http://dx.doi.org/10.3310/hta15090.
- Davies JK, Sherriff N. The gradient in health inequalities among families and children: a review of evaluation frameworks. Health Policy 2011;101:1-10. http://dx.doi.org/10.1016/j.healthpol.2010.09.015.
- Gunawardena I. Effectiveness of the geriatric day hospital – a realist review. Rev Clin Gerontol 2011;21:267-9.
- Harris J, Kearley K, Heneghan C, Meats E, Roberts N, Perera R, et al. Are journal clubs effective in supporting evidence-based decision making? A systematic review. BEME Guide No. 16. Med Teach 2011;33:9-23. http://dx.doi.org/10.3109/0142159X.2011.530321.
- Jacobs S, Ashcroft D, Hassell K. Conducting a realist review of a complex concept in the pharmacy practice literature: methodological issues. Int J Pharm Pract 2010;18:1-2.
- Price S, Long AF, Godfrey M, Thomas KJ. Getting inside acupuncture trials – exploring intervention theory and rationale. BMC Complement Altern Med 2011;11. http://dx.doi.org/10.1186/1472-6882-11-22.
- Toohey A, Rock M. Unleashing their potential: a critical realist scoping review of the influence of dogs on physical activity for dog-owners and non-owners. Int J Behav Nutr Phys Act 2011;8. http://dx.doi.org/10.1186/1479-5868-8-46.
- Vassilev I, Rogers A, Sanders C, Kennedy A, Blickem C, Protheroe J, et al. Social networks, social capital and chronic illness self-management: a realist review. Chronic Illn 2011;7:60-86. http://dx.doi.org/10.1177/1742395310383338.
- Masuda J, Zupancic T, Poland B, Cole D. Environmental health and vulnerable populations in Canada: mapping an integrated equity-focused research agenda. Can Geogr 2008;52:427-50. http://dx.doi.org/10.1111/j.1541-0064.2008.00223.x.
- Greenhalgh T, Potts HW, Wong G, Bark P, Swinglehurst D. Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method. Milbank Q 2009;87:729-88. http://dx.doi.org/10.1111/j.1468-0009.2009.00578.x.
- Addis S, Davies M, Greene G, MacBride-Stewart S, Shepherd M. The health, social care and housing needs of lesbian, gay, bisexual and transgender older people: a review of the literature. Health Soc Care Community 2009;17:647-58. http://dx.doi.org/10.1111/j.1365-2524.2009.00866.x.
- Collins P, Hayes M. The role of urban municipal governments in reducing health inequities: a meta-narrative mapping analysis. Int J Equity Health 2010;9. http://dx.doi.org/10.1186/1475-9276-9-13.
- Contandriopoulos D, Lemire M, Denis J-L, Tremblay E. Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature. Milbank Q 2010;88:444-83. http://dx.doi.org/10.1111/j.1468-0009.2010.00608.x.
- Greenhalgh T, Heath I. Measuring quality in the therapeutic relationship Part 1: objective approaches. Qual Saf Health Care 2010;19:475-8. http://dx.doi.org/10.1136/qshc.2010.043364.
- Greenhalgh T, Heath I. Measuring quality in the therapeutic relationship Part 2: subjective approaches. Qual Saf Health Care 2010;19:479-83. http://dx.doi.org/10.1136/qshc.2010.043372.
- Kitson A, Conroy T, Wengstrom Y, Profetto-McGrath J, Robertson-Malt S. Defining the fundamentals of care. Int J Nurs Pract 2010;16:423-34. http://dx.doi.org/10.1111/j.1440-172X.2010.01861.x.
- Gagliardi A, Brouwers M, Palda V, Lemieux-Charles L, Grimshaw J. How can we improve guideline use? A conceptual framework of implementability. Implement Sci 2011;6. http://dx.doi.org/10.1186/1748-5908-6-26.
- Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: realist syntheses. BMC Med 2013;11. http://dx.doi.org/10.1186/1741-7015-11-21.
- Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: realist syntheses. J Adv Nurs 2013;69:1005-22. http://dx.doi.org/10.1111/jan.12095.
- Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: meta-narrative reviews. BMC Med 2013;11. http://dx.doi.org/10.1186/1741-7015-11-20.
- Wong G, Greenhalgh T, Westhorp G, Pawson R. RAMESES publication standards: meta-narrative reviews. J Adv Nurs 2013;69:987-1004. http://dx.doi.org/10.1111/jan.12092.
- Wong G, Westhorp G, Greenhalgh T, Pawson R. The RAMESES Project: Project Outputs. 2013. www.ramesesproject.org/index.php?pr=Project_outputs (accessed 19 August 2014).
- Pawson R, Manzano-Santaella A. A realist diagnostic workshop. Evaluation 2012;18:176-91. http://dx.doi.org/10.1177/1356389012440912.
- Pawson R. The Science of Evaluation: a Realist Manifesto. London: Sage; 2013.
- Eccles M, Armstrong D, Baker R, Cleary K, Davies H, Davies S, et al. An implementation research agenda. Implement Sci 2009;4. http://dx.doi.org/10.1186/1748-5908-4-18.
- Gough D, Oliver S, Thomas J. An Introduction to Systematic Reviews. London: Sage; 2012.
- Pluye P, Hong Q. Combining the power of stories and the power of numbers: mixed methods research and mixed studies reviews. Annu Rev Public Health 2014;35:29-45. http://dx.doi.org/10.1146/annurev-publhealth-032013-182440.
- Greenhalgh T, Wong G, Westhorp G, Pawson R. Protocol – Realist and Meta-narrative Evidence Synthesis: Evolving Standards (RAMESES). BMC Med Res Methodol 2011;11. http://dx.doi.org/10.1186/1471-2288-11-115.
Appendix 1 Realist and meta-narrative reviews identified from initial exploratory searches
Connelly J, Duaso M, Butler G. A systematic review of controlled trials of interventions to prevent childhood obesity and overweight: a realistic synthesis of the evidence. Public Health 2007;121:510–17.
Dieleman M, Gerretsen B, van der Wilt G. Human resource management interventions to improve health workers’ performance in low and middle income countries: a realist review. Health Res Policy Syst 2009;7.
Dieleman M, Kane S, Zwanikken P, Gerretsen B. Realist Review and Synthesis of Retention Studies for Health Workers in Rural and Remote Areas. Geneva: World Health Organization; 2011.
Greenhalgh T, Kristjansson E, Robinson V. Realist review to understand the efficacy of school feeding programmes. BMJ 2007;335:858–61.
Jackson L, Langille L, Lyons R, Hughes J, Martin D, Winstanley V. Does moving from a high-poverty to lower-poverty neighborhood improve mental health? A realist review of ‘Moving to Opportunity’. Health Place 2009;15:961–70.
Kane S, Gerretsen B, Scherpbier R, Dal Poz M, Dieleman M. A realist synthesis of randomised control trials involving use of community health workers for delivering child health interventions in low and middle income countries. BMC Health Serv Res 2010;10.
King E, Samii C, Snilstveit B. Interventions to Promote Social Cohesion in Sub-Saharan Africa. New Delhi: International Initiative for Impact Evaluation; 2010.
Leeman J, Chang Y, Lee E, Voils C, Crandell J, Sandelowski M. Implementation of antiretroviral therapy adherence interventions: a realist synthesis of evidence. J Adv Nurs 2010;66:1915–30.
McCormack B, Dewar B, Wright J, Garbett R, Harvey G, Ballantine K. A Realist Synthesis of Evidence Relating to Practice Development: Final Report to NHS Education for Scotland and NHS Quality Improvement Scotland. Edinburgh: NHS Education for Scotland; 2006.
O’Campo P, Kirst M, Schaefer-McDaniel N, Firestone M, Scott A, McShane K. Community-based services for homeless adults experiencing concurrent mental health and substance use disorders: a realist approach to synthesizing evidence. J Urban Health 2009;86:965–89.
Waddington H, Snilsveit B, White H, Fewtrell L. Water, Sanitation and Hygiene Interventions to Combat Childhood Diarrhoea in Developing Countries. New Delhi: International Initiative for Impact Evaluation; 2009.
Wong G, Greenhalgh T, Pawson R. Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ 2010;10:12.
Wong G, Pawson R, Owen L. Policy guidance on threats to legislative interventions in public health: a realist synthesis. BMC Public Health 2011;11.
Appendix 2 Briefing document for realist and meta-narrative review Delphi panel
Appendix 3 ‘Paper’ version of round 2 online Delphi panel survey for realist reviews
Appendix 4 ‘Paper’ version of round 3 online Delphi panel survey for realist reviews
Appendix 5 ‘Paper’ version of round 2 online Delphi panel survey for meta-narrative reviews
Appendix 6 ‘Paper’ version of round 3 online Delphi panel survey for meta-narrative reviews
Appendix 7 Notes on a teleconference with a review team to which the project team provided methodological support
Teleconference with Durham Health Services and Delivery Research review team
Date
30 October 2012.
Time
10.00 a.m. to 11.10 a.m.
Participants
Geoff.
Jan.
Madeline.
Neill.
Purpose of meeting was to get feedback from review team in Durham about:
- process of learning about realist reviews
- concepts that were easy/harder to grasp in realist review.
General comments
Felt that the review method had been helpful, as it enables reviewers to learn more about a topic than they might get from a Cochrane review.
BUT it required the review team to be:
- engaged
- prepared to unlearn and relearn new things (may act to inhibit uptake of method?)
- comfortable at the beginning with not knowing where you are heading
- prepared to read and reread the literature, engaging with it more deeply than just (for example) skim reading – ‘takes time to make connections . . .’
- aware that greater clarity comes from immersing yourself in the literature, which then helps with knowing where to head/go/change – ‘. . . saw things you never saw in other research methods . . .’
The review was much harder work than any other review they had done. ALSO, to make progress the team had to have LOTS of meetings. The review was more time consuming and labour intensive than they had anticipated.
Specific challenges
Consensus that worked examples were the most helpful way to learn.
All of the Durham team praised GWo’s commitment, willingness and clarity of training.
Suggestions for areas that need specific attention:
Clarifying terms – C, M and O, programme theory and middle-range theory, and the relationship of an intervention to CMO.
Focusing review – team felt that they had a huge topic to cover in a short period of time and so may not have done the subject matter ‘justice’.
‘Blueprint’/template – some members of the team felt that having a template of what to do might help, but there was also an appreciation that realist review is an iterative method.
To help some learn, a ‘quick start’ style of guide covering the main concepts might be helpful.
Searching and inclusion – when does the systematic searching stop and realist searching start? Issue was more about what studies/documents to include. The review team understood the concept of relevance, but found that they could only resolve this with lots of discussion.
Realist logic – team members were worried that they might not have got realist logic. Getting feedback from trainer helped.
Analysis and synthesis
- Having a worked example of this that traces the ‘journey’ from a piece of data → inference → theory would help.
- Explaining that C, M and O may change over time and depending on which outcome is important – again, a worked example would help.
- Explaining the need to change the level of abstraction of the analysis – go deep and then back to more abstract – again, a worked example would help.
Reporting
Huge tension here between wanting to report all that they found and wanting to provide a document that might be relevant to policy- and decision-makers, especially ‘coal face’ managers. Helped by context and content expertise, and by thinking like a manager.
In the peer review of their report, the team was asked by one peer reviewer to provide minute details of each CMO. The team agreed on the need for transparency, but felt that worked examples of how to strike this balance over the amount of detail would help.
Comment about published reviews from the team – in wanting to learn about realist review they turned to the published literature and found it full of examples that confused rather than helped. GWo explained the issue of ‘fake handbags’ (reviews which claimed to be realist reviews but were in fact not).
Team suggested that it is useful to point readers towards good examples, perhaps by focusing on the positives – e.g. this is a really good example of . . .
GWo 30 October 2012.
Appendix 8 Notes from the realist review training workshop held at Queen Mary University of London in March 2011
Advancing Realist Research Conference
Date
25 March 2011.
Venue
G O Jones Room, Queen Mary University of London, London, UK.
Participants
Participant | Affiliation |
---|---|
Rob Anderson | Peninsula Medical School, UK |
David Baker | Dartmouth College, USA |
Andrew Booth | University of Sheffield, UK |
Madeline Carter | University of Durham, UK |
Steve Dewar | Marie Curie Cancer Care, UK |
Marjolein Dieleman | Royal Tropical Institute, the Netherlands |
Carole Doherty | University of Surrey, UK |
Tim Dornan | Maastricht University, the Netherlands |
Ruth Garside | Peninsula Medical School, UK |
Barend Gerretsen | Royal Tropical Institute, the Netherlands |
Trish Greenhalgh | Queen Mary University of London, UK |
Sacha Harris | Imperial College, UK |
Andrea Herepath | Cardiff University, UK |
Roger Kneebone | Imperial College, UK |
Patricia Lanter | Dartmouth College, USA |
Bruno Marchal | Institute of Tropical Medicine, Belgium |
Ana Manzano-Santaella | University of Leeds, UK |
Douglas Noble | Queen Mary University of London, UK |
Ray Pawson | University of Leeds, UK |
Mark Pearson | Peninsula Medical School, UK |
Birte Snilstveit | 3ie, UK |
Charitini Stavropoulou | University of Surrey, UK |
Katherine Stevenson | Jönköping University, Sweden |
Neill Thompson | University of Durham, UK |
Hugh Waddington | 3ie, UK |
Rebecca Walwyn | University of Leeds, UK |
Gill Westhorp | Community Matters, Australia |
Geoff Wong | University College London, UK |
Feedback from sessions
Methods 2: introduction to RAMESES (Realist and Meta-narrative Evidence Synthesis: Evolving Standards)
Participants were presented with information on the RAMESES project and asked what they would like from it.
Guidance/standards:
- Protocols for RS needed – consensus on this.
- Explain how RS fits in with other review methods.
- Guidance needs to establish what counts as INTERNAL and EXTERNAL validity.
- Standards set should be broad enough to be suitable for ‘all’ purposes of RS – possibly principles based and not too ‘rigid’.
- Reviewers using RS should understand:
  - realist ontology
  - realist theory of causation.
- Guidance/standards must be useful to FUNDERS/REVIEWERS/RESEARCHERS.
Methodological:
- In general the HOW TO do X is a big problem in RS – tools needed.
- Glossary of terms/concepts:
  - Mechanism.
  - Programme theory.
  - Middle-range theory.
  - Context.
  - Programme/intervention.
  - Policy.
- Relationship between RS concepts.
- How to ensure transparency in a RS.
- How to write a RS protocol.
- When should RS be used?
- What can it be used for (e.g. just to understand policy or in other circumstances)?
- How to focus a RS so that it is ‘do-able’.
- How to select/develop programme theory.
- How do you know what studies to include?
- How to pull out context from included studies.
- How do you analyse CMOs?
SYNTHESIS: group discussion
In small groups, participants were asked to try to map out the relationship between:
- programme theory
- mechanism
- context.
A summary of the main points
- These concepts were hard to define and distinguish, and it was not clear to the participants whether definitions would be relative or absolute. How the concepts related to each other was also not clear. In addition, these concepts (e.g. programme theory/logic models) are not unique to RS, which added to the confusion over definitions.
- Some mentioned that the concepts might be better thought of as ‘sensitising principles’ and that precise definitions may be neither necessary nor achievable. It was raised that some people may need a precise definition in order to be able to use a concept – might this apply more to novices?
- The different way of thinking (about the world) needed to undertake a RS might mean some will struggle.
- Specific points discussed in the session:
  - Theories that are important are the ones that have a bearing on the question of causality.
  - Context:
    - Context pre-exists the intervention. It can have two ‘states’ – at the beginning it is everything that has a bearing on X; at the end it is everything that actually did have a bearing on X. It may also be thought of as ‘context to describe’ and ‘context to explain’.
    - Contexts are defined in relation to a particular mechanism, and conceptualising them this way helps in working out middle-range theory.
  - Mechanisms:
    - Mechanism is what is going through a person’s head.
    - Mechanisms operate at different levels (e.g. individual psychology, group dynamics, etc.). Mechanisms should be anchored to the outcome (same unit of analysis).
    - Mechanisms need to be distinguished from interventional modality/strategy.
- In terms of taking things forward, ideas include:
  - Seminar series (e.g. phone, Adobe Connect mediated).
  - Reading list/materials.
  - From RAMESES, try to put all RS reviews online as examples ‘next to’ standards – people can decide for themselves.
  - Bearing in mind that different people at different stages need different things!
Appendix 9 Realist synthesis: Realist And Meta-narrative Evidence Syntheses – Evolving Standards (RAMESES) training materials
Appendix 10 Training materials for meta-narrative reviews
Appendix 11 List of all members of the online Delphi panels
Dave Baker, Sinai Hospital of Baltimore (Baltimore, MD, USA).
Marcello Bertotti, University of East London (London, UK).
Allan Best, InSource (Vancouver, BC, Canada).
Margaret Cargo, University of South Australia (Adelaide, SA, Australia).
Simon Carroll, University of Victoria (Victoria, BC, Canada).
Colleen Davison, Queen’s University (Kingston, ON, Canada).
Marjolein Dieleman, Royal Tropical Institute (Amsterdam, the Netherlands).
Tim Dornan, Maastricht University (Maastricht, the Netherlands).
Ruth Garside, Peninsula College of Medicine and Dentistry (Exeter, UK).
Bradford Gray, Milbank Quarterly (New York, NY, USA).
Joanne Greenhalgh, University of Leeds (Leeds, UK).
Lois Jackson, Dalhousie University (Halifax, NS, Canada).
Justin Jagosh, McGill University (Montreal, QC, Canada).
Monika Kastner, University of Toronto (Toronto, ON, Canada).
James Lamerton, Sunshine Coast Division of General Practice (Cotton Tree, QLD, Australia).
Fraser MacFarlane, Queen Mary, University of London (London, UK).
Bruno Marchal, Institute of Tropical Medicine (Antwerp, Belgium).
Tracey McConnell, Queen’s University (Belfast, UK).
Gemma Moss, Institute of Education (London, UK).
Douglas Noble, Queen Mary, University of London (London, UK).
Patricia O’Campo, University of Toronto (Toronto, ON, Canada).
Mark Pearson, Peninsula College of Medicine and Dentistry (Exeter, UK).
Pierre Pluye, McGill University (Montreal, QC, Canada).
Henry Potts, University College London (London, UK).
Barbara Riley, University of Waterloo (Waterloo, ON, Canada).
Glenn Robert, King’s College London (London, UK).
Jessie Saul, North American Research & Analysis, Inc. (Faribault, MN, USA).
Paul Shekelle, RAND Corporation (Santa Monica, CA, USA).
Neale Smith, University of British Columbia (Vancouver, BC, Canada).
Sanjeev Sridharan, University of Toronto (Toronto, ON, Canada).
Deborah Swinglehurst, Queen Mary, University of London (London, UK).
Nick Tilley, University College London (London, UK).
Kieran Walshe, University of Manchester (Manchester, UK).
All the RAMESES Project team members were also members of the Delphi panel.
Appendix 12 Project protocol
Project protocol (PDF download)
© 2011 Greenhalgh et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution and reproduction in any medium, provided the original work is properly cited.
List of abbreviations
- AGREE – Appraisal of Guidelines for Research and Evaluation
- C – context
- CINAHL – Cumulative Index to Nursing and Allied Health Literature
- CMO – context, mechanism and outcome
- CONSORT – Consolidated Standards of Reporting Trials
- EQUATOR – Enhancing the QUAlity and Transparency Of health Research
- ERIC – Education Resources Information Center
- HSDR – Health Services and Delivery Research
- M – mechanism
- NIHR – National Institute for Health Research
- O – outcome
- PRISMA – Preferred Reporting Items for Systematic Reviews and Meta-Analyses
- RAMESES – Realist And Meta-narrative Evidence Syntheses – Evolving Standards
- RS – realist synthesis (or review)
- SCI – Science Citation Index
- SQUIRE – Standards for Quality Improvement Reporting Excellence
- SSCI – Social Science Citation Index