User-centred narratives and quality improvement
Patient experience is defined as one of three components of quality in healthcare, alongside patient safety and clinical effectiveness. Improving people’s experience of healthcare has been highlighted as a priority for successive governments in the United Kingdom [1]. A number of specifically user-centred quality improvement approaches drawing on patient experience data have been developed, such as Experience-Based Co-Design (EBCD) and Patient and Family Centred Care [2,3,4].
Despite a substantial amount of evidence about what matters to people about their care, until recently organisational initiatives have commonly focused on collecting more patient experience data (typically through surveys) and measuring performance, rather than using the data for quality improvement [5]. This situation is gradually changing, but clinical staff keen to use patient experience data for improvement may lack organisational support to do so. They need expert help in sifting through such data to identify themes and priorities for action, and in implementing quality improvement [6, 7]. This may be particularly true of narrative data. As Martin et al. [8] note, making sense of quantitative data is challenging enough; working out how to turn the ‘untamed richness’ of ‘soft’ forms of intelligence such as narrative and observational data into action for quality improvement may be even more daunting.
Bringing service user partners into the analysis process may be one way to help achieve this. We involved users in a secondary data analysis project which aimed to develop new resources to support EBCD.
Patient and public involvement and data analysis
Patient and public involvement (PPI) in health research is a well-established principle, meaning that research is conducted with or by users, rather than to, for or about them. Service user researchers, who collect and sometimes analyse research data, are more common in some research fields than others, notably mental health. More typically, however, PPI has been limited to advising on research questions and research design, leaving professional researchers to complete data collection and analysis. PPI in data analysis is perhaps one of the most challenging and least well explored aspects of involvement. Analysing patient stories may appear more intuitive and approachable than, say, quantitative analysis such as logistic regression or mathematical modelling. At the same time, both the volume of the data generated and the need for the analyst to be alert to theory as well as to patterns in the data can be hurdles.
There are comparatively few articles documenting the process of, and rationale for, service user involvement in data analysis. A systematic review [9] found many examples of lay people becoming involved in the conduct, design or dissemination of research, but it was less common for them to be involved in its execution or translation. This echoes findings that ‘examples are few of participatory interpretation and analysis of data’ [10]. There are, however, some interesting examples of service user involvement in the analytic process, which both document the process of ‘doing qualitative analysis’ with lay partners and offer insights into its benefits and challenges. These challenges include how much training is required, and to what level of detail; the practical constraints and demands of reading through long transcripts; how to recruit user partners able to engage with the perspectives of the specific study population; and how to deal with the emotional impact of reading potentially difficult stories [11,12,13].
Whitmore [14] cautions against expecting lay partners to write for an academic audience (which could be ‘unfair and unrealistic’), and distinguishes this from involvement in interpretation and in illuminating the meaning of participants’ accounts. The knowledge and understanding that lay partners bring to the table have been argued to offer clear benefits, and to counter the interests, assumptions and perspectives which researchers themselves bring to analysis.
Jennings et al. [15] have recently reviewed different approaches to ‘collaborative data analysis’ within mental health and propose the following typology:
- Consultation (researchers conduct the analysis and present it to PPI partners for comment)
- Development (PPI partners help develop an initial coding framework, which is then applied by the researchers)
- Application (researchers develop a coding framework and then involve PPI partners in applying it to a set of transcripts)
- Development and application (PPI co-researchers are given extensive training in data analysis and are involved over time in both developing and refining the coding framework and applying it to all data; described as the ‘gold standard’ of collaborative data analysis)
The authors note that while studies using ‘consultation’ may appear to be the least democratic approach, they also ‘tended to have people with lived experience in their research team… [and] moved beyond the binary categorisation of researchers as academic or service users’ [15 p.4]. One example used ‘multiple coding’, with one clinical researcher, one psychologist and one service user researcher each independently coding focus group transcripts to identify outcomes for Cognitive Behavioural Therapy for psychosis [16]. The results showed a high degree of consensus between the analysts, but also new themes and ‘points of fracture or non-consensus’ (p. e96) which had to be resolved, in some cases by returning to individual research participants to clarify what they had meant.
An example of the gold standard ‘development and application’ approach is Cotterell [17]. To support his study of the needs of service users with life-limiting conditions, he held repeated interpretation and ‘theme generation’ sessions with a user panel over the course of several months. The researcher’s initial seven themes were revised to eight, only two of which (‘diagnosis’ and ‘relationships’) remained the same. Strikingly, the revised themes included more emotional categories than the original set, including ‘fear’, ‘anger/frustration’ and ‘grief’. Garfield et al. [18] describe a lighter-touch process of involving lay partners in the analysis of qualitative data, giving people a sub-set of transcripts and inviting them to come up with their own themes after brief training. These mapped closely onto the themes already identified in the researcher-led analysis, providing useful confirmation, but also yielded one new theme. The final output of the research was presented as a ‘synergy of perspectives’ that could not easily be separated.
Best et al. [19] tested the Participatory Theme Elicitation (PTE) method with a youth advisory panel of eight members. The research team selected 40 focus group data extracts, from a study of school-based physical activity, that ‘could be easily understood and interpreted as standalone statements’ (p.3). The young people involved were invited to sort these into thematic piles, which were then recorded and grouped by researchers, and the output was compared with the researchers’ own thematic analysis. Echoing Garfield [18], the authors found that ‘while PTE analysis was, for the most part, consistent with the researcher-led analysis, young people also identified new emerging thematic content’ (p.1). They note that approaches to involving people in analysis need to be accessible, and that some techniques, such as Byrne et al.’s [10] experimentation with the Voice-Centred Relational (VCR) method, are too elaborate and time-consuming in terms of training and analysis time. Byrne et al. themselves report that the teenagers involved in the VCR method – which involves repeated attentive reading of transcripts and the development of an emerging narrative – found it ‘difficult, tedious and time consuming’ (p.74).
Some researchers who have involved users in analysis have expressed concerns about lay partners potentially putting too much emphasis on their own experiences rather than analysing what emerges from the data [18]. Cotterell [17], however, records that his early concerns about ‘the ability of the group to remain ‘objective’ and to refrain from blurring interpretation of data with their own personal experience and concerns’ proved unfounded. He concludes that service users with similar concerns and experiences to those who are the focus of the study can bring a reflective lens to the process and a greater capacity for the analysis to truly ‘unpack the taken for granted’. One might add that researchers also bring their own biases about what they see in the data and what they regard as important.
Fisher [20] argues that the involvement of lay partners in analysis can uncover nuances which may not be obvious to a researcher. Disabled user researchers drew his attention to a single passage in one transcript which, for them, conveyed an important sense of how users feel judged and oppressed by benefit assessment panels, but to which he might not otherwise have given much weight. This demonstrated how insider sensitivity to the nuance of language can bring a level of understanding to the process that researchers may not attain, even after long immersion in a community. Similarly, Gillard et al. [21] suggest that service user involvement in data analysis added ‘expertise by experience’ to the process and changed the conventions of ‘doing research’ to some degree through the inclusion of challenging voices, as well as producing a more nuanced set of themes drawn from users’ differing points of reference.
What is experience-based co-design (EBCD)?
EBCD is a participatory action research approach to quality improvement based on active patient-staff partnership [2]. It has three phases: ‘discovery’, ‘co-design’ and ‘implementation’.
The discovery phase, which typically lasts around 6 months, involves interviews with staff; observations of a particular care pathway; and video-recorded interviews with local service users and family members about their experiences of care. These interviews are analysed by a researcher (who may be an academic or a healthcare professional) to identify ‘touchpoints’ – key moments of interaction between person and service where something could have been done better, or which exemplify good experience. Video clips showing these touchpoints are selected and edited into a ‘trigger film’. Examples of touchpoints might include an account of insensitive communication; a memorable act of kindness or emotional care; disturbance from noise or lighting on the ward; or particular efforts to ensure privacy and dignity.
In the co-design phase, the trigger film is shown first at a workshop for users and family members only, and then again at a joint workshop with healthcare staff. The aim of the film is to trigger discussion about local quality issues and help participants agree a set of improvement priorities, which are then addressed through small co-design working groups (involving users and staff as equal partners). The whole EBCD process typically takes 12–18 months to complete [22].
Although evaluations have shown EBCD to be effective, they have also identified the time and resources it takes as barriers to widespread adoption [22]. In a previous study [23] we therefore tested an ‘accelerated’ version of EBCD in two lung cancer services and two intensive care units, in which we replaced much of the discovery phase with interviews from existing national interview collections, held by the Health Experiences Research Group, University of Oxford [2] and disseminated on the Healthtalk.org website through a series of lay-friendly ‘topic summaries’. Thus, instead of trigger films made from local user interviews, we identified touchpoints from existing interviews collected around the country, and made films which could be used in multiple locations. However, in common with other EBCD projects, this did not include direct user involvement in analysis.
Rationale for the new project
Having demonstrated that trigger films developed from the national archive could have a similar effect to local films, and were ‘good enough’ to trigger co-design, we were prompted to consider whether further adaptations to the discovery phase were possible. The co-authors were awarded a grant for secondary data analysis from the Economic and Social Research Council (ES/L01338X/1) to make a series of new trigger films and to pursue two possible ways of streamlining the discovery phase:
1. What, if anything, would be lost if touchpoints for the trigger films were selected only from material already chosen for dissemination on Healthtalk (i.e. the topic summaries on the website), rather than by re-analysing full transcripts, as a way of further accelerating the process?
2. How could service users themselves best get involved in data analysis? Would they identify similar or different touchpoints to the researcher?
In this paper we focus particularly on the second question, describing the work we undertook with user partners in analysis workshops. We conclude with recommendations, developed with our workshop attendees, for taking the process forward, and with our own reflections on it.
As well as having a service user co-investigator on the team [GS], we organised two workshops with two service user panels to explore involvement in data analysis; one panel member [LB] has also co-authored this paper.