Availability of data and materials
Research Involvement and Engagement volume 5, Article number: 26 (2019)
Thank you for commenting on our paper. Firstly, we agree that it is possible for people to engage with ‘raw data’; the ‘gold standard’ approach referred to by Jennings et al. involves people engaging deeply with transcripts, to “achieve meaning at a deep, semantic level, and undertake extensive co-revision of themes, codes and frameworks as findings emerge. In essence, if the academic researchers are doing it, so are the PPI co-researchers.”
It is worth noting that in our case we were conducting a secondary analysis of existing data collected some time ago. While we had PPI partners involved throughout our secondary analysis project, by definition that group of people could not have been involved in the data collection. Thus we were looking for ways to involve people in exploring this previously collected data from a new angle. Nonetheless, as we noted in our paper, some of the people who got involved with our project were really enthusiastic about engaging directly with transcripts. It is good to hear of your positive experience of reading and engaging with selected transcripts, and we agree that this may be easier when a group of people have been involved in primary data collection, a point also made by Jennings et al.
However, we would also argue that it is important not to let the best be the enemy of the good. Sometimes there may be more pragmatic ways to approach user involvement in analysis which make good use of people’s time and insights, and potentially bring a wider range of perspectives to bear. If we involve only those people willing to read full transcripts we could miss important insights from a wider group of people who might like to contribute in a different way. A full qualitative research dataset – say 40 interviews of an hour and a half each – can easily run to 1500 pages of text. Even reading 1500 pages, let alone coding the content and relating it to previous literature and theory, is a substantial undertaking (and one which, incidentally, is often under-estimated by those unfamiliar with qualitative research). The idea that an analytic conversation might be a helpful alternative approach was developed with and by our PPI partners, including two PPI co-authors, and is not simply a researcher view. Offering individuals a choice of ways to inform analysis can be helpful for them and enhance diversity.
In your group, using Garfield’s approach, you each read a few selected transcripts, and then brought your insights to a discussion of themes with the researcher. This sounds very much like an analytic conversation, but with a starting point in individual readings rather than a group discussion of likely anticipated themes. This is of course another good way to stimulate analytic conversation, rooted in raw data but not expecting people to analyse a whole dataset. In fact one of us is doing something similar in a current project where there was PPI involvement from the beginning in shaping the proposal and the interview guide, and now towards the end we are sharing selected transcripts and inviting PPI reflections on the emerging themes which will inform the researcher’s coding and analysis.
We would note that we did not ask people to share with us their experiences as such; we asked them to propose touchpoints drawing on or informed by their own experience. These touchpoints were intended to be generic likely issues for us to look out for (such as ‘noise’ or ‘support after discharge’), and where service improvement efforts could be focussed. We think this bringing of personal experiential insights (in this case to inform analysis and the identification of improvement priorities) is part of the point of PPI, but that is very different from treating it as data.
There are different ways to approach user involvement in qualitative data analysis, depending on the nature of the topic, the type of project (e.g. secondary versus primary analysis), how applied or theoretical the study is, and the skills, preferences and interests of the people involved. As Jennings et al. note, limited time and financial/human resources are another factor, and in “most funded studies … compromises between quality and pragmatism are required”. We would suggest there is no one “right” way, but a plurality of good ways, and we all learn by openly sharing our experiences, our successes and our challenges.
Williams M, Etkind M, Husson F, Ogunleye D, Norton J. Comments on: Involving service users in the qualitative analysis of patient narratives to support healthcare policy improvement. Res Involv Engagem. 2019;5 https://doi.org/10.1186/s40900-019-0157-z.
Jennings H, Slade M, Bates P, Munday E, Toney R. Best practice framework for patient and public involvement (PPI) in collaborative data analysis of qualitative mental health research: methodology development and refinement. BMC Psychiatry. 2018;18:213. https://doi.org/10.1186/s12888-018-1794-8.
Garfield S, Jheeta S, Husson F, Jacklin A, Bischler A, Norton C, Franklin BD. Lay involvement in the analysis of qualitative data in health services research: a descriptive study. Res Involv Engagem. 2016;2:29. https://doi.org/10.1186/s40900-016-0041-z.
Thanks to all the people who attended our analysis workshops, and to the original interviewees whose interviews we worked with. Thanks also to the DIPEx Charity, which hosts the DIPEx website, for creating the series of trigger films. Dr. Sian Rees was a co-applicant on the original grant but has since retired. At the time of the work, LL was supported by NIHR Oxford Biomedical Research Centre.
Economic and Social Research Council (ES/L01338X/1).
The original interviews included in the secondary analysis were given approval by Berkshire Research Ethics Committee (09/H0505/66). Patients consented to be interviewed, and for their interviews to be a) disseminated online on Healthtalk and b) to be used additionally for secondary analysis, teaching and service improvement. This paper reports a patient and public involvement project, exploring involvement in qualitative data analysis and commenting on data already collected in the above interviews. University of Oxford institutional ethics guidance states that people who are “giving their views on research … do not count as “human participants” in the sense intended by CUREC’s [Central University Research Ethics Committee] policy. They are not giving you information about themselves, and the opinions they offer are not themselves the subject of research. You need not get ethical approval of your research if your contact with people is confined to this sort of interaction”.
People who took part in the workshops did so as patient and public involvement partners, and were paid an honorarium for their time. They were given written information about the purpose of the workshops as well as verbal explanation and training.
No personal data included.
LL and GR have both taught on quality improvement training programmes run by JC and the Point of Care Foundation. GR is one of the originators of EBCD. Other authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Locock, L., Kirkpatrick, S., Brading, L. et al. Response to “comments on: involving service users in the qualitative analysis of patient narratives to support healthcare quality improvement”. Res Involv Engagem 5, 26 (2019). https://doi.org/10.1186/s40900-019-0158-y