Community members’ experiences training as medical journal reviewers

Abstract

Purpose

Although medical research dissemination is intended to benefit members of society, few members of society actually participate in the process of publishing findings. This study shares findings about the experiences of community members (including patients and the public) who were trained as medical journal reviewers.

Methods

We analyzed data from two focus group interviews with community reviewers (N = 29) to identify themes in their experiences with the training program.

Results

Community members trained as journal reviewers appreciated learning from authors and funders about the context in which manuscript development and review occur, recognized the value of community member perspectives to science, and strengthened their critical thinking skills. Helpful training tools and strategies included glossaries of research terms, review guides, practice reviews, a supportive training team, and opportunities to work and learn collaboratively.

Conclusions

Training as a journal reviewer has a positive impact on participating community members. Programs training community members as journal reviewers should incorporate guest speakers well-versed in community-engaged research, group activities, a variety of training tools and materials, and highly supportive training teams.

Plain English summary

The findings of medical research are supposed to benefit society, but few members of society who are not specialists actually help publish findings. In this study we hoped to learn about the experiences of community members who were trained to be medical journal reviewers. We interviewed 29 of the 34 community members who were trained to learn what they liked and did not like about the training. We learned that community members appreciated learning about how journal articles get published. Learning from journal article authors was helpful, as was learning from people who work for organizations that finance the research. They also learned about how patient and community perspectives are important in science. They also said they strengthened their critical thinking skills. They mentioned training tools and strategies that helped or would have helped them. These included lists of research terms and their definitions, review guides, practicing reviews ahead of time, having supportive trainers, and working with and learning from each other. We conclude that training as a journal reviewer had a positive impact on the community members. We suggest that programs that want to train community members as journal reviewers should incorporate guest speakers who understand community-engaged research. They should also incorporate group activities and a variety of training tools and materials, and be very supportive.

Introduction

Clinicians or researchers typically review manuscripts submitted to medical journals, without input from patients or other community members. However, U.S.-based research on how patients and the public (here referred to as “community members”) contribute to other research phases suggests they provide a distinct perspective on research and contribute expertise on their communities [1, 2]. For example, community-based participatory research (CBPR) involves communities in selecting research topics, collecting data, and interpreting findings [3, 4]. Integrating community perspectives into research can lead to a stronger understanding of community priorities and to study designs adjusted to minimize participant burden and increase participant recruitment and retention [2, 5,6,7,8,9,10]. Because journal articles are seen as primarily academic, community members do not generally serve as journal manuscript reviewers. However, community members’ unique perspectives inform and broaden the scope of research dissemination activities, offering journal authors a point of view they might not otherwise consider [3, 11].

Physicians, researchers, or other scientific professionals are the primary editors and reviewers of medical journal manuscripts; community members are rarely involved in reviews. Community involvement tends to occur in early research phases (e.g., selecting research questions, developing study protocols), is less common in later phases (e.g., disseminating and implementing findings), and is rare in medical journal manuscript reviews [12]. We are aware of only two medical journals (both based in the United Kingdom) that include patients/community members as part of the review process (British Medical Journal and Research Involvement and Engagement) [13, 14]. Outside the U.S., we are not aware of efforts in other countries to include patients/community members in other parts of the scientific process. Some perceive that community members simply do not know enough about research to contribute meaningfully to the research enterprise, for example because they lack objective viewpoints and statistical training [3]. However, relying solely on scientific reviewers overlooks critical community perspectives [15, 16].

The NIH-funded (U54 mechanism) randomized controlled trial within which this study was embedded found that community members could be successfully trained to contribute meaningfully to disseminating research through their journal manuscript reviews, as compared to using only scientific reviewers [17]. The present study describes the experiences of the community members who were trained to review medical journal manuscripts as part of that trial. It is intended to identify best practices for community training that could be adapted to other CBPR efforts and to help establish training protocols for future community reviewing in other journals.

Methods

Project description

Community reviewers participated in person in weekly 90-min training sessions for 6 weeks in spring 2018 and fall 2019. The training topics are listed in Table 1. The training included didactic teaching and interactive, skill-building exercises to enhance writing and critical thinking skills. Trainees reflected on relevant personal experiences to help them identify values, beliefs, and biases they might bring to their reviews [18, 19]. The trainees practiced by reviewing manuscripts; the project team evaluated these practice reviews and suggested areas to improve. Trainees received $50 for each training session they attended.

Table 1 Community reviewers’ training topics

Design

The design was a qualitative case study using a social constructionist inquiry framework to learn about participants’ training experiences [20]. This framework was chosen because we were interested in learning about the participants’ experiences in their own words, as well as the meaning and value of the training.

Sample and data collection

A total of 34 trainees (Cohort 1, n = 24; Cohort 2, n = 10), including patients, community residents, and community organizations’ staff, participated in the training. We recruited new reviewers as previous reviewers left the project, which resulted in different cohort sizes. Community members living near Cleveland, Ohio, were recruited for the study through flyers posted at community centers and libraries. Eligibility criteria included being at least 18 years old, having at least a high school education, having computer access, being proficient in reading and writing English, demonstrating reading proficiency, and having experience with a common medical condition (e.g., stroke, hypertension, diabetes) as either a patient or caregiver. Please see Huml et al. [17] for additional details.

Survey

A brief three-question paper-and-pencil survey was created for this study and administered to all 34 participants at the beginning of the training. Questions asked participants to share their reasons for participating in the training and their confidence in research topics. The first question was intended to gauge the trainees’ confidence in different research topics (reading research articles; determining the research question; understanding the methods used; understanding tables and figures; identifying potential problems with the research), with “Not at all Confident,” “Somewhat Confident,” “Moderately Confident,” “Very Confident,” and “Completely Confident” as response options. The second question was intended to better understand how important they rated various reasons for their interest in reviewing research articles (learning how to write more effectively; helping authors improve their work; wanting to know what is new in the field of research). Response options were “Not at all Important,” “Slightly Important,” “Moderately Important,” “Very Important,” and “Extremely Important.” The third question was open-ended, asking participants to list any other reasons they were interested in reviewing research articles.

Focus group

A total of 29 of the 34 trainees (three men, 26 women) participated in one of two semi-structured focus groups, held in May 2018 (Cohort 1) and October 2019 (Cohort 2). Participants were told about the focus group ahead of time, and it was held during each cohort’s last training session. The sample included all trainees who were present on the last day of the cohort’s training. While trainees were free to leave before the focus group began, to our knowledge, none did so. The seven focus group questions asked trainees to describe their experiences with the training, what they learned, what parts of the training were most memorable and helpful, how their experiences in the training had affected them, what they would tell others interested in taking part in the training, and how the training could improve. The Cohort 1 focus group was intended to gather feedback for improving the training for Cohort 2 as well as to learn about the trainees’ perspectives on the training.

Procedures

An institutional review board approved all study activities. An external, Ph.D.-level social work researcher, unknown to the participants and with extensive experience in qualitative and community research, facilitated the focus groups; no other project staff were present. The focus groups were held in the trainees’ usual meeting room, ranged from 37 to 50 min, and were recorded using an MP3 recorder; a professional transcriptionist transcribed them. All trainees present on the days of the focus groups participated, and none declined. No compensation was offered for participating in the focus groups; trainees who attended the session were paid for attending regardless of whether they participated in the focus group.

Analysis

The outside evaluator conducted thematic analyses of the focus group interviews [21]. The focus group transcripts were examined iteratively and inductively using ATLAS.ti. Consistent with the social constructionist inquiry framework, we stayed as close to the direct quotes as possible as the analysis unfolded, seeking to locate similarities in training experiences. First, the evaluator created a summary document and shared it with the training team (four individuals) for two reasons: (1) to check the single evaluator/analyst’s interpretations, and (2) to quickly inform the team about improvements needed for future trainings. Next, direct quotes were identified as they related to the research question. The quotes were then examined for common patterns and grouped into categories, and the categories were combined and developed into themes. The two cohorts of data served as source triangulation: we reviewed each cohort’s data separately, compared the findings, and then combined them when we determined the patterns of responses were similar. We established trustworthiness through negative case analysis (in which the data are examined for evidence of inconsistency with larger themes) and through review of the findings by the larger project team (five individuals: four directly involved in the training and one external to it), who looked for inaccuracies, evidence of bias, and/or inconsistencies [20].

Results

Survey findings

A total of 34 participants completed the survey (see Tables 2, 3). Most trainees (more than three-quarters) reported participating because they wanted to know what is new in research or to help authors improve their work. More than three-quarters reported feeling confident reading research articles and determining research questions. Half reported they wanted to learn to write more effectively, and more than half reported being very or completely confident in identifying potential problems with research and in understanding tables, figures, and research methods. In open-ended questions, the trainees wrote that the training was a chance to “be more in tune with medical research,” to “help article authors see things from a different perspective from their own,” and “to provide community voice in academic research.” One said they hoped their input would help reduce health disparities, and another wrote: “I want authors of research articles to be more inclusive in their studies. Medical research …should benefit as many people as possible.” Overall, trainees saw the training as an educational opportunity that could have societal benefits.

Table 2 Initial questionnaire: confidence regarding research and motivation for participating in training (N = 34)
Table 3 Motivations for participating in training (N = 34)

Qualitative findings: lessons for training community reviewers

A total of 144 codes were generated, from which we derived three key lessons learned from the project: context, learning and time, and training tools.

Lesson 1: Provide context in which reviews occur

The first lesson we learned was that it is important to provide trainees with the context in which reviews take place. Context was provided by accentuating how community members can contribute to academic publishing and by including authors’ and funders’ perspectives in the training. Trainees said they learned how valuable their experiences, expertise, voices, and perspectives are to science. One said: “I like the importance of us wearing our different hats and bringing our own perspective into the research.” Another trainee said, “I never realized it that getting the community involved in [research] from behind the scenes were so important, because you know they asked us that when we review the manuscripts, we should review them from a community perspective.” With a better understanding of the context of reviews, trainees gained a new perspective on reading articles.

I never ever read them [before] with like what can be done or ‘What can I say to make this more relevant for the community?’ or to clarify the information. So in that sense, I feel empowered, …I actually have a say in what’s gonna turn out to be the final product. So maybe when I pick up that journal again, it’ll be more tailored and user-friendly.

Bringing in guest speakers, specifically authors and funders, was also helpful for contextualizing the research, manuscript submission, review, and editing process. One guest speaker, an author who shared her experiences in having articles rejected, was seen as “courageous” and “insightful,” and helped trainees better understand how community perspectives contribute to research. “Hearing from an author that they know there’s going to be comments… They’re expecting feedback. They’re expecting it to be not done” was useful. Other guests, including research funders who value community-engaged research [e.g., the Patient-Centered Outcomes Research Institute (PCORI)], allowed trainees to learn more holistically about community-engaged research. “That opened up even more doors to us.” The PCORI guest speaker was highlighted as making trainees feel comfortable through her humility and respect for their perspectives as community members.

Lesson 2: Emphasize opportunities for learning and use time carefully

Trainees felt they had learned a great deal from the training, including learning to read with a more critical eye and developing a greater awareness about health disparities. One said they felt the training, “made me a better community member, … I’m just very sensitive to what people are doing and the different populations that are being served and underserved…I just find this extremely valuable.”

Trainees said they now read articles more critically, looking for strengths and weaknesses, and can identify what research leaves out. One trainee said the skills differed from what they do in their daily lives. “You’re really looking at it with a very critical eye and asking a lot of questions, which typically isn’t asked of you in either everyday work or school.”

The trainees noted that learning how to review involved a learning curve. They said they spent substantial time working to understand the highly scientific and technical styles in which articles are written. One said, “the subject matter is not for a layperson… it …took three reads for me.” Others said they had to look up definitions or “had to decipher each word and then put the paragraphs back together again, ‘cause I’m not used to…the way this language is put together.” Because of the effort required, trainees most often suggested accurately informing potential trainees about the time required to conduct reviews and using training time carefully.

Let people know that this isn’t just a “Sit and read it. Sit down and write what you think, or where you’re coming from.” …I mean even if [you’re reviewing using] your frame of reference, you still have to do some research in order to have an intelligent response.

Trainees said the time they spent on homework beyond the in-person training time was more than expected (2–4 h per week), and the turnaround time for homework was also fast. They felt that, overall, the project demanded a “bigger commitment” than planned. The trainees also noted that some in-person training sessions ended early, and they felt the remaining time could have been better utilized by employing training strategies such as group work.

Lesson 3: Utilize a variety of training tools

Trainees emphasized how important the various tools used were to their experiences. These tools included the training team, guidelines, glossaries, and other materials provided. They recommended adding group interaction as an additional training tool.

The facilitators were considered key tools. Trainees described them as clear, knowledgeable, prepared, organized, patient, helpful, and kind, and felt their trainers “totally respected” their questions, even when the same question was asked repeatedly. “There was not one time that anybody was put down… we were asking questions that could be really obvious from their point of view, …they were very, very gracious.” Trainees said trainers responded to their needs and tailored the training to their knowledge base. “The facilitators, I thought, were wonderful. They were very patient with us. They made sure you understood. They asked us great questions.”

Trainees said that although the trainers were very helpful, the training’s expectations could have been clearer and training sessions more focused on how to dissect articles. The first cohort suggested providing a research terms glossary to quickly look up unfamiliar terms and reduce time spent researching phrases. “I went looking up the different types of research. …I need a glossary because this is extra.” Based on this feedback, the second cohort received a glossary including basic and complex statistical concepts which they reported was very helpful. Other useful tools included teaching strategies for critical reading, reading for specific keywords, and following a “bullet sheet” guiding trainees through their reviews. Trainees suggested future training should provide more information on how to read and interpret tables and charts. They also suggested getting some information on how to review before their first review to help them feel prepared, “instead of do a review and then learn afterwards what was expected.”

The trainees suggested maximizing group and collaborative opportunities for learning, suggesting that interacting more with fellow trainees would have helped them to learn others’ perspectives in conducting reviews. They suggested strategies and activities such as peer-to-peer conversations to discuss reviews, arranging tables and chairs so they could more easily interact with and help one another, and conducting a group review. Some suggested reviewing the previous cohort’s reviews and discussing them would have helped them begin to think about review structures and approaches. One trainee said that speaking informally with another trainee about a study they both read helped broaden their perspectives on the study.

The most important part of this training came outside of this room when I was talking to another participant …we had come up with different ways of looking at it, and it broadened my perspective. You know I was looking at it one way. …I realized that next time I see something, I have to look at it one way, then I’m gonna have to close my mind, walk away for a while, and then come back and say, “No. Now I have to look at it from a different perspective.”

In one session, the trainees reviewed one another’s reviews, which helped them feel a sense of community with each other and increased their confidence in their own reviews.

Finally, trainees highlighted the usefulness of an “open session” held late in the training, when trainees felt more prepared to articulate their questions. The trainers asked the trainees what they still wanted or needed to know before they commenced their independent reviews, allowing trainees to get clarification on remaining questions and see that other trainees had similar questions.

Discussion

Overall, community journal reviewer trainees viewed their training as extremely useful and valuable. Although trainees expressed some confidence in some aspects of research before the training, they also said they gained skills, including becoming better critical readers of research and becoming more sensitive to issues facing underserved populations. Trainees benefited from learning from one another and from people engaged in the writing and research process. Maximizing the effectiveness of training community members as journal reviewers requires contextualizing the review process (especially the community contribution), emphasizing learning, communicating about and using trainees’ time appropriately, and providing a prepared, knowledgeable, helpful, sensitive, respectful, and responsive training team. Future training should give trainees clear expectations of the time required, provide a variety of tools and guidelines for conducting reviews and for reading and interpreting tables and charts, and engage trainees in group work.

Including community perspectives in peer-reviewed academic research journal articles is a new and underused practice. Although this practice is not currently utilized even in articles that actively include or apply to community members, we found that community members are motivated to contribute to creating knowledge through research dissemination. Although they recognized the limits of their research knowledge (specifically with complex statistics), community members are eager to learn and offer useful feedback for applying academic knowledge in community practice. Our findings highlight important process changes that can make future community reviewer training programs more effective.

Framing research with community perspectives in mind can broaden the impact, reach, and practical application of research findings. Based on previous research, we assumed community members had unique perspectives reflecting their cultural backgrounds, interpretations of illness, and experiences with specific diseases, perspectives that differ from scientists’ in areas such as what research should be prioritized, which methods are used, and how results should be interpreted [3, 4, 22,23,24,25,26]. Consistent with previous literature, we found community reviewers contributed specific viewpoints on the research beyond the technical aspects. In the training, they indicated learning to value and appreciate their unique contributions while becoming more sensitive to, and engaged with, health disparities in their communities and beyond.

Our findings support the idea that community members can gain basic knowledge about research issues and procedures and an understanding of scientific language and statistical analyses, and that this knowledge can lead to their contributing unique and valuable insight to the existing literature. We found no evidence that our trainees had unrealistic expectations of medical research or lacked objectivity [3]; however, consistent with previous work, they valued research most relevant to them and/or their communities [27]. Utilizing community members’ perspectives can be useful because “researchers ‘don’t know what they don’t know’ until they involve patients/the public” [28]. Furthermore, training community members specifically equips them to take a more active part in the research, as compared with simply engaging with them or including them, symbolically, as a research partner or a member of an editorial board.

Limitations

One limitation is the potential breach of confidentiality in focus groups; however, our trainees simply offered their perspectives and experiences, and no questions were particularly sensitive [29]. The focus group structure allowed the trainees to build on each other’s answers, agreeing and/or offering contrasting perspectives. The interviewer was also external and unknown to them, and created an environment in which they could offer honest opinions about the training. Second, only one person was directly involved in the coding and analysis of the data, and although other team members reviewed the data as a check on the analysis, no formal analysis of coding reliability was conducted. One potential issue to keep in mind is that the trainees were overwhelmingly women-identified; thus, the perspectives presented here could underrepresent the perspectives and experiences of men. However, the original sample also was skewed in terms of gender, so our sample likely is representative of the original sample. It is unclear why the training was more appealing to women than to men; it may be that the recruitment sites were more frequented by women. Future trainings may consider recruiting and/or oversampling men to ensure their perspectives are also represented. Finally, it is worth noting that the trainees expressed confidence in their knowledge of research before the training. This might mean that the people interested in/drawn to participate in the study, and/or those excluded from the study after the reading proficiency screening, are not fully representative of “average” community members. Some level of interest in, and a basic level of confidence in or understanding of, research may be necessary for successful trainings.

Implications for community-engaged research

While efforts are increasingly being made to include community perspectives in research dissemination, and recent work attempts to provide helpful guides for new or non-scientifically trained reviewers [30, 31], our findings indicate that strong guidance and support in reviews are necessary. Support, advice, and resources have been noted as important in guidelines on how patients may become involved with research [30]. An important part of our trainees’ experience involved being supported by the training team and interacting with other reviewers/trainees. Our findings should transfer well to programs conducting similar training initiatives with community members. The training is replicable, and integrating community perspectives into research dissemination could be implemented and evaluated more effectively if such training programs were used more widely. Integrating community perspectives into published research, especially in journals publishing community-engaged research, is both important and necessary to ensure the utility of research efforts and to justify the expenditure of research funds based on perceived community benefit.

The project detailed here was part of a dissemination grant funded by NIH. While some funders encourage community engagement in research activities [32], others specifically require researchers to establish collaborations and/or partnerships with community members (such as the U54 mechanism) in conducting research and/or to share research findings with study participants [33, 34]. However, funders should more widely consider making the training of community reviewers a condition of funding, to support embedding community reviewers as a routine part of peer review during dissemination. This would allow the findings of this research to be directly implemented, expand this currently limited practice, and maximize the usefulness of research findings.

Conclusion

Training community members as journal reviewers is innovative and offers the potential to bring unique voices to the traditionally academically focused realm of journal article publishing. Trainees benefit from the training and are well prepared to review if given a structured curriculum, appropriate tools and strategies, and the support of a responsive institutional environment. Training programs should emphasize the learning opportunity, include authors’, editors’, and funders’ perspectives, use time carefully, employ effective trainers, and complement didactic training with interactive group work. Employing such strategies can increase the likelihood of community members’ voices being included in research dissemination and increase medical journal articles’ community relevance.

Availability of data and materials

Data are available on request.

References

  1. Israel BA, Coombe CM, Cheezum RR, Schulz AJ, McGranaghan RJ, Lichtenstein R, et al. Community-based participatory research: a capacity-building approach for policy advocacy aimed at eliminating health disparities. Am J Public Health. 2010;100:2094–102.

  2. Minkler M. Community-based research partnerships: challenges and opportunities. J Urban Health Bull N Y Acad Med. 2005;82:ii3-12.

  3. Caron-Flinterman FJ, Broerse JEW, Teerling J, Bunders JFG. Patients’ priorities concerning health research: the case of asthma and COPD research in the Netherlands. Health Expect Int J Public Particip Health Care Health Policy. 2005;8:253–63.

  4. Hewlett S, de Wit M, Richards P, Quest E, Hughes R, Heiberg T, et al. Patients and professionals as research partners: challenges, practicalities, and benefits. Arthritis Rheum. 2006;55:676–80.

  5. Israel BA, Schulz AJ, Parker EA, Becker AB. Review of community-based research: assessing partnership approaches to improve public health. Annu Rev Public Health. 1998;19:173–202.

  6. Minkler M, Wallerstein N. Community-Based Participatory Research for Health: From Process to Outcomes. Hoboken: Wiley; 2011.

  7. Minogue V, Boness J, Brown A, Girdlestone J. The impact of service user involvement in research. Int J Health Care Qual Assur Inc Leadersh Health Serv. 2005;18:103–12.

  8. Resnik DB, Kennedy CE. Balancing scientific and community interests in community-based participatory research. Acc Res. 2010;17:198–210.

  9. McInerney P, Cooke R. Patients’ involvement in improvement initiatives: a qualitative systematic review. JBI Database Syst Rev Implement Rep. 2015;13:232–90.

  10. Wallerstein NB, Duran B. Using community-based participatory research to address health disparities. Health Promot Pract. 2006;7:312–23.

  11. Johnson DS, Bush MT, Brandzel S, Wernli KJ. The patient voice in research-evolution of a role. Res Involv Engagem. 2016;2:6.

  12. Domecq JP, Prutsky G, Elraiyah T, Wang Z, Nabhan M, Shippee N, et al. Patient engagement in research: a systematic review. BMC Health Serv Res. 2014;14:89.

  13. Research Involvement and Engagement Reviewer Guidelines [Internet]. BioMed Cent. [cited 2023 Aug 8]. https://researchinvolvement.biomedcentral.com/about/reviewer-guidelines.

  14. The British Medical Journal. Guidance for BMJ Patient and Public Reviewers [Internet]. [cited 2023 Aug 8]. https://www.bmj.com/about-bmj/resources-reviewers/guidance-patient-reviewers.

  15. Fleurence RL, Forsythe LP, Lauer M, Rotter J, Ioannidis JPA, Beal A, et al. Engaging patients and stakeholders in research proposal review: the patient-centered outcomes research institute. Ann Intern Med. 2014;161:122–30.

  16. Forsythe LP, Ellis LE, Edmundson L, Sabharwal R, Rein A, Konopka K, et al. Patient and stakeholder engagement in the PCORI pilot projects: description and lessons learned. J Gen Intern Med. 2016;31:13–21.

  17. Huml AM, Albert JM, Beltran JM, Berg KA, Collins CC, Hood EN, et al. Community members as reviewers of medical journal manuscripts: a randomized controlled trial. J Gen Intern Med. 2022.

  18. Santiago HC. Visual mapping to enhance learning and critical thinking skills. 2011 [cited 2022 Feb 24]. https://www.semanticscholar.org/paper/Visual-Mapping-to-Enhance-Learning-and-Critical-Santiago/77486c9be78fb0052396e937cc324fd5a5d24e21.

  19. West DC, Pomeroy JR, Park JK, Gerstenberger EA, Sandoval J. Critical thinking in graduate medical education: a role for concept mapping assessment? JAMA. 2000;284:1105–10.

  20. Patton MQ. Qualitative research and evaluation methods. 4th ed. Thousand Oaks: SAGE Publications; 2015.

  21. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3:77–101.

  22. Cashman SB, Adeky S, Allen AJ, Corburn J, Israel BA, Montaño J, et al. The power and the promise: working with communities to analyze data, interpret findings, and get to outcomes. Am J Public Health. 2008;98:1407–17.

  23. Corner J, Wright D, Hopkinson J, Gunaratnam Y, McDonald JW, Foster C. The research priorities of patients attending UK cancer treatment centres: findings from a modified nominal group study. Br J Cancer. 2007;96:875–81.

  24. Edwards V, Wyatt K, Logan S, Britten N. Consulting parents about the design of a randomized controlled trial of osteopathy for children with cerebral palsy. Health Expect Int J Public Particip Health Care Health Policy. 2011;14:429–38.

  25. English KC, Wallerstein N, Chino M, Finster CE, Rafelito A, Adeky S, et al. Intermediate outcomes of a tribal community public health infrastructure assessment. Ethn Dis. 2004;14:S61-69.

  26. Young K. Doctors’ understanding of rheumatoid disease does not align with patients’ experiences. BMJ. 2013;346:f2901.

  27. Boivin A, Richards T, Forsythe L, Grégoire A, L’Espérance A, Abelson J, et al. Evaluating patient and public involvement in research. BMJ. 2018;363:k5147.

  28. Staley K. ‘Is it worth doing?’ Measuring the impact of patient and public involvement in research. Res Involv Engag. 2015;1:6. https://doi.org/10.1186/s40900-015-0008-5.

  29. Morgan DL. Basic and advanced focus groups. Thousand Oaks: SAGE Publications, Inc.; 2019.

  30. National Institute for Health Research. Patient and public involvement in health and social care research: A handbook for researchers. England: National Institute for Health Research Research Design Service; 2014. p. 1–40.

  31. Salmi L, Blease C. A step-by-step guide to peer review: a template for patients and novice reviewers. BMJ Health Care Inform. 2021;28:e100392.

  32. National Institutes of Health. U.S. Department of Health and Human Services National Center for Advancing Translational Sciences: Patient/Community Engagement and Health Information [Internet]. 2017. https://ncats.nih.gov/engagement.

  33. Patient Centered Outcomes Research Institute (PCORI). Awardee Resources: Policies & Guidelines [Internet]. Award. Resour. Policies Guidel. 2018 [cited 2023 Aug 9]. https://www.pcori.org/funding-opportunities/applicant-and-awardee-resources/awardee-resources/policies-guidelines.

  34. Patient-Centered Outcomes Research Institute (PCORI). About Our Research [Internet]. Our Res. 2017 [cited 2023 Aug 9]. https://www.pcori.org/research/about-our-research.

Acknowledgements

We appreciate the support of all community reviewers in sharing their experiences. We also appreciate Jacquie Dolata’s assistance and feedback on this manuscript.

Funding

This research was funded by the National Institutes of Health/NCMHD U54 MD002265.

Author information

Contributions

All authors contributed to the manuscript in a manner consistent with authorship. CC designed the study, created data collection tools, performed the analyses, and wrote up the findings. EH, JJT, and AS designed and ran the program and recruited participants. EH and JJT contributed to the analysis and writing. AS and KS provided substantial suggestions for study implications. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cyleste C. Collins.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was obtained from Case Western Reserve University's Institutional Review Board, and all participants provided written consent to participate.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Collins, C.C., Hood, E., Jewett-Tennant, J. et al. Community members’ experiences training as medical journal reviewers. Res Involv Engagem 9, 66 (2023). https://doi.org/10.1186/s40900-023-00482-x
