  • Article
  • Open Access

13 February 2023

Community Research Fellows Training Program: Evaluation of a COVID-19-Precipitated Virtual Adaptation

1 Division of Public Health Sciences, Department of Surgery, School of Medicine, Washington University, St. Louis, MO 63110, USA
2 Brown School, Washington University, St. Louis, MO 63110, USA
3 Community Research Fellows Training, School of Medicine, Siteman Cancer Center & Washington University, St. Louis, MO 63110, USA
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Community-Engaged Research to Address Health and Healthcare Disparities

Abstract

Community engagement is important for promoting health equity. However, effective community engagement requires trust, collaboration, and the opportunity for all stakeholders to share in decision-making. Community-based training in public health research can build trust and increase community comfort with shared decision-making in academic and community partnerships. The Community Research Fellows Training (CRFT) Program is a community-based training program that promotes the role of underserved populations in research by enhancing participant knowledge and understanding of public health research and other relevant topics in health. This paper describes the process of modifying the original 15-week in-person training program to a 12-week online, virtual format to ensure program continuation. In addition, we provide program evaluation data for the virtual training. Average post-test scores were higher than pre-test scores for every session, establishing the feasibility of virtual course delivery. While the knowledge gains were not as strong as those observed for the in-person training program, the findings suggest the appropriateness of continuing to adapt CRFT for virtual formats.

1. Introduction

Our previous research suggests that as science advances and health research and care become more complex, it remains important that patients and other health stakeholders have the ability to participate in decision-making at their desired level [,]. To facilitate outcomes acceptable to stakeholders, innovative strategies are needed to ensure that patients, family members, and other stakeholders have sufficient knowledge of research and research methods to confidently share their preferences, needs, concerns, and priorities [,]. Lack of knowledge or uncertainty about relevant issues may result in a failure to engage fully in research and/or patient decision-making processes []. This lack of engagement increases the likelihood of deference to clinical professionals, the health system, friends, family, and social media influences, ultimately heightening the risk of succumbing to misinformation and/or dissatisfaction with health decisions []. Thus, researchers and clinicians committed to community-engaged research and practice must provide opportunities for learning that increase research knowledge and literacy [,].
Community engagement is an important tool for promoting positive social and community health change []. However, effective community engagement requires a process that builds trust, values the contributions of all stakeholders, and generates a collaborative framework []. Community-based training in public health research can contribute to this process and allows academics to build the capacity of community members to engage in community-based participatory research (CBPR), a collaborative form of community engagement [,,,]. Researchers are increasingly recognizing the role that CBPR can play in research designed to reduce health disparities and move toward health equity [,,,,,,,,]. When communities and researchers are interested in and prepared for engagement, community-engaged research can result in the identification and examination of health-related concerns and conditions of relevance to the community [,,]. Increasing community-engaged research generally, and CBPR specifically, is viewed as a strategy capable of improving the acceptability and effectiveness of health promotion and disease prevention activities as well as of research itself [,,].
The Community Research Fellows Training (CRFT) Program is a community-based training program that promotes the role of underserved populations in research by enhancing the capacity for CBPR [,,,]. This 15-week training program was adapted from the Community Alliance for Research Empowering Social Change (CARES) program. CRFT includes topics relevant to a wide range of health research []. The advisory group that designed and implemented CRFT reviewed the CARES materials and determined which program components were appropriate for the urban, Midwest location of the CRFT program. Recent CRFT cohorts have addressed 18 topics presented in 15 weekly sessions, each lasting 3 h. Topics include health disparities, health literacy, ethics, cultural competency, epidemiology, quantitative and qualitative research methods, chronic disease prevention, clinical trials, study design, program evaluation, and grant writing. Sessions were designed to be lay-friendly while remaining consistent with a Master of Public Health (MPH) curriculum. The Principal Investigator recruited one or two faculty members to lead each program session, depending on session content. The faculty selected are experts in the field, teach and perform research in these areas, and have strong community connections; they come from multiple institutions in the region. The training uses multiple teaching approaches (large didactic interactive lectures, small group activities, group exercises, and small and large group discussions) to explain topics in ways that accommodate a variety of learning styles []. Other pedagogical elements and principles are described in previous evaluation studies [,].
The inclusion criteria for the CRFT program required participants to be at least 18 years old and to live or work in, or be willing to commute to, the greater St. Louis metropolitan area. Participants do not receive compensation but do receive free training and resources. Participants are referred to as “Fellows” to further empower and engage them in the academic process.
In April 2020, a sixth CRFT cohort was cancelled due to the COVID-19 shutdown. The cancellation occurred after the application deadline but prior to any formal acceptance of Fellows into the program. This paper describes the process of modifying the original 15-week in-person program to a 12-week online program offered in a virtual format to ensure program continuation. The team decided that the virtual program was the most feasible strategy to continue teaching community members during the COVID-19 pandemic.
The Human Research Protection Office at Washington University in St. Louis classified the study of “virtual” CRFT underlying this paper as program evaluation and non-human subject research.

2. Methods

2.1. Community Research Fellows Training Program

A community advisory board (CAB) consisting of 12 members (8 CRFT graduates and 4 community stakeholders) helps guide all aspects of the CRFT program, including recruitment, selection, program implementation, and evaluation. The CAB and Project Team collaborated as a revision team to consider how to continue an established and evaluated community-based training program [,,,] in a virtual format. Revision team discussions focused on how to avoid the fatigue and attention-span deficits inherent in a virtual format and on best practices for using virtual technology platforms such as Zoom© (Zoom Video Communications, Inc., San Jose, CA, USA) and Canvas© (Instructure, Inc., Salt Lake City, UT, USA), while maintaining fidelity to the essential elements of in-person CRFT. The resulting modifications included reducing the length and number of sessions, creating engaging activities in a virtual format, redesigning the community engagement homework, creating networking opportunities, and changing program evaluation procedures.
Discussions on transitioning CRFT to virtual delivery began in September 2020, after several members of the Project Team had adapted other courses to a virtual format for the academic summer and fall semesters and could bring lessons learned from those experiences to the discussions. In addition, the Washington University in St. Louis Center for Teaching and Learning provided many resources [,,] to faculty and staff at the university.

2.2. Goals

The program continues to pursue the original goals of the CRFT program. The first goal is to equip community members with sufficient research knowledge to be good consumers of research. The second is to help community members understand how to use research to improve community health status and well-being. The final goal is to increase community readiness for community/academic collaboration and partnership on mutually beneficial projects and programs. The same multidisciplinary faculty who had taught the in-person program taught the virtual program. The team also recommended a reduced number of sessions (from 15 to 12); an online orientation that provided an overview of Canvas© and Zoom© navigation; a workshop intended to train community members in qualitative research methods; and an online graduation ceremony with an optional outdoor, in-person celebration (see Table 1).
Table 1. Cohort VI Virtual Online Session Topics and Learning Objectives.

2.3. The Online Format

The Project Team received permission from the home institution to administer the course using the Canvas© Learning Management System (LMS). The session readings, activities, homework, and supplemental materials were available in Canvas© modules. The Project Team decided that each lecture topic would run 45 to 50 min, with 10 to 15 min reserved for questions. Faculty modified the session objectives and class activities to accommodate the new 2 h session duration (see Table 1). Because small group activities were part of the original in-person CRFT format, the Project Team and volunteers facilitated and monitored activities completed in Zoom© breakout rooms. There was agreement that homework should be continued but modified to accommodate the ongoing physical distancing practiced in the region as a pandemic-spread mitigation measure.
Given the online format, the Project Team decided to keep the virtual online cohort (Cohort VI, 2021) small to encourage greater interactive participation. The cohort was limited to a maximum of 25 participants, which was similar to Cohorts III–V. Prior CRFT cohorts were recruited through a local newspaper, e-mail, community websites, community newsletters, flyers, personal referrals, and word of mouth. For this cohort, however, the Project Team decided to approach those who had applied for the cancelled 2020 cohort (n = 29) and individuals who had contacted staff to express interest in a future cohort (n = 23), and to email the CRFT alumni network and other CRFT-affiliated individuals. Applicants had the opportunity to use a previous application, provide an updated application, or submit a new application.
The Project Team reviewed the applications and selected a cohort that was diverse in work, education, and life experiences, just as the in-person CRFT cohorts had been. Twenty-three of the thirty-eight individuals who applied to participate in the 2021 cohort had previously applied in 2020. Sixteen of those twenty-three past applicants were selected as Fellows for the 2021 cohort. There were 15 new applicants in 2021, with 9 selected to participate in the 2021 cohort.

2.4. Canvas© Training

We offered training on how to use Canvas©, the learning management system, during the CRFT program orientation. Canvas© was used mainly to store and organize CRFT program materials, whereas assessments were administered outside of Canvas© using REDCap®, a secure, web-based survey administration platform designed to support data capture for research and evaluation. The orientation session lasted 2 h, with about 30 min set aside for a walkthrough of virtual course materials, including how to use Canvas©. All 25 Fellows were able to attend the orientation session.
During the introduction to Canvas©, CRFT team members demonstrated navigating to the institution-specific Canvas© website, creating an account, logging into the system, and moving through the course materials, which were organized using the modules feature in Canvas©. In addition to the live walkthrough, general and institution-specific information was sent via email, and Fellows received a one-page guide, prepared by the CRFT team, on how to access Canvas©. Creating an account proved more difficult for Fellows than using Canvas© itself, as several Fellows contacted the CRFT team with questions and requests for assistance with account creation. Once an account was created, however, Fellows did not voice concerns about accessing course materials. One Fellow, who ultimately did not finish the program, was never able to create a Canvas© account.

2.5. Assessment of Participant Knowledge

Fellows’ gains in knowledge of public health principles and CBPR were assessed via a questionnaire administered before session 2 (baseline) and after session 12 (final) of the program. Each administration consisted of the same 20 multiple-choice questions, created by the CRFT Project Team and faculty to measure Fellows’ knowledge of the CRFT training topics and learning objectives (Table 1). The questions selected were related to the learning objectives the faculty member(s) intended to cover during the weekly sessions. Fellows’ baseline and final scores were linked using their personally selected CRFT ID numbers.
A baseline and final score were computed for each individual by summing the responses for all questions, where a value of 1 was assigned for a correct response and a value of 0 for an incorrect response (including missing and “I don’t know” responses). A maximum total score of 20 points was possible for each score. We then converted this to a percentage score. To assess the difference between the baseline and final assessment, we performed a non-parametric test, the Wilcoxon signed-rank test, due to violations of normality assumptions.
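As a concrete illustration of this scoring and testing procedure, the following Python sketch (our illustration, not the authors’ code; the data and variable names are fabricated for demonstration) converts item-level responses to percentage scores and applies the Wilcoxon signed-rank test using SciPy.

import numpy as np
from scipy.stats import wilcoxon

# Illustrative item-level data: rows are Fellows, columns are the 20
# questions; 1 = correct, 0 = incorrect, missing, or "I don't know".
rng = np.random.default_rng(42)
baseline_items = rng.integers(0, 2, size=(21, 20))
final_items = np.clip(baseline_items + rng.integers(0, 2, size=(21, 20)), 0, 1)

# Sum correct responses (maximum of 20 points) and convert to percentages.
baseline_pct = baseline_items.sum(axis=1) / 20 * 100
final_pct = final_items.sum(axis=1) / 20 * 100

# Non-parametric paired comparison of baseline and final scores; zero
# differences are split between ranks rather than discarded.
stat, p = wilcoxon(baseline_pct, final_pct, zero_method="zsplit")
print(f"W = {stat:.1f}, mean change = {np.mean(final_pct - baseline_pct):.1f} points, p = {p:.4f}")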

2.6. Pre- and Post-Tests

Fellows took pre- and post-tests at each of the 12 sessions. The pre- and post-test measures assessed session content at the beginning and end of each session using identical items. All tests were shortened from the 10 items used in the in-person sessions to 5 items in order to reduce participant burden. Pre- and post-tests were scored similarly to the baseline and final assessments, using a sum of correct responses, where a value of 1 was assigned for a correct response and 0 for an incorrect response (including missing and “I don’t know” responses). A maximum total score of 5 was possible for each pre- and post-test, which we then converted to a percentage score. Due to the violation of normality assumptions, we used a non-parametric test, the Wilcoxon signed-rank test, to evaluate the score differences between pre-test and post-test for each session.
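The per-session analysis follows the same pattern as the sketch above; a minimal loop (again with fabricated illustrative scores, not the study data) might look like this:

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(7)
for session in range(1, 13):
    n = int(rng.integers(9, 18))  # completers varied by session
    # Illustrative 5-item pre-tests, with post-test items at least as good.
    pre_items = rng.integers(0, 2, size=(n, 5))
    post_items = np.clip(pre_items + rng.integers(0, 2, size=(n, 5)), 0, 1)
    pre_pct = pre_items.sum(axis=1) / 5 * 100   # maximum of 5 points
    post_pct = post_items.sum(axis=1) / 5 * 100
    _, p = wilcoxon(pre_pct, post_pct, zero_method="zsplit")
    print(f"Session {session:2d}: mean gain {np.mean(post_pct - pre_pct):5.1f} points, p = {p:.3f}")

In practice, each session’s test would include only the Fellows who completed both that session’s pre- and post-test.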

2.7. Evaluation of Fellow Satisfaction

At the end of each of the 12 training sessions, Fellows completed session evaluations. Fellows rated their satisfaction with that week’s topic and content as well as with the presenting faculty member. Evaluations consisted of six statements with Likert-scale response options covering satisfaction with the learning objectives for the topic and the perceived quality of the session’s presentation. In addition, there were four free-response questions.
Analyses were completed using SAS/STAT 9.4 (SAS Institute, Cary, NC, USA).

3. Results

There were 25 Fellows enrolled in the virtual online CRFT program. Of these 25 Fellows, 23 (92%) completed the 12-week virtual training program. We obtained complete baseline and final assessment data from 21 of the 23 Fellows who completed the program (91%). Table 2 provides demographic information on the full cohort (n = 25).
Table 2. Demographic characteristics of CRFT participants in evaluation sample (n = 25).
The three largest groups comprising the cohort were individuals affiliated with community-based organizations (32%), followed by health care workers (28%) and individuals with multiple roles (28%). The remaining Fellows were community members affiliated with faith-based organizations (8%) and those in academia (4%). The mean age of Fellows was 45, with an age range of 25 to 65. All Fellows had some college or an associate’s degree or higher, including 48% with graduate education, and 68% of the cohort reported that they had previously completed a course on research. The majority of Fellows were African-American/Black (76%).
All of the Fellows who completed both the program and the final assessment (n = 21) were female, and 71% were African-American/Black. The four individuals who did not complete the training and/or the final assessment were African-American/Black, with one male and three females. These individuals were similar to the 21 completers in all respects except age: their mean age was slightly lower (38, range 33 to 43) compared to the Fellows who completed the program and final assessment (mean age 46.8, range 25 to 65).

3.1. Assessment of Participant Knowledge

The average score increased from 65% correct on the baseline public health and CBPR knowledge assessment to 76% correct on the final assessment (n = 20, with one outlying observation excluded; mean change of 11 percentage points, range −10% to 40%). Only two Fellows (out of 21, 10%) decreased their scores from baseline to final assessment. The Wilcoxon signed-rank test for this knowledge change was significant (p = 0.0005).
Due to the online nature of the course, participants found it challenging to complete the pre-/post-tests. In addition, because of technical errors involving REDCap® that were not attributable to participants, Fellows did not always receive reminder emails to take the tests from approximately halfway through the course onward. Overall, four Fellows (16%) completed all pre-/post-tests. During the first half of the course, 60–68% of participants completed both the pre- and post-tests; during the second half, 36–54% did.
There is evidence of knowledge gain; however, the in-person program had a greater number of sessions in which Fellows showed statistically significant gains than the online program [,]. The data in Table 3 indicate that although average post-test scores were higher than pre-test scores for every session, only Session 1—Public Health and Health Disparities (gain of 17.7 percentage points), Session 3—Health Literacy (11.3), Session 4—Introduction to Epidemiology (25), and Session 6—Evidence-based Public Health and Program Planning (24) showed statistically significant differences between pre- and post-test scores. An examination of the pre-test scores indicates that three of these sessions—Introduction to Epidemiology (29.6), Evidence-based Public Health and Program Planning (44), and Health Literacy (47.2)—were among the areas of greatest knowledge deficit. Session 2—Community Based Participatory Research & Community Organizing, Session 5—Behavioral Health/Cultural Competency, and Sessions 7 through 12—Research Methods & Data, Quantitative Methods, Qualitative Methods, Clinical Trials & Biobanks/Research Ethics, Health Policy Research/Grant Writing, and Human Subjects Certification—did not show statistically significant differences between pre- and post-test scores.
Table 3. CRFT VI Virtual Online Pre-test and Post-test Scores.

3.2. Participant Feedback

The following quotes were offered in the final assessment in response to the prompt, “Please provide us with any additional comments or suggestions”. The Fellows indicated that they appreciated the effort to continue the program despite ongoing pandemic precautions. Representative quotes are provided below.
“Thank you for taking the time to re-structure the program and your continued energy, patience and support throughout the program!”
“Thank you for the opportunity to participate in this outstanding program. I (sic) is a testament to the dedication of your staff to engage the community in the advancement of public health. Please continue the program!”
“This was an awesome opportunity. The information gained will be used to help me to be a better collaborator as I work to improve areas of the community in which I serve”.
In addition, Fellows were asked to share opportunities to change or improve the training sessions. There were comments related to virtual versus in-person sessions; some Fellows liked the virtual sessions, and some did not.
“I think in person would of course be better, but with the pandemic that couldn’t be done”.
“I really liked the virtual setting”.
“Do not offer online/virtually”.
However, the main concerns discussed were the desire for more interaction and group discussions and for more time for each session (“Continue to offer virtual classes, extend the time 30 min”).
Finally, data from the final assessment indicated that 81% (n = 17) of Fellows agreed or strongly agreed “the structure of the training was beneficial to the learning process”. In addition, 95% (n = 20) agreed or strongly agreed that they “would recommend the CRFT program to others”.

4. Discussion

This paper describes the process of modifying the original CRFT 15-week in-person training program to a 12-week virtual online format. The virtual format was the most feasible strategy to continue training community members during the COVID-19 pandemic. In addition, by continuing the training during the pandemic, it increased the number of communities and participants with access to this community-based public health research training program [,,,].
There were both facilitators and barriers to implementation. Throughout the 12-week training, Fellows remained engaged, indicating high potential for a virtual CRFT training environment. In addition, all CRFT faculty remained committed to the goals of the training and were willing to adapt their curricula to the online virtual format. Faculty used several Zoom© features, including breakout rooms, polls, and the chat, to facilitate participation. On the other hand, while many Fellows understood and supported a virtual format, feedback also indicates that many value opportunities for face-to-face training. In addition, the recommendations offered suggest the need to add time to online sessions. Low pre-/post-test completion rates indicate some limitations in the asynchronous administration of online materials, as discussed in the next section. One participant who did not complete the program was never able to establish an LMS account, which likely hampered the quality of the experience and contributed to non-completion.
The cohort of Fellows comprised individuals affiliated with community-based organizations and other diverse stakeholders. The cohort had greater age and educational diversity than racial/ethnic and gender diversity. Ninety-two percent of CRFT VI Fellows completed the 12-week training program, and ninety-one percent of these Fellows completed both the baseline and final assessments. There was a statistically significant average score increase from the baseline assessment to the final assessment (mean change of 11 percentage points).
The average post-test scores were higher than the pre-test scores for every session; however, as the Table 3 data indicate, only four sessions—Public Health and Health Disparities, Health Literacy, Introduction to Epidemiology, and Evidence-based Public Health and Program Planning—had a statistically significant difference between pre- and post-test scores. An examination of the pre-test scores indicates that three of the four sessions (Health Literacy, Introduction to Epidemiology, and Evidence-based Public Health and Program Planning) cover areas important for public health and research and were among the areas of greatest knowledge deficit.
Given the virtual nature of the course, participant evaluations are important. Ninety-five percent of Fellows indicated that they would recommend the CRFT program to others, and a smaller but still substantial percentage (81%) reported that the structure of the program was beneficial to the learning process. While these findings establish the feasibility of virtual course delivery, the knowledge gains were not as strong as those observed for the in-person training program.

Limitations

Despite these positive findings, there are limitations to this evaluation, and some findings should be interpreted with caution. First, a low percentage of Fellows completed all of the pre-/post-tests (16%), and the percentage completing the tests during the first half of the program was substantially higher than in the second half. One issue was the ability to send reminders to Fellows reliably. With implementation of the virtual format, CRFT staff reminded Fellows to complete pre- and post-tests at the beginning and end of each session. The in-person format had allowed staff to ensure compliance with the tests, while the effort to use REDCap® to provide reminders failed. We learned that it is important to test all features of the LMS and other supporting systems and to develop alternative strategies for implementing program components, including those related to evaluation processes, prior to the launch of virtual programming.
While prior CRFT cohorts have been predominantly female [,], this cohort included fewer males than previous cohorts. There are 29 male CRFT alumni (19% of alumni), and Cohort V was 25% male (unpublished data). Lower participation by males in this cohort may have been due to differences in COVID-related work roles and schedules as well as family and domestic adjustments among male and female workers. It may also be important to understand how the restricted recruitment strategy created by COVID, differential recruitment efforts among alumni, and the use of the virtual online format affected male participation. Similarly, while there was educational diversity, no Fellows with a high school education or less completed the program; to date, there are only four CRFT alumni with a high school education or less. These data speak to the need to consider programming that may appeal to individuals with less education. This may require programming that provides similar content but is offered through a series of focused short courses, giving participants more time and support to absorb the information than the current CRFT program provides. Despite these limitations, the findings suggest the appropriateness of continuing to adapt CRFT for virtual formats.

5. Conclusions

CRFT VI demonstrated the feasibility of providing community-based training in public health research using a virtual online format. Despite the existence of other programs designed to provide community training on research [,], most reported formats are in person. This virtual adaptation used fewer and shorter sessions, and homework assignments and activities were modified to accommodate the virtual format. Although the in-person program had a greater number of sessions with statistically significant gains than the virtual online program, Fellows demonstrated gains in research knowledge in both formats. The virtual implementation experience highlights the need for adequate staff support for participants, as well as for creating and managing online course content. Findings also highlight the importance of training participants on the use of the course management system and the virtual meeting platform prior to program implementation to increase comfort and familiarity. These findings suggest the potential of a virtual online community research training program, although adjustments and further evaluation are necessary.

Author Contributions

K.L.D., N.A., S.H. and V.S.T. led project conceptualization and implementation and supported drafting/reviewing of the curriculum and paper. C.L.R. and C.S. supported program conceptualization, implementation, and paper revision. J.V.C. supported program implementation and paper drafting/reviewing and revision. All authors have read and agreed to the published version of the manuscript.

Funding

Funding for the program was provided by the Staenberg Foundation, Siteman Cancer Center, and the Washington University School of Medicine. Use of REDCap was supported by a Clinical and Translational Science Award (CTSA) Grant (UL1TR000448) and Siteman Comprehensive Cancer Center and NCI Cancer Center Support Grant P30CA091842. The funders were not involved in the writing of this manuscript.

Institutional Review Board Statement

The Washington University in St. Louis Institutional Review Board designated this program evaluation study as non-human subjects research.

Data Availability Statement

Not applicable.

Acknowledgments

We acknowledge the Community Advisory Board and support staff of the project for engaging in program and curriculum design and the participant selection process.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Thompson, V.L.S. Making decisions in a complex information environment: Evidential preference and information we trust. BMC Med. Inform. Decis. Mak. 2013, 13 (Suppl. S3), S7. [Google Scholar] [CrossRef]
  2. Coats, J.; Stafford, J.D.; Thompson, V.S.; Javois, B.J.; Goodman, M.S. Increasing research literacy: The community research fellows training program. J. Empir. Res. Hum. Res. Ethics 2014, 10, 3–12. [Google Scholar] [CrossRef] [PubMed]
  3. McGowan, L.D.; Stafford, J.D.; Thompson, V.L.; Johnson-Javois, B.; Goodman, M. Quantitative Evaluation of the Community Research Fellows Training Program. Front. Public Health 2015, 3, 179. [Google Scholar] [CrossRef]
  4. Fawcett, S.B.; Paine-Andrews, A.; Francisco, V.T.; Schultz, J.A.; Richter, K.P.; Lewis, R.K.; Williams, E.L.; Harris, K.J.; Berkley, J.Y.; Fisher, J.L.; et al. Using empowerment theory in collaborative partnerships for community health and development. Am. J. Community Psychol. 1995, 23, 677–697. [Google Scholar] [CrossRef]
  5. Butterfoss, F.D.; Francisco, V.T. Evaluating Community Partnerships and Coalitions with Practitioners in Mind. Health Promot. Pract. 2004, 5, 108–114. [Google Scholar] [CrossRef]
  6. Komaie, G.; Goodman, M.; McCall, A.; McGill, G.; Patterson, C.; Hayes, C.; Thompson, V.S. Training Community Members in Public Health Research: Development and Implementation of a Community Participatory Research Pilot Project. Health Equity 2018, 2, 282–287. [Google Scholar] [CrossRef]
  7. Komaie, G.; Ekenga, C.C.; Thompson Sanders, V.L.; Goodman, M.S. Increasing Community Research Capacity to Address Health Disparities: A qualitative program evaluation of the Community Research Fellows Training Program. J. Empir. Res. Hum. Res. Ethics 2018, 12, 55–66. [Google Scholar] [CrossRef]
  8. Goodman, M.S.; Gonzalez, M.; Gil, S.; Si, X.; Pashoukos, J.L.; Stafford, J.D.; Pashoukos, D.A. Brentwood community health care assessment. Prog. Community Health Partnersh. Res. Educ. Action 2014, 8, 29–39. [Google Scholar] [CrossRef]
  9. Goodman, M.S.; Si, X.; Stafford, J.D.; Obasohan, A.; Mchunguzi, C. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program. Prog. Community Health Partnersh. Res. Educ. Action 2012, 6, 361–368. [Google Scholar] [CrossRef]
  10. Goodman, R.M.; Speers, M.A.; McLeroy, K.; Fawcett, S.; Kegler, M.; Parker, E.; Smith, S.R.; Sterling, T.D.; Wallerstein, N. Identifying and Defining the Dimensions of Community Capacity to Provide a Basis for Measurement. Health Educ. Behav. 1998, 25, 258–278. [Google Scholar] [CrossRef]
  11. Iacobucci, D.; Duhachek, A. Advancing Alpha: Measuring Reliability With Confidence. J. Consum. Psychol. 2003, 13, 478–487. [Google Scholar] [CrossRef]
  12. Israel, B.A. Methods in Community-Based Participatory Research for Health; Jossey-Bass: San Francisco, CA, USA, 2005. [Google Scholar]
  13. Israel, B.A.; Coombe, C.M.; Cheezum, R.R.; Schulz, A.J.; McGranaghan, R.J.; Lichtenstein, R.; Reyes, A.G.; Clement, J.; Burris, A. Community-Based Participatory Research: A Capacity-Building Approach for Policy Advocacy Aimed at Eliminating Health Disparities. Am. J. Public Health 2010, 100, 2094–2102. [Google Scholar] [CrossRef] [PubMed]
  14. Israel, B.A.; Schulz, A.J.; Parker, E.A.; Becker, A.B. Review of Community-Based Research: Assessing Partnership Approaches to Improve Public Health. Annu. Rev. Public Health 1998, 19, 173–202. [Google Scholar] [CrossRef]
  15. Israel, B.A.; Schulz, A.J.; Parker, E.A.; Becker, A.B.; Allen, A.J.; Guzman, J.R. Critical issues in developing and following CBPR principles. In Community-Based Participatory Research for Health: From Process to Outcomes; Jossey-Bass: San Francisco, CA, USA, 2008; pp. 47–66. [Google Scholar]
  16. Israel, B.A.; Parker, E.A.; Rowe, Z.; Salvatore, A.; Minkler, M.; López, J.; Butz, A.; Mosley, A.; Coates, L.; Lambert, G.; et al. Community-based participatory research: Lessons learned from the Centers for Children’s Environmental Health and Disease Prevention Research. Environ. Health Perspect. 2005, 113, 1463–1471. [Google Scholar] [CrossRef] [PubMed]
  17. Jagosh, J.; Macaulay, A.C.; Pluye, P.; Salsberg, J.; Bush, P.; Henderson, J.; Sirett, E.; Wong, G.; Cargo, M.; Herbert, C.P.; et al. Uncovering the Benefits of Participatory Research: Implications of a Realist Review for Health Research and Practice. Milbank Q. 2012, 90, 311–346. [Google Scholar] [CrossRef]
  18. Jagosh, J.; Pluye, P.; Macaulay, A.C.; Salsberg, J.; Henderson, J.; Sirett, E.; Bush, P.L.; Seller, R.; Wong, G.; Greenhalgh, T.; et al. Assessing the outcomes of participatory research: Protocol for identifying, selecting, appraising and synthesizing the literature for realist review. Implement. Sci. 2011, 6, 24. [Google Scholar] [CrossRef]
  19. Khodyakov, D.; Stockdale, S.; Jones, A.; Mango, J.; Jones, F.; Lizaola, E. On Measuring Community Participation in Research. Health Educ. Behav. 2012, 40, 346–354. [Google Scholar] [CrossRef]
  20. Khodyakov, D.; Stockdale, S.; Jones, F.; Ohito, E.; Jones, A.; Lizaola, E.; Mango, J. An Exploration of the Effect of Community Engagement in Research on Perceived Outcomes of Partnered Mental Health Services Projects. Soc. Ment. Health 2011, 1, 185–199. [Google Scholar] [CrossRef]
  21. Cargo, M.; Mercer, S.L. The Value and Challenges of Participatory Research: Strengthening Its Practice. Annu. Rev. Public Health 2008, 29, 325–350. [Google Scholar] [CrossRef]
  22. Goodman, M.S.; Dias, J.J.; Stafford, J.D. Increasing Research Literacy in Minority Communities: CARES Fellows Training Program. J. Empir. Res. Hum. Res. Ethics 2010, 5, 33–41. [Google Scholar] [CrossRef]
  23. Lantz, P.M.; Viruell-Fuentes, E.; Israel, B.A.; Softley, D.; Guzman, R. Can Communities and Academia Work Together on Public Health Research? Evaluation Results From a Community-Based Participatory Research Partnership in Detroit. J. Urban Health 2001, 78, 495–507. [Google Scholar] [CrossRef] [PubMed]
  24. Nilson, L.B.; Goodson, L.A. Interactions with content, instructor, and peers. In Online Teaching at Its Best: Merging Instructional Design with Teaching and Learning Research; Jossey-Bass: San Francisco, CA, USA, 2018; pp. 228–229. [Google Scholar]
  25. Boettcher, J.V.; Conrad, R.M. The Online Teaching Survival Guide: Simple and Practical Pedagogical Tips; John Wiley & Sons: New York, NY, USA, 2021. [Google Scholar]
  26. Ko, S.; Rossen, S. Teaching Online: A Practical Guide; Routledge: Philadelphia, PA, USA, 2017. [Google Scholar]
  27. Nguyen-Truong, C.K.Y.; Tang, J.; Hsiao, C.-Y. Community Interactive Research Workshop Series: Community Members Engaged as Team Teachers to Conduct Research. Prog. Community Health Partnersh. Res. Educ. Action 2017, 11, 215–221. [Google Scholar] [CrossRef] [PubMed]
  28. Cunningham-Erves, J.; Joosten, Y.; Hollingsworth, C.P.; Cockroft, J.D.; Murry, V.M.; Lipham, L.; Luther, P.; Vaughn, Y.; Miller, S.T. Implementation and Evaluation of a Dual-Track Research Training Program for Community Members and Community-Based Organizations. Prog. Community Health Partnersh. Res. Educ. Action 2020, 14, 75–87. [Google Scholar] [CrossRef] [PubMed]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
