Supporting Statement B
Enhancing HIV Care of Women, Infants, Children and Youth Building Capacity through Communities of Practice
OMB Control No. 0915-XXXX
1. Respondent Universe and Sampling Methods
The project intends to evaluate the effectiveness of the coaching and support services via process and outcome evaluation methods. Evaluation of coaching and support depends on establishing clear goals and plans from the beginning of the process. This includes specifying the intended impact of the coaching and support with concrete, measurable objectives. To judge performance against goals, we will administer coaching and support evaluation surveys following offsite coaching and support, trainings, webinars, teleconferences, and meetings. Our findings will drive quality improvement activities and reports.
The components of the Kirkpatrick Evaluation model that we propose to apply are reaction, learning, and behavior. We have operationalized these components to include measures of satisfaction with the coaching and support (reaction), change in knowledge after the coaching and support (learning), and change in behavior or practice after the introduction of evidence-based interventions (behavior). More specifically, our evaluation plan includes (1) post coaching and support satisfaction measures, (2) pre-post measures of CoP staff knowledge about effective practices, (3) measures of coaching and support usefulness and impact on CoP performance, and (4) pre-post-follow-up measures of CoP adoption and demonstration of evidence-based practices and effectiveness. Copies of the survey tools for CoP and coaching and support participants are included in Appendix A of this document.
The evaluation of coaching and support provided to CoP participants will focus on the performance of all CoP Teams, including participant learning, participant satisfaction, and participant behavior (or reported behavior). We propose to evaluate CoP coaching and support requests/events to be sure the CoPs learn from those requests/events and are able to incorporate improvements as they continue their work in the community.
The project plans to create three Communities of Practices (CoPs). HRSA has prioritized three topic areas for the CoPs. The three topics are: 1) Pre-conception Counseling, 2) Trauma-Informed Care, and 3) Youth Transitioning into Adult HIV Care. The key components of the CoP include:
• A series of learning sessions and action periods with peer learning, coaching and training led by a core group of experts (i.e., Faculty). Between the learning sessions, check-in calls and technical assistance will occur to ensure engagement between the CoP faculty and recipient teams.
• Using the IHI Breakthrough Series and Model for Improvement (Plan-Do-Study-Act (PDSA)), each CoP will focus on one of the identified topics and use evidence-based, evidence-informed, or emerging interventions to make improvements and reach established CoP goals.
• CoP participants will report successes and challenges, and experts will advise and coach participants on addressing any challenges encountered during learning sessions and action periods.
• Contractor will assist participants with developing clinical protocols as well as using QI tools and techniques.
Each CoP will consist of up to 15 RWHAP Part D recipients and representatives from collaborating organizations, with a total of up to 45 Part D recipients participating over the course of the project. Each participating recipient will convene a Core Team composed of staff, partners, and WICY with HIV. The Team will work to implement the selected evidence-based, evidence-informed, or emerging interventions and the quality-improvement framework that addresses their care environment.
The Teams will consist of at least a project director or manager, an HIV case manager, a clinical provider, one or two Part D participants with HIV, a Quality Improvement (QI) lead to help collect, analyze, and provide updates on data associated with the intervention, and relevant collaborators. For the purpose of collecting evaluation survey data, we anticipate a target population of 90 respondents for each CoP; across all three CoPs, the estimated universe will be 270 respondents. Each Part D recipient will have a core team of up to 6 members (15 recipients × 6 members = 90 respondents per CoP; 3 CoPs × 90 = 270). All 270 CoP core team members will participate in the coaching and support activities, including foundational technical assistance and session-focused (Learning and Action Period) technical assistance.
Table 1. Universe of Survey Respondents
CoPs and Team Composition | Number of Part D Recipients within Each CoP | Total Universe
Pre-Conception Counseling CoP | 15 |
Project Director/Manager (1) | | 15
HIV Case Manager (1) | | 15
Clinical Provider (1) | | 15
Part D participants with HIV (2) | | 30
Quality Improvement (QI) lead (1) | | 15
Subtotal | | 90
Trauma-Informed Care CoP | 15 |
Project Director/Manager (1) | | 15
HIV Case Manager (1) | | 15
Clinical Provider (1) | | 15
Part D participants with HIV (2) | | 30
Quality Improvement (QI) lead (1) | | 15
Subtotal | | 90
Youth Transitioning into Adult HIV Care CoP | 15 |
Project Director/Manager (1) | | 15
HIV Case Manager (1) | | 15
Clinical Provider (1) | | 15
Part D participants with HIV (2) | | 30
Quality Improvement (QI) lead (1) | | 15
Subtotal | | 90
Grand Total | 45 | 270
We plan to survey the entire universe of CoP leaders (i.e., 270 core team members). It is important to include all the participants in each of the three CoPs. A census of all Part D recipients is necessary, in part, because we want to ensure that all participants have received the training and gained knowledge to apply in their HIV/AIDS care of WICY. Second, a census is necessary due to the heterogeneous nature of the programs, which encompass a wide variety of organizational types. Because of this variety among programs, it is critical to the evaluation to capture the details of each program in order to answer the evaluation questions and assess which program characteristics are associated with better outcomes for particular types of recipients.
Measures will be taken to ensure a response rate of at least 80 percent. These efforts include:
• Conducting a pilot of the survey among a small, representative group of nine potential participants and analyzing the results using post-survey interviews.
• Developing and implementing a survey promotion and launch strategy that leverages the influence of agency leadership.
• Developing and implementing effective follow-up processes and procedures to be applied uniformly to all “late” responders.
2. Procedures for the Collection of Information
Survey design and construction, especially question selection, were informed by the research questions the survey is intended to address. The survey questions were developed by collecting questions used in previous surveys into an Excel workbook. These served as the basis for developing questions specific to the intent and purpose of the Enhancing HIV Care project. The source of each previous survey question was tracked along with the development of modifications to the question. These modifications tailored the questions to the survey areas of interest. Modifications were proposed and developed in a series of iterative teleconferences. To address gaps, some questions were crafted by the Enhancing HIV Care project team members. The final set of survey questions was methodically crosswalked to the research questions the survey will answer.
Each section of the survey was organized to support the effective and efficient implementation of “skip logic,” with stem questions located at the beginning of the section. The stem question determines whether a set of questions is relevant to the respondent. If not, the respondent is automatically forwarded, or “skipped,” to the next relevant question or section. The minimum time for the average respondent to complete one of the surveys is 4 minutes, and the maximum is 28 minutes. This 4- to 28-minute range was obtained from the responses of the nine CoP participants who took part in the pilot testing of the assessment instruments.
The expectation is that the CoP core team Project Director/Manager or their designee will be responsible for ensuring that the surveys are completed. This approach would not prevent the Project Director from seeking input from others in the organization or asking them to complete portions of the survey.
Data collection procedures include three key areas: (1) developing the information required for the survey frame, (2) communicating with targeted respondents, and (3) Project Officer support. These key components are detailed below.
We will use SurveyMonkey, an online survey platform, to conduct the survey. Besides being an affordable option, an online survey allows us to create, distribute, and collect data from a single platform that is accessible to the entire team. The potential for data-entry error is greatly reduced because answers are entered directly into the online survey system. Online surveys are easy to create, and SurveyMonkey will allow us to develop the survey within a short time. It supports accessible surveys that are Section 508 and WCAG 2 compliant and allows respondents to take the survey on any device.
We do not plan to implement any sampling design for this survey. The entire universe, consisting of all Part D recipients in the three CoPs with core teams of up to 6 members each, will be included in the survey. The HRSA Project Officers are the primary source for verifying that contact names and information are accurate.
Two keys to obtaining participation in a survey are to establish the importance of the information provided and to limit the amount of effort required by the respondent. As part of establishing the importance, we propose that a lead letter be sent via email to all potential respondents from an HRSA authority. We recognize that this strategy will require lead time, given the busy schedules of persons at this level. This letter should go out approximately 1 week prior to the actual implementation date for the first survey.
Limiting the amount of effort required by respondents includes keeping all communication clear and succinct along with limiting the time required to complete the survey. Any emails or other communication will be designed so that the reader can grasp the key elements and what is expected of them from the first few lines. We have already piloted the surveys with 9 CoP leaders who provided feedback on the clarity of survey questions and instructions. We made revisions to our assessment instruments based on feedback we received from the pilot respondents.
On the release date of the survey, a customized email will go out to each potential respondent from the Enhancing HIV Care project team. This email will contain a link to the survey site. The email will identify HRSA as the sponsor of the survey and reference the lead letter previously sent. It will also contain information on how to obtain assistance if the respondent has questions or technical issues with completing the survey, including both an email address and a phone number for obtaining help should questions or issues arise. These same contact points will also be listed on the first and last pages of the electronic survey.
We propose a 30-day data collection period to allow sufficient time for respondents to complete the survey and our staff to undertake efforts to encourage this participation.
Grantees who have not responded will be sent reminder emails 3 days, 1 week, and 3 weeks following the initial email. This schedule is designed to establish and maintain a positive relationship with the potential respondent, encouraging their continued engagement and interest while assuring them of the importance of their participation. The communication strategy will be evaluated for effectiveness and modified if it does not effectively encourage a response. For example, instead of sending only an email at week 3, the Enhancing HIV Care project team may decide to also place telephone calls to each non-responder, asking them to expect the reminder email and reminding them it is not too late to respond.
3. Methods to Maximize Response Rates and Deal with Nonresponse
Attaining a high survey response rate of at least 80% is essential to the success of the evaluation. To do so, efforts will be undertaken to ensure clear and easy communication between respondents and survey administrators. As noted, HRSA will implement a comprehensive strategy to maximize the response rate, which will include the following methods:
1. We have kept the survey length as brief as possible to reduce the respondent burden.
2. The survey will be promoted by the program's government Project Officers and by agency leadership. The importance of the survey will be identified, and the value of the data to the respondents will be highlighted.
3. The survey itself will be deployed on the internet to make completion of the survey faster and simpler for the participants.
4. A methodology for non-responder follow-up will be developed and implemented using email and telephone. Non-responders will be tracked by individual CoP to determine if one or more CoPs have a high non-response rate. In that case, the government Project Officer's assistance would be requested in encouraging the grantees in that program to respond.
5. Survey participants will have access to the final survey report. Research indicates that when data are made available to participants, the quality of data reported improves and the response rate increases.
6. The planning and administration of the survey will identify and address potential barriers to participation, especially those related to the web-based technology to be used.
7. Messaging, including scripts to guide follow-up efforts, will be reviewed by staff who specialize in communication to help ensure clarity and effectiveness.
4. Tests of Procedures or Methods to be Undertaken
To mitigate any undue burden and to maximize the effectiveness and utility of the survey instrument, a pilot test was conducted with nine CoP participants. The pilot tested whether the questions were clearly stated and captured the intended information. The results of the pilot were used to refine the wording of questions that were unclear or misunderstood. The pilot test was also used to determine the actual time of administration for the intended audience and to ensure that the target range was achieved.
The pilot included a maximum of nine CoP participants in the survey frame, with invitations to participate based on Project Officer recommendations. The CoP participants who accepted the invitation were asked to complete the survey at a pre-arranged time, noting the total time for completion. Participants were provided with information concerning the nature and intent of the data collection effort, along with a paper-based, loosely formatted tool for noting issues as they emerged and recording their observations about the survey.
Survey analyses will present univariate and bivariate distributions of key variables, in tabular format, for the core areas of the survey. The HRSA Part D Recipient CoP evaluation uses a series of interdependent analysis frameworks selected to maximize coverage of the key evaluation questions posed for assessing the objectives of the HRSA Part D Recipient CoPs. The analysis plan proposes a series of analyses that move from basic descriptive analyses (e.g., means, frequencies, percentages) to more sophisticated quantitative techniques.
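To illustrate the descriptive analyses, the following sketch shows how univariate frequencies and a bivariate cross-tabulation could be produced in Python with pandas, assuming survey responses have been exported from the online survey platform to a file with hypothetical column names (cop, role, satisfaction); it is an illustration of the approach, not the project's final analysis code.

# Illustrative sketch only; the file name and column names (cop, role, satisfaction) are hypothetical.
import pandas as pd

# Load survey responses exported from the online survey platform.
responses = pd.read_csv("cop_survey_responses.csv")

# Univariate distributions: frequencies and percentages for key variables.
for var in ["cop", "role", "satisfaction"]:
    counts = responses[var].value_counts(dropna=False)
    percents = (counts / counts.sum() * 100).round(1)
    print(pd.DataFrame({"n": counts, "percent": percents}))

# Bivariate presentation: satisfaction ratings cross-tabulated by CoP, with totals.
print(pd.crosstab(responses["cop"], responses["satisfaction"], margins=True))

# Mean satisfaction by CoP and respondent role, presented in tabular form.
print(responses.groupby(["cop", "role"])["satisfaction"].mean().unstack())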
The HRSA Part D Recipient CoP evaluation will use a pre/post design. The evaluation design and evaluation questions guided the selection of the analysis framework. In addition to descriptive analyses, appropriate statistical techniques will be used to estimate effectiveness under a repeated measures design.
A repeated measures design involves measuring the same variable on the same subjects at multiple points in time or under multiple conditions. In a repeated measures ANOVA, the within-subjects variability is partitioned into different sources of variation, including the effect of the independent variable (such as the CoP coaching), the effect of time, and the interaction between the independent variable and time.
The repeated measures ANOVA has several advantages over other types of ANOVA, including increased power, reduced error variance, and the ability to control for individual differences between subjects. However, it also has some assumptions that need to be met, such as normality of the distribution of the outcome variable and sphericity (the equality of variances of the differences between all pairs of conditions or time points). Violations of these assumptions can affect the validity of the results.
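As a minimal sketch of how the repeated measures analysis could be carried out, the code below fits a repeated measures ANOVA using the statsmodels package, assuming the pre, post, and follow-up scores have been reshaped into long format with hypothetical columns respondent_id, time, and score; the tooling, file name, and variable names are illustrative and are not a specification of the project's software.

# Illustrative sketch; long-format data with hypothetical columns respondent_id, time, score.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

long_scores = pd.read_csv("cop_knowledge_scores_long.csv")

# One row per respondent per time point; 'time' takes values such as pre, post, follow-up.
aov = AnovaRM(
    data=long_scores,
    depvar="score",           # outcome: knowledge or practice score
    subject="respondent_id",  # repeated measurements are nested within respondents
    within=["time"],          # within-subjects factor: measurement occasion
).fit()
print(aov)

# AnovaRM does not test sphericity; with more than two time points, a correction or one of
# the alternative models described below would be considered if sphericity appears violated.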
If the assumptions of repeated measures ANOVA are violated, we will consider using other statistical approaches such as mixed-effects models or generalized estimating equations (GEE). These methods can provide more flexibility and can handle missing data, non-normality, and other issues that may arise in repeated measures designs.
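A comparable sketch of those alternatives, again using the same hypothetical long-format data (respondent_id, time, score) and statsmodels, is shown below; the model specifications are illustrative only.

# Illustrative sketch of the alternatives: a linear mixed-effects model and a GEE.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

long_scores = pd.read_csv("cop_knowledge_scores_long.csv")

# Mixed-effects model: fixed effect of time, random intercept per respondent.
# Tolerates missing time points and does not require sphericity.
mixed = smf.mixedlm("score ~ C(time)", data=long_scores,
                    groups=long_scores["respondent_id"]).fit()
print(mixed.summary())

# GEE: population-averaged effect of time, with an exchangeable working
# correlation among each respondent's repeated measurements.
gee = smf.gee("score ~ C(time)", groups="respondent_id", data=long_scores,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()
print(gee.summary())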
Data collected for this project will provide valuable information that can inform the healthcare literature on Pre-Conception Counseling, Youth Transitioning into Adult HIV Care, and Trauma-Informed Care. The reports or publications that come out of this project will not attempt to make any national estimates based on the information collected from the surveys. Instead, they will focus on the usability properties of the CoP assessment tools. In addition, the data will help inform performance reporting on various future CoP topics for HRSA programs.
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Enhancing HIV Care of WICY Project Personnel (contractors)
These individuals will initiate the data collection activities and supervise a project coordinator in preparing the survey platform with the CoP assessment questions and skip logic, administering the surveys, and implementing the non-responder follow-up plan. The team will ensure the provision of reports on the survey response rate and initiate interventions in the event that the response rate is not satisfactory.
Rhonda Waller, Ph.D.
Enhancing HIV Care of WICY Project Director
Bizzell US
Phone: 678-695-8136
Email: rwaller@bizzellus.com
Kazi Ahmed, Ph.D. (designed the data collection, will collect the data, and will analyze the data)
Enhancing HIV Care of WICY Program Evaluator
Bizzell US
Phone: 301-798-9123
Email: kahmed@bizzellus.com
William Scarbrough, Ph.D.
Vice President, Health Solutions
Bizzell US
Phone: 202-938-3895
Email: wscarbrough@bizzellus.com
Janis Sayer, Ph.D. (designed the data collection, will collect the data, and will analyze the data)
Senior Scientist, Behavioral Health
Advocates for Human Potential, Inc.
Phone: 978-831-9342
Email: jsayer@ahpnet.com
HRSA Personnel
Gail Glasser
Government Contracting Officer's Representative
Senior Project Officer, Division of Community HIV/AIDS Programs, HIV/AIDS Bureau, Health Resources and Services Administration
Phone: 301.443.1214
Email: gglasser@hrsa.gov
Ijeamaka Ogbonna, MPH
Government Task Lead
Public Health Advisor, Division of Community HIV/AIDS Programs, HIV/AIDS Bureau, Health Resources and Services Administration
Daytime Phone: 301.945.9638
Email: iogbonna@hrsa.gov