Evaluation of AHRQ Initiative to Accelerate the Dissemination of PCOR Findings into Primary Care R01
Technical Assistance Conference Call Transcript
Moderator: David Meyers
April 24, 2014, 3:30 p.m. ET
Please note: The FOA should be considered the source document for applications.
Coordinator: Welcome, and thank you for standing by. At this time, all participants are in listen-only mode. During the question-and-answer portion, you may press star one from your touchtone phone. Today's conference is being recorded. If you have any objections, you may disconnect at this time. Now I will turn the conference over to Dr. David Meyers. You may begin.
David Meyers: Thank you. Good afternoon, everyone, and welcome to our technical assistance call for RFA HS-14-009, also known as "Evaluation of AHRQ Initiative to Accelerate the Dissemination and Implementation of PCOR Findings into Primary Care".
During today's call, we'll be providing an overview of the FOA, providing answers to frequently asked questions, and conducting an open forum to respond to additional questions. As we begin, I'd like to invite the AHRQ staff members who are working on this initiative and are with me on the call today to introduce themselves.
I'll begin. I'm David Meyers, and I direct the Center for Primary Care Prevention and Clinical Partnership.
Debbie Rothstein: I work in the Office of Extramural Research and Priority Populations at AHRQ.
Rebecca Roper: I work in the Improving Primary Care group.
Janice Genevro: I lead the Improving Primary Care group within the Center for Primary Care.
Phillip Jordan: Center for Primary Care.
Bill Borden: Hi. Center for Outcomes and Evidence.
Kishena Wadhwani: I work on the review aspect of the application.
Galen Gregor: Grants Management. I'm listed on the FOA.
David Meyers: Okay. And on the phone today listening in—I'm not sure if he has an open line—is our colleague Bob McNellis, who is one of the FOA writers.
Those folks on the call should have received a set of slides that we'll be using today, but all the information will be in the oral presentation as well. The purpose of the FOA is to conduct a rigorous external evaluation of an AHRQ grant-funded initiative to disseminate and implement PCOR findings to improve heart health and to improve the capacity of primary care practices to implement PCOR findings into practice.
The three goals of this AHRQ evaluation FOA are to 1) provide a summative evaluation of the findings from each individual implementation grantee; 2) extract, examine, and rapidly report key themes and findings from across the implementation grantees; and 3) to evaluate comparative effectiveness of the dissemination and implementation strategies utilized by the different implementation grantees, with particular attention to analysis of the contextual and environmental factors and their influences.
Through a separate funding announcement (RFA-HS 14-008), AHRQ is soliciting proposals to disseminate PCOR evidence directly to primary care practices and to support the practices in implementing evidence-based clinical and organizational findings. Applications to this companion FOA will propose comprehensive approaches to utilize evidence-based quality improvement techniques, such as practice facilitation, that are designed to improve the capacity of primary care practices to implement new PCOR evidence into the delivery of care.
Funded projects will work with primary care practices over a 3-year period to disseminate PCOR findings and support primary care practices in implementing them for the purpose of improving heart health with a focus on the ABCS—aspirin use for high-risk individuals, blood pressure control, cholesterol management, and smoking cessation.
The primary objectives of that separate funding announcement are to 1) disseminate and implement PCOR findings particularly related to the ABCS, and 2) build primary care practices' capacity to receive and incorporate other PCOR clinical and organizational findings in the future.
Each project will conduct a broad internal evaluation that includes routine measurement of the ABCS, measurement of practice capacity to implement PCOR findings, analysis of both internal and external contextual factors, and evaluation of their process and implementation.
Each grant will also include some form of control or comparison group. AHRQ intends to make separate awards to fund up to eight cooperatives that will each work with a minimum of 250 small- to medium-sized primary care practices.
Earlier today AHRQ held the technical assistance call for this companion FOA—the transcript of which will be posted on the AHRQ funding announcement Web site in the coming weeks. I strongly encourage anyone interested in applying to the evaluation FOA to carefully review the entire companion FOA.
FOA 14-009 seeks an overarching evaluation that is summative and complementary to the individual evaluations that will be conducted by each of the up to eight implementation grantees. The external evaluation will thus require close collaboration with the implementation grantees. Specifically, AHRQ seeks an overarching evaluation that includes the following five aspects: 1) an assessment of improvements in the delivery of the ABCS by primary care practices supported by the initiative, 2) an assessment of changes in the capacity of primary care practices involved in the initiative to implement new PCOR findings into practice, 3) an assessment of how each grantee actually implemented their proposed approaches, 4) an evaluation of the relative effectiveness of the different comprehensive approaches to disseminating and implementing PCOR findings to improve heart health and build practice capacity, and 5) an assessment of the internal and external contextual factors and environmental factors that led to various dissemination and implementation approaches to being successful or not successful in different settings.
AHRQ also expects that a successful applicant will be prepared and able to clearly communicate the results and insights of this multilevel evaluation with multiple stakeholders in a timely manner.
Next let's turn to some of the basics of this FOA. AHRQ is utilizing an R01 mechanism and intends to make a single award. The grants are limited to $3.5 million total costs per year, and for AHRQ, total cost means your direct costs plus your indirect costs. For this grant, direct costs plus indirect costs may not exceed $3.5 million per year, and the project itself may not exceed 4 years.
Grants are made to organizations, not individuals. Eligible organizations that may submit and lead applications under this FOA include public and nonprofit private institutions, units of local or State government, eligible agencies of the Federal Government, and Indian or Native American tribal governments and their designated organizations. For-profit organizations and foreign institutions are not eligible to lead applications.
We're often asked if for-profit organizations may be sub-grantees under this FOA. HHS grants policy requires that the main grant recipient perform a substantive role in the conduct of the planned project or program activity and not merely serve as a conduit of funds to another party or parties.
That said, if a consortium of activities represents a significant portion of the overall project, the applicant must justify why the applicant organization, rather than the party performing that portion of the overall project, should be the grantee and what substantive role the applicant organization will play.
If you'd like more information about this, you can ask questions later or seek guidance from either Dr. Wadhwani or Ms. Gregor, who are listed in the FOA.
The application will be led by a program director/principal investigator (PD/PI). Any individual with the knowledge, skills, and experience required to carry out the proposed research is eligible to serve as the project PD/PI.
There are no degree requirements for a successful PD/PI. They're evaluated on their knowledge, skills, and experience and ability to lead the project. Sometimes we are asked whether the PD/PI must be employed by the lead organization submitting the application. They do not need to be, but they must be accountable to the organization submitting the application.
Additionally, AHRQ requires on all of our grant applications, including applications to both FOA 14-008 and 14-009, that there be one, and only one, listed PD/PI per application. You can have many, many important people on the application. If you want to make them feel important by giving them fancy titles, you may, but you may only call one person the project program director/principal investigator.
As we will discuss later on regarding this application, no individual may be listed as the PD/PI on applications to both this FOA and the companion FOA 14-008.
The research strategy section of the application is limited to 30 pages and must include the following sections: 1) understanding of the issues (maximum length 3 pages), 2) organization and team (maximum length 5 pages), 3) evaluation plan (maximum length 20 pages), 4) project timeline (maximum length 2 pages), and 5) dissemination strategy (maximum length 3 pages). Please note the section maximum page limits that are listed in the FOA are intended to allow flexibility in developing the research plan. If an applicant maximized each section, the total would be greater than 30 pages. Applications that exceed the maximum total allowance for the research strategy will not be reviewed. Please make sure you understand this guidance.
I'm now going to read to you a series of quotes from the FOA that we've pulled out just as details and highlights that you may want to pay attention to as you consider developing your application. Applicants should demonstrate consideration of issues related to health and health care disparities across all aspects of the planned evaluation. You should describe a plan for developing a productive relationship with all of the implementation grantees, and discuss how you will establish these relationships with implementation grantees, refine data collection approaches collaboratively, and execute your evaluation within the timeframe of the evaluation grant.
Grant applications should detail plans for data standardization, collection, quality assurance, and management. You may budget funds to compensate implementation grantee organizations for efforts related to additional data collection for the overarching evaluation. Applicants should plan on conducting site visits to all—up to eight—implementation grantees.
Additionally, applicants should plan to conduct site visits to multiple primary care practices throughout the project period. You may consider collecting other practice- or patient-level data in order to produce a richer overall evaluation. You may budget funds to compensate primary care practices for efforts for data collection activities related to the overarching evaluation.
Your application must describe a plan for maintaining flexibility and allowing for the adaptation of the evaluation approach. You must plan and budget for six team members, including the PD/PI, to travel to the Washington, DC area once a year for each of the first 3 years of the grant to participate in a 2-day meeting with the implementation grantees and AHRQ. The first meeting is expected to be held in the spring of 2015.
Finally, it is expected that results will be shared in a timely manner, with updates at least quarterly, and not only through peer-reviewed publications and presentations.
With those highlights ringing in your ears, I'd strongly recommend that you read the entire FOA completely and carefully. There are many, many other details.
Moving on to some high-level reminders. First, on budget: AHRQ does not accept modular budgets. AHRQ only uses the detailed research and related budget. Any application submitted in modular format will not be reviewed. As we mentioned earlier, the budget ceiling is for total costs. Total costs are direct plus indirect costs.
Matching funds are welcomed. In fact, they're encouraged, but they are not required for a successful application.
In preparing your application, I recommend you pay particular attention to the review criteria that will be utilized by peer reviewers in determining the merit of your proposal. You can see Section Five of the FOA for further details.
The scored review criteria include significance, investigators, innovation, approach, and environment. Sometimes folks overlook this, but in another section of the FOA, AHRQ describes what we will use as criteria in making the final selection of the grantee. AHRQ will consider the following five criteria in making our award decision: 1) scientific and technical merit of the proposed project as determined by peer review, 2) the availability of funds, 3) responsiveness to goals and objectives of the FOA, 4) relevance and fit of the application to our portfolio priorities as well as overall program balance, and 5) different than in many other FOAs you may have seen before, this FOA calls out as a selection criterion that AHRQ will consider the independence of the implementation grantee teams and the evaluation grantee team. We'll talk more about that in a moment.
There are a few dates highlighted in the FOA that I'd like to review with you today. Letters of intent are requested by May 23. The earliest submission date for applications is June 3, 2014. All applications must be successfully and completely submitted by July 3, 2014. AHRQ expects the independent peer review of these applications to be done in the fall of 2014 and that grants will begin in approximately February 2015.
I just said that we are requesting letters of intent. These nonbinding letters are extremely helpful to AHRQ staff and assist us in ensuring that your application gets the highest quality review possible. I strongly encourage anyone who's considering submitting an application to submit a letter of intent.
Letters should include the following information: the number and title of this funding opportunity; a descriptive title of the proposed activity, such as the title that you would put on your grant application; the name, address, and telephone number of the PD/PI; lead institution and all additional institutions that will be participating in the proposed application; and the names and institutions of other key personnel. Letters are requested by May 23 but may be submitted at any time. They should be sent via Email to our colleague Phillip Jordan at email@example.com. His name and information are found in the FOA.
Please put the words "letter of intent" and the number 14-009 in the subject line so we are sure to file and present your letter of intent appropriately. While we hope to answer many of your questions in the remainder of our time together today, we recognize and encourage you to ask for help throughout the open period for this grant. If at any time you have questions about FOA content, you can direct them to Phillip Jordan—firstname.lastname@example.org.
If you have questions about the peer review process, you can send them to our colleague Dr. Kishena Wadhwani. Dr. Wadhwani's Email is email@example.com.
For all financial matters, Ms. Galen Gregor is available to assist you: Galen.Gregor@ahrq.hhs.gov.
Now I'm going to turn over to a couple of questions that we've heard already and that we'd like to answer before we open the lines for questions from participants.
And the first question we've heard is: May an eligible organization submit applications to both the implementation and evaluation FOAs?
The answer to that question is yes; however, AHRQ is committed to ensuring that the evaluation is independent. As we just discussed, please pay attention to the selection criteria AHRQ will use in selecting and awarding the FOA. In preparing applications, organizations should carefully consider how they would maintain independence of the implementation and evaluation teams if they intend to apply to both FOAs, and how they can demonstrate this independence to both peer reviewers and AHRQ.
The second question we recently received was: Does AHRQ really expect the evaluation team to release results before the evaluation is complete?
The answer to that is yes. Applicants are required to propose how they will share preliminary results and insights at least quarterly throughout the project period. AHRQ seeks applicants that are committed to providing timely information to stakeholders including health care decision makers, organizations dedicated to primary care quality improvement, primary care practices and professionals, and the public throughout this project.
With that background, I'm happy now to turn it over to questions from our audience.
Coordinator: QUESTION/ANSWER PORTION OF CALL BEGINS
Question: Thank you. I have two quick questions. One is—will the questions we ask offline be available to the rest of the participants in writing?
Answer: In general, no. Any questions asked during today's call will be included in the official TA call transcript.
Question: Thank you. The second and more substantive question—if I understand it correctly from the previous call, there will be a meeting right after the local grants are awarded with the evaluator and AHRQ, of course. In a part of that call, it sounds like there will be some negotiation around standardizing data to be provided to the overall evaluation. And if that's correct, which I hope it is, will that also include measures such as standard measures of practice capacity?
Answer: Great series of questions, and thank you for participating on the earlier call. In both calls today, I mentioned—and earlier Bob McNellis mentioned—that there are grantee meetings for this project, the first one of which will be in the spring of 2015. And as I said earlier, we expect these awards will be made in February of 2015, so it is a very early meeting.
That meeting will involve participants from the up to eight (or what we expect will be eight) implementation grantees, as well as six team members from the evaluation grantee—the FOA we're talking about today—coming together with AHRQ staff for 2 days, and we expect these meetings will happen not only in 2015 but also in '16 and '17.
At that very first meeting, it is our intention—and it's written in both the implementation and the evaluation FOA—that we will share what has been proposed by everybody and that the group will try to build common metrics and consensus around measures for the ABCS, practice capacity, and any other areas that folks are interested in exploring. In addition, we anticipate that the meetings will involve working together to create the data sharing policies by which the implementation grantees will quarterly provide their updates, and their data on what's happening, to the evaluation team.
Nonetheless, any of you applying for this evaluation FOA are going to have to describe your overall approach in the FOA. This is your application, and what we ask is that you think about and describe how you're going to maintain flexibility to shift a little bit as a result of partnering with the implementation grantees.
I'm looking around the room for the other colleagues here who are working on this. Anything else you'd like to add? Okay, it seems like I got that mostly right. Thank you for the question.
Question: Hi, David. This is a bit of a followup that was prompted by listening to the first call, where you mentioned sort of that harmonization of data at that first meeting. And one of our questions is related to that because we're wondering if there is any sort of other technical assistance that the evaluation grantee would be expected to provide to the implementation grantees, especially in the effort to maintain a productive relationship with those grantees.
Answer: Very interesting question. Maintaining relationship—yes. It's expected that the evaluation people submitting an application to this FOA will talk about and put out resources so that they maintain open communication, data assurance, data quality kinds of things. But it is not our assumption that the applicant who receives this grant will be expected to provide lots of technical assistance on how the implementation grantees do their work or do their own evaluations.
The area that's sort of a gray area may be some technical assistance or support about how to provide the data to the central evaluation for the overarching evaluation. So things like creating a template that all of the implementation grantees would use to submit their data would be within scope. What we would not expect folks to be able to do would be things like teaching implementation grantees how they should conduct focus groups or standardizing how implementation grantees conduct focus groups.
Additional AHRQ Response: But also the evaluation grantee might provide guidance on possible data quality assurance procedures and guidance so that all of the grantees know what the overarching evaluator expects. There may be some assistance or guidance provided about that, I would imagine.
Answer: Right. So quality assurance about how to submit your data and what you do before you submit your data—helping them all have that, yes—but the evaluation grantee would not be expected to teach implementation grantees how to do modeling, interventions, or find PCOR findings.
Question: Thank you. Hi. Nice to hear your voices. So, the independence of the evaluator from the implementation grantees seems interesting. I know you've made it clear that AHRQ's selection criteria will look at that, and, you know, proving a negative is legendarily hard to do. I'm curious—in what ways have you thought about implementing that criterion? What particular kinds of relationships do you think are probably off the table, and which others are more gray?
Answer: I'll see if I can try to answer, and, yes, we recognize that proving the negative is always a challenge. But suppose an applicant—the University of South Manitoba, who I don't think is applying, so I can use them as an example—were to come in for this main overarching evaluation and were also a partner in a collaborative led by a group from Minnesota, doing the evaluation part of the implementation FOA for the Minnesota collaborative. In that case, it would be really hard for them to argue that the same evaluation group can do the overarching evaluation, which includes assessing how well Minnesota did, when they are themselves doing the evaluation as part of the Minnesota group.
If the University of South Manitoba, however, had a public health school and a general internal medicine program, and one of those groups got contacted by Minnesota and was working with them on the implementation, and the other one across campus wanted to come in for the overarching evaluation, then they would do things like show that none of the staff on the two projects are the same people.
On the other hand, if they say, "Well, actually all of the evaluation qualitative evaluators are the same. They have joint appointments." That would make them look not as good. So, is that helping to further show the difference?
Answer: And so one of the other things that we will be looking at, and we want folks to look at, is that while it is explicit that the same individual may not lead or be the PD/PI for both an 09 and an 08 application, being the lead for one and being key staff on the other would start triggering that need for proof. So, I think the key is ensuring an objective evaluation of the implementation program by demonstrating independence and objectivity.
Question: Could you put in an application for both of these RFAs but not accept one of them? In other words, you know, become both the grantee for implementation and evaluator but not accept one of the grants?
Response: Well, it's certainly possible, with the exception that you can't put the same person as PD/PI for both. You'd have to think for yourself before you did that about how AHRQ—how the study section, but more importantly how AHRQ—would try to pull things together. We have the ability to say, well, if we give it to them here, then we take them away over here, and then we use somebody else over there—but that is not a game we like to play.
So it is allowed and, therefore, could be a way … it's a gamble.
Question: Yes, I have a question related to your request of the regular updates of the data. Is that envisioned to be aggregate across all eight or individually reported by each center that's in the implementation sites?
Response: That's something for you all to propose to us—how you would like to do it. And again, the motivator there is that we recognize that people are looking for information about how PCOR findings can be disseminated more rapidly, and they don't want to wait 4 years to find out if this methodology works.
So what—how do you communicate to policymakers what's going on in this FOA? How do you communicate to practices and health care professionals and QI groups about what you're learning? And it may be that at some points you're just doing snippets and lessons learned and insights, and other times you're giving data, and sometimes it's about different projects and other times it's overarching and pulling it together. That's up for you all to think about, what ways you would want to do that.
And we're assuming the application isn't necessarily going to say on June 12, you know, 2017, we will put out our third brief on X. It's going to be more an approach to how you'd like to move that forward. I know giving flexibility is a little scary, but we're hoping it's going to work here.
Question: In the first overarching goal, it is to provide a summative evaluation of findings from each implementation grantee. Could you clarify or talk a little bit more about that and particularly where it says the findings from each individual grantee? Is this function of the overarching evaluation a synthesizing and reporting function, or is this where you're looking to the overall evaluator if we've done some added data collection enrichment to include those in that particular area? Does that make any sense?
Answer: Yes, it very much makes sense. Thank you for the question. I think we're not so specific on that. Breaking it down, though, we do expect a summative evaluation of each of the eight individual projects; an evaluation across all eight of them, a larger look at what this whole program did; and, on top of that, some comparative work between the different approaches.
If you propose to collect additional data, which we want to encourage people to do, you might say, "The basics that the grantees are collecting are limited. We're going to do more than that, and here's how we're going to collect it and work towards that." If you incorporate that at level one, two, three, or all of the above, great.
David Meyers: Again, I'd like to strongly recommend that if you didn't participate in the early call that you both read the FOA carefully and read the technical assistance transcript because so much of what we're looking for in this evaluation is there. If at any time you have more specific questions, again Phillip Jordan for content of the FOA, Dr. Wadhwani for questions related to peer review, and Ms. Gregor for questions related to financial matters.
Okay then, with that we want to thank everybody for signing on today. We strongly, strongly encourage you to apply. We can't really say that enough. If you think you might be putting in an application, please submit a letter of intent. It is nonbinding, but it really does help make sure that we're able to provide all of you with the highest quality peer review when the time comes, so it is in both your advantage and our advantage. The application period opens June 3 and closes July 3. We look forward to hearing from you. Thank you.
Coordinator: Thank you for your participation in today's conference. You may now disconnect and have a good weekend.
Current as of May 2014