Do
The second phase of the PDCA cycle is doing: on-the-ground implementation. Questions to ask in this step include the following: How is it working? Are we on target with established timelines? What evidence do we have?
Documenting challenges as well as unexpected and positive findings is useful. Focus areas include training and professional development on the specific program or practice; coaching, supervision, and communities of practice; implementation; adaptation; and monitoring and evaluation.[39]
In a review of various implementation processes, Meyers and colleagues[40] describe three tasks that occur within this step as well as questions to enable action:
| “Do” Task | “Do” Action Question |
| --- | --- |
| Providing needed ongoing technical assistance, coaching, and supervision to frontline providers | Do we have a sound plan in place to provide needed technical assistance? |
| Monitoring implementation | Are we assessing the strengths and limitations that occur during implementation? (See additional questions below.) |
| Creating feedback loops so there is an understanding of how things are moving forward | Is the feedback system rapid, accurate, and specific enough that successes can be recognized and changes to improve implementation be made quickly? |
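These three tasks lend themselves to lightweight tracking. Below is a minimal sketch, in Python, of one way a team might log monitoring observations and flag slow feedback loops; the guide prescribes no tooling, and every name here (`Observation`, `FEEDBACK_TARGET_DAYS`, the 14-day threshold) is an assumption for illustration, not part of the framework.

```python
from dataclasses import dataclass
from datetime import date

# Assumed threshold: how quickly feedback should reach decision makers.
# The guide sets no specific number; 14 days is an illustrative choice.
FEEDBACK_TARGET_DAYS = 14

@dataclass
class Observation:
    """One monitoring record from the "Do" phase."""
    observed: date      # when the strength or limitation was seen
    reported: date      # when it reached the implementation team
    description: str
    is_strength: bool   # True for a success, False for a limitation

def feedback_lag_days(obs: Observation) -> int:
    """Days between observing something and the team hearing about it."""
    return (obs.reported - obs.observed).days

def slow_feedback(log: list[Observation]) -> list[Observation]:
    """Flag records whose feedback loop exceeded the target."""
    return [o for o in log if feedback_lag_days(o) > FEEDBACK_TARGET_DAYS]

# Example: one limitation took three weeks to surface, so it gets flagged.
log = [
    Observation(date(2014, 1, 6), date(2014, 1, 8),
                "Walk-in demand spikes at lunch", is_strength=False),
    Observation(date(2014, 1, 10), date(2014, 1, 31),
                "Scheduling conflicts at two clinics", is_strength=False),
]
for o in slow_feedback(log):
    print(f"Slow feedback ({feedback_lag_days(o)} days): {o.description}")
```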
The answers to these questions may identify additional training needs, supports for managing the challenging parts of the program, conflicts that need to be resolved (such as administrative or scheduling issues), and necessary changes in program implementation.
Many initiatives fail for lack of study and reflection on what is actually being done and the results of having done it.[41] Observing, describing, and documenting are critical during this stage, when key functions of programs are emerging. Seven questions that implementation teams can use to promote continuous improvement are noted below (a sketch for recording the answers follows the list):
- What does the program look like now?
- Are we satisfied with how the program looks?
- What would we like the program to look like?
- What would we need to do to make the program look like that?
- How will we know whether we’ve been successful with the program?
- What can we do to keep the program like that?
- What can we do to make the program more efficient and durable?[42]
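One illustrative way to work through these questions is to record the answers at each review so that change over time stays visible. The sketch below assumes a simple JSON Lines log; neither the format nor the function name comes from the guide.

```python
import json
from datetime import date

# The seven continuous-improvement questions quoted above.
QUESTIONS = [
    "What does the program look like now?",
    "Are we satisfied with how the program looks?",
    "What would we like the program to look like?",
    "What would we need to do to make the program look like that?",
    "How will we know whether we've been successful with the program?",
    "What can we do to keep the program like that?",
    "What can we do to make the program more efficient and durable?",
]

def record_review(answers: list[str], path: str = "reviews.jsonl") -> None:
    """Append one dated review (an answer per question) to a JSON Lines file."""
    if len(answers) != len(QUESTIONS):
        raise ValueError("Provide one answer per question.")
    entry = {"date": date.today().isoformat(),
             "responses": dict(zip(QUESTIONS, answers))}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```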
Continuing to use data during this stage can help teams address barriers and develop systems solutions quickly, rather than allowing problems to recur.[43]
Example of Action
The Public Health Department of Maricopa County, Arizona, used Plan-Do-Study-Act (PDSA) to improve the reach of its Special Supplemental Nutrition Program for Women, Infants, and Children (WIC).[44] The county had seen a significant decline in women seeking WIC, with potential consequences for health outcomes such as lower birth weights and lower cognitive development. As of June 2013, the program served 68,711 participants. The Health Department used PDSA to determine the cause of the decline and test improvements. A core team of WIC staff and county and state stakeholders convened. In planning, the team identified root causes and potential areas for improvement and considered why the drop in WIC enrollment had occurred. Using data to inform decisions, the team made a plan of action and set a goal of 72,500 cases by June 2014, to be reached in part by serving more walk-in clients. Decisions and actions followed:
- Staffing schedules were changed to accommodate fluctuations in walk-in demand. For example, demand was higher during lunch hours, so more staff were scheduled from 11:00 a.m. to 1:00 p.m.
- Two clinics were converted to serve walk-in clients only.
In June 2014, 14 months after the initiative began, Maricopa County’s caseload was at its highest level since late 2012. At the time the study of these events was published, the caseload was on track to hit the target of 72,500 cases.
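As a rough sense of the scale of that goal, the arithmetic is simple. The sketch below uses only the figures quoted above; the even monthly pacing is an assumed illustration, not something the study reports.

```python
# Figures from the case study above; the even monthly pacing is an
# assumed illustration, not something the study reports.
baseline = 68_711   # participants as of June 2013
target = 72_500     # goal for June 2014

gap = target - baseline         # 3,789 additional participants
pct = 100 * gap / baseline      # about a 5.5% increase
per_month = gap / 12            # ~316 per month if paced evenly

print(f"Gap: {gap:,} participants ({pct:.1f}% over baseline)")
print(f"Even pacing: about {per_month:.0f} added per month")
```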
To read more about the work in Maricopa County, see the Resources section of this guide.
[39] Ontario Centre of Excellence for Child and Youth Mental Health (2013). Implementing evidence-informed practice: A practical toolkit. Ottawa, Ontario: Author. Retrieved from https://www.cymh.ca/modules/ResourceHub/?id=874A13B4-95EE-4B22-BD9D-717D085E898D.
[40] Meyers, D. C., Durlak, J. A., & Wandersman, A. (2012). The Quality Implementation Framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50(3), 462–480. Retrieved from http://link.springer.com/article/10.1007%2Fs10464-012-9522-x#/page-1.
[41] Metz, A., Naoom, S. F., Halle, T., & Bartley, L. (2015). An integrated stage-based framework for implementation of early childhood programs and systems (OPRE 2015-48). Retrieved from https://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/resources/OPRE-stage_based_framework_brief_508.pdf.
[42] See footnote 41.
[43] Metz, A., & Albers, B. (2014). What does it take? How federal initiatives can support the implementation of evidence-based programs to improve outcomes for adolescents. Journal of Adolescent Health, 54, S92–S96.
[44] Eisen-Cohen, E. (2015). “We influence change”: Applying PDSA to increase the reach of WIC within the Maricopa County Department of Public Health. Retrieved from https://childcareta.acf.hhs.gov/systemsbuilding/systems-guides/design-and-implementation/plan-do-check-act/do.