Early Childhood Systems Building Resource Guide

The second phase of the PDCA cycle is doing—on-the-ground implementation. In this step, questions to ask include the following: How is it working? Are we on target with established timelines? What evidence do we have?

Documenting challenges as well as unexpected and positive findings is useful. Focus areas include training and professional development on the specific program or practice; coaching, supervision, and communities of practice; implementation; adaptation; and monitoring and evaluation.[1]

In a review of various implementation processes, Meyers and colleagues[2] describe three tasks that occur within this step as well as questions to enable action:

Table 4. “Do” Tasks and Enabling Questions
“Do” Task: Providing needed ongoing technical assistance, coaching, and supervision to frontline providers
“Do” Action Question: Do we have a sound plan in place to provide needed technical assistance?

“Do” Task: Monitoring implementation
“Do” Action Question: Are we assessing the strengths and limitations that occur during implementation? (See additional questions below.)

“Do” Task: Creating feedback loops so there is an understanding of how things are moving forward
“Do” Action Question: Is the feedback system rapid, accurate, and specific enough that successes can be recognized and changes to improve implementation can be made quickly?

The answers to these questions may identify additional training needs, supports for managing the challenging parts of the program, conflicts that need to be resolved (such as administrative or scheduling issues), and necessary changes in program implementation.

Many initiatives fail for lack of study and reflection on what is actually being done and the results of having done it.[3] Observing, describing, and documenting are critical during this stage, when key functions of programs are emerging. Seven questions that implementation teams can use to promote continuous improvement are noted below.

  1. What does the program look like now?
  2. Are we satisfied with how the program looks?
  3. What would we like the program to look like?
  4. What would we need to do to make the program look like that?
  5. How will we know whether we’ve been successful with the program?
  6. What can we do to keep the program like that?
  7. What can we do to make the program more efficient and durable?[4]

Continuing to use data during this stage can help teams address barriers and develop systems solutions quickly rather than allowing problems to reemerge and recur.[5]

Example of Action

The Public Health Department of Maricopa County, Arizona, used Plan-Do-Study-Act to improve the reach of its Special Supplemental Nutrition Program for Women, Infants, and Children (WIC).[6] The county had seen a significant decline in women seeking WIC, with potential consequences of negative health outcomes such as lower birth weights and lower cognitive development. As of June 2013, the program was at 68,711 participants. The Health Department team used PDSA to determine the cause and test improvements. A core team of WIC staff and county and state stakeholders convened. In the planning phase, the team identified root causes and potential areas for improvement and considered why the drop in WIC enrollment had occurred. Using data to inform decisions, the team made a plan of action and set a goal of increasing walk-in clients to reach 72,500 cases by June 2014. Decisions and actions followed:

  • Staffing schedules were changed to accommodate fluctuations in walk-in demand. For example, demand was higher during lunch hours, so more staff were scheduled from 11:00 a.m. to 1:00 p.m.
  • Two clinics were converted to serve walk-in clients only.

In June of 2013, 14 months after the initiative began, Maricopa County’s caseload was at its highest level since late 2012. At the time the study of these events was published, the caseload was on track and expected to hit the target of 72,500 cases by June 2014.

To read more about the work in Maricopa County, see the Resources section of this guide.


[1] Ontario Centre of Excellence for Child and Youth Mental Health (2013). Implementing evidence-informed practice: A practical toolkit. Ottawa, Ontario: Author. Retrieved from http://www.excellenceforchildandyouth.ca/sites/default/files/docs/implementation-toolkit.pdf.

[2] Meyers, D. C., Durlak, J. A., & Wandersman, A. (2012). The Quality Implementation Framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50(3), 462–480. Retrieved from http://link.springer.com/article/10.1007%2Fs10464-012-9522-x#/page-1.

[3] Metz, A., Naoom, S. F., Halle, T., & Bartley, L. (2015). An integrated stage-based framework for implementation of early childhood programs and systems. OPRE research brief 2015-48. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

[4] See footnote 1.

[5] Metz, A., & Albers, B. (2014). What does it take? How federal initiatives can support the implementation of evidence-based programs to improve outcomes for adolescents. Journal of Adolescent Health, 54, S92–S96.

[6] Eisen-Cohen, E. (2015). “We influence change”: Applying PDSA to increase the reach of WIC within the Maricopa County Department of Public Health. Public Health Quality Improvement Exchange. Retrieved from https://www.phqix.org/content/we-influence-change-applying-pdsa-increase-reach-wic-within-maricopa-county-department.