Design and Implementation

The purpose of the third phase of PDCA is to check on the results: review the data, compare what has happened to what was planned and expected, and make decisions about needed improvements. Is the program going as planned? How do the data compare to what was expected? What worked? What did not work, and why? What did you learn? Did anything surprise you? Could you make implementation more efficient? Are you meeting the needs of the community you intend to serve? These questions are best answered by using data to monitor and measure progress on a regular basis.

Even with a well-articulated plan, missteps and mistakes are likely. The key to progress is what happens after a problem is identified. Leaders should gain an understanding of what happened and why, then correct course. Document what happened. Be honest and transparent about what went wrong, what is being done to fix it, and what lessons have been learned. When things do not go according to plan, when results do not match expectations, or when results are not positive, it may take boldness and courage to acknowledge what happened and make changes.

Example of Checking

An example of checking is found in the work of the Nurse-Family Partnership (NFP). This home-visiting program for low-income, first-time mothers had a system for collecting and reporting data that the NFP used throughout implementation in community settings. It provided information on how implementation of key features was going; whether there were indications of positive effects from the program; descriptive data on the target population; aspects of implementation such as the frequency, duration, and content of home visits; data on program management practices (such as how often reflective supervision occurred); and other specific observable items (such as tobacco and alcohol use during pregnancy and results of developmental screenings).

The data system relied on information from the ground up—reports from every supervisor and nurse, on every home visit—and it allowed regional staff to recognize and resolve problems: “When patterns of concern are observed in data from many different implementing agencies, changes can be planned in the guidance provided to new agencies [and] the education required from all new NFP home visitors and supervisors.” This approach also elevated pervasive issues so that “they could then be addressed by systematically strengthening implementation supports.”[48]


[48] Halle, T., Metz, A., & Martinez-Beck, I. (2013). Applying implementation science in early childhood programs and systems. Baltimore: Brookes Publishing. Page 197.