Issue #20
June 25, 2025
This week: Turning post-implementation reviews into engines of institutional intelligence.
Most post-implementation reviews check boxes instead of building better systems.
If you’re overseeing public delivery programs, you know how often decisions are made quickly—with little evidence and even less time to reflect. And when evidence does exist, it’s usually in the form of raw data or fading institutional memory.
Yet most teams don’t prioritize building that evidence base. Post-implementation reviews are typically treated as formalities—designed to validate success, not generate insight.
But when done right, these reviews are among the most powerful and underused tools for improving how government delivers. Intentionally planning how you’ll collect data, capture insight, and feed it back into implementation is how systems get smarter over time.
Learning isn’t a final step; it’s part of the implementation architecture.
Four ways to turn reviews into learning loops that fuel better decisions:
To turn post-implementation reviews into a source of institutional intelligence, consider using their insights to shape future implementation checklists.
"We live in a world of great and increasing complexity, where even the most expert professionals struggle to master the tasks they face. (…) Checklists seem able to defend anyone, even the experienced, against failure in many more tasks than we realized. They provide a kind of cognitive net. They catch mental flaws inherent in all of us—flaws of memory and attention and thoroughness. (…) It is common to misconceive how checklists function in complex lines of work. They are not comprehensive how-to guides, whether for building a skyscraper or getting a plane out of trouble. They are quick and simple tools aimed to buttress the skills of expert professionals.”
— Atul Gawande, The Checklist Manifesto
In this brutally honest account of city policy delivery from Vital City, a team tries to reduce crime in public housing with little evidence to back its choices. It highlights both the missed opportunity and the cost of not embedding learning early.
“We scrutinized crime rates in the city’s public housing developments and identified the small number of places that bore the brunt of violence. We put together a budget (…) to have immediately visible assets and programs materialize in the key neighborhoods.
At the top of our list were two things: keeping community centers open late (…) and lighting the dark places (…) These ideas did not come out of deep research or a thorough literature review. It was more in line with the “we’ve heard” and “everyone knows” school of thought, as in “we’ve heard that if kids have nowhere to go at night, they’ll get into trouble” and “everyone knows you get knifed in a dark alley.”
But we had almost no idea how to answer a fundamental question: Would all or any of this reduce crime? Who would do such an evaluation?”
— Politics, evidence and sheer dumb luck, Vital City
What would change if your post-implementation reviews focused less on what was done right—and more on what was learned?
“Experience is not what happens to you. It is what you do with what happens to you.”
— Aldous Huxley