Issue #20

June 25, 2025

Learning While Delivering

This week: Turning post-implementation reviews into engines of institutional intelligence.

Insight

Most post-implementation reviews check boxes instead of building better systems.

If you’re overseeing public delivery programs, you know how often decisions are made quickly—with little evidence and even less time to reflect. And when evidence does exist, it’s usually in the form of raw data or fading institutional memory.

Yet most teams don’t prioritize building that evidence base. Post-implementation reviews are typically treated as formalities—designed to validate success, not generate insight.

But when done right, these reviews are among the most powerful and underused tools for improving how government delivers. Intentionally planning how you’ll collect data, capture insight, and feed it back into implementation is how systems get smarter over time.

Learning isn’t a final step; it’s part of the implementation architecture.

Insight in Practice

Four ways to turn reviews into learning loops that fuel better decisions:

  • Use the budget process.
    Draw on tools from evidence-informed budgeting—like requiring outcome summaries or learning memos in budget submissions—to make sure insights from past programs shape what gets funded next.

    This ensures that funding decisions rest on prior insight and that new evidence is collected for the current program’s own post-implementation review.
  • Create new information flows.
    Don’t create new departments; create new information flows. Establish regular “learning reviews” where program managers present what they’ve learned (not just what they’ve accomplished). Make this a standard agenda item, not a special meeting.

    Learning reviews don’t require new infrastructure. They just need airtime with the right people: Program Execs, ADMs, DGs.
  • Start small.
    Pick one program/policy area where you have a willing champion and the political space to experiment. Establish a simple data collection system from day one.

    To get the most accurate data, keep in mind that stakeholders are more open to surfacing failure when they know it’s tied to learning, not blame.

    Gradually expand successful practices to other areas.
  • Focus on decision points, not process.
    Don’t try to fix the whole system. Start by mapping the 5–10 most important decisions your organization makes each year—like allocating training budgets, renewing pilot programs, or selecting delivery partners.

    Then design specific evidence inputs to support those decisions.

    Small changes in key moments lead to smarter systems.

Perspectives

To turn post-implementation reviews into a source of institutional intelligence, consider using their insights to shape future implementation checklists.

"We live in a world of great and increasing complexity, where even the most expert professionals struggle to master the tasks they face. (…) Checklists seem able to defend anyone, even the experienced, against failure in many more tasks than we realized. They provide a kind of cognitive net. They catch mental flaws inherent in all of us—flaws of memory and attention and thoroughness. (…) It is common to misconceive how checklists function in complex lines of work. They are not comprehensive how-to guides, whether for building a skyscraper or getting a plane out of trouble. They are quick and simple tools aimed to buttress the skills of expert professionals.”
— Atul Gawande, The Checklist Manifesto

In this brutally honest account of city policy delivery from Vital City, a team tries to reduce crime in public housing with little evidence to back its choices. It highlights the opportunity of embedding learning early—and the cost of skipping it.

“We scrutinized crime rates in the city’s public housing developments and identified the small number of places that bore the brunt of violence. We put together a budget (…) to have immediately visible assets and programs materialize in the key neighborhoods.

At the top of our list were two things: keeping community centers open late (…) and lighting the dark places (…) These ideas did not come out of deep research or a thorough literature review. It was more in line with the “we’ve heard” and “everyone knows” school of thought, as in “we’ve heard that if kids have nowhere to go at night, they’ll get into trouble” and “everyone knows you get knifed in a dark alley.”

But we had almost no idea how to answer a fundamental question: Would all or any of this reduce crime? Who would do such an evaluation?”
— Politics, evidence and sheer dumb luck, Vital City

Question to Consider

What would change if your post-implementation reviews focused less on what was done right—and more on what was learned?

Quote of the Week

“Experience is not what happens to you. It is what you do with what happens to you.”

— Aldous Huxley

Start now. We’re here to help!