Evaluation and Storytelling: Proving Worth and Sharing the Impact

Master the CDC Framework for Program Evaluation, design effective data visualizations and dashboards, produce digital stories for advocacy, integrate all components into cohesive narratives, and deliver compelling final pitches.

You've planned, designed, and implemented. Now comes the question every funder, policymaker, and community member will ask: Did it work?

But proving worth isn't enough. You must also share the story—in ways that inspire continued support, inform policy decisions, and honor the communities you serve.

The Evaluation Imperative

Beyond Accountability

Evaluation serves multiple masters:

For Funders: Did the investment produce results?

For Policymakers: Should this be scaled or replicated?

For Practitioners: What worked and what didn't?

For Communities: Did this improve our lives?

For the Field: What can others learn?

Good evaluation answers all these questions. Great evaluation does so while building communities up rather than extracting from them.

The CDC Framework for Program Evaluation

A Systematic Approach

The CDC Framework provides structure without being overly rigid:

Six Steps:

  1. Engage Stakeholders: Who needs to be involved? Who will use the findings?

  2. Describe the Program: What is the program theory? (Connect to your logic model from Week 4)

  3. Focus the Evaluation: What questions matter most? What's feasible to measure?

  4. Gather Credible Evidence: What data will convince skeptics?

  5. Justify Conclusions: What standards determine success? How will you interpret findings?

  6. Ensure Use and Share Lessons: How will findings be used? Who needs to know?
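The six steps above can be treated as a checklist. A minimal sketch (the step names follow the CDC Framework; the draft-plan entries are hypothetical examples) shows how a team might track which steps an evaluation plan has not yet addressed:

```python
# The six CDC Framework steps, in order.
CDC_STEPS = [
    "engage_stakeholders",
    "describe_the_program",
    "focus_the_evaluation",
    "gather_credible_evidence",
    "justify_conclusions",
    "ensure_use_and_share_lessons",
]

def missing_steps(plan: dict) -> list:
    """Return the CDC steps that are absent or empty in the draft plan."""
    return [step for step in CDC_STEPS if not plan.get(step)]

# Hypothetical draft plan: three steps sketched, three still open.
draft_plan = {
    "engage_stakeholders": ["funders", "community advisory board"],
    "describe_the_program": "logic model from Week 4",
    "focus_the_evaluation": ["Did knowledge change?", "Was delivery feasible?"],
}

print(missing_steps(draft_plan))  # steps still to be planned
```

Running the check before fieldwork begins makes it obvious which steps (here, evidence gathering, conclusions, and dissemination) still need attention.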

Evaluation Types

Process Evaluation: Was the program delivered as planned? (fidelity, reach, dose)

Outcome Evaluation: Did the intended changes in knowledge, behavior, or health status occur?

Impact Evaluation: Can observed changes be attributed to the program rather than to other factors?

Connecting to Logic Models

Your Week 4 logic model becomes your evaluation blueprint:

| Logic Model Component | Evaluation Question |
|-----------------------|---------------------|
| Inputs | Were resources sufficient and appropriate? |
| Activities | Were activities implemented with fidelity? |
| Outputs | Did we produce intended deliverables? |
| Short-term Outcomes | Did knowledge/skills change? |
| Medium-term Outcomes | Did behaviors change? |
| Long-term Outcomes | Did health status improve? |
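The mapping above can also be held as data, so that gaps (a component with no measurable indicator) are flagged automatically. This is a sketch only: the component names and questions follow the table, while the indicators are hypothetical examples a team would replace with its own.

```python
# Logic model components mapped to evaluation questions and indicators.
# Indicators here are hypothetical placeholders.
evaluation_matrix = {
    "inputs":       {"question": "Were resources sufficient and appropriate?",
                     "indicators": ["budget spent vs. planned"]},
    "activities":   {"question": "Were activities implemented with fidelity?",
                     "indicators": ["sessions delivered vs. scheduled"]},
    "outputs":      {"question": "Did we produce intended deliverables?",
                     "indicators": ["participants reached"]},
    "short_term":   {"question": "Did knowledge/skills change?",
                     "indicators": ["pre/post knowledge score"]},
    "medium_term":  {"question": "Did behaviors change?",
                     "indicators": []},  # gap: no indicator chosen yet
    "long_term":    {"question": "Did health status improve?",
                     "indicators": ["clinic-reported outcome rates"]},
}

# Flag any component that still lacks a measurable indicator.
gaps = [name for name, row in evaluation_matrix.items() if not row["indicators"]]
print(gaps)
```

A gap here (the medium-term outcome in this example) means the logic model makes a claim the evaluation cannot yet test.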

Data Visualization and Dashboards

Data Must Be Visible to Be Actionable

A 50-page evaluation report that no one reads fails its purpose. Visualization transforms data into insight.

Dashboard Design Principles

Clarity Over Cleverness:

Audience Awareness:

Real-Time When Possible:

Common Visualization Mistakes

Chart Crimes to Avoid:

Better Practices:

The One-Page Dashboard

For executive audiences, distill to essentials:

Top Row: Key metrics with trend indicators

Middle: Progress toward objectives (process, outcome)

Bottom: Action items and decisions needed

If they want more, they can ask. But they won't read a report.
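The "key metrics with trend indicators" row can be generated rather than hand-drawn. Here is a minimal sketch, with hypothetical metric names and values, that derives an up/down/flat arrow from the two most recent observations:

```python
def trend(series):
    """Return an arrow comparing the last two observations in a series."""
    if len(series) < 2 or series[-1] == series[-2]:
        return "→"
    return "▲" if series[-1] > series[-2] else "▼"

# Hypothetical program metrics, oldest to newest.
metrics = {
    "Participants enrolled": [120, 150, 180],
    "Sessions completed":    [10, 10],
    "Follow-up rate (%)":    [82, 78],
}

# Render the dashboard's top row as plain text.
for name, values in metrics.items():
    print(f"{name}: {values[-1]} {trend(values)}")
```

The same logic feeds a slide, a spreadsheet, or a BI tool; the point is that trend direction is computed from the data, not asserted.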

Digital Storytelling for Advocacy

Beyond the Evaluation Report

Evaluation reports serve funders. Stories reach communities, policymakers, and the public.

"Data makes people think. Stories make people feel. People act on feeling, then justify with thinking."

Elements of Effective Digital Stories

The Human Element:

The Data Element:

The Multimedia Element:

Story Structure for Impact

The Challenge: What problem exists? Who faces it? Why does it matter?

The Solution: What was tried? How did it work? What made it different?

The Result: What changed? For whom? How do we know?

The Call: What should happen next? What can the audience do?

Ethical Storytelling

Consent and Control:

Representation:

Attribution:

Capstone Integration

Assembling the Whole

After eight weeks, you have: an assessment-grounded problem statement, a logic model and theory of change, a human-centered intervention design, an Agile implementation plan, a budget and sustainability strategy, and an evaluation plan.

The capstone integrates these into a coherent proposal.

Internal Consistency Check

Does everything align?

| Component | Alignment Question |
|-----------|--------------------|
| Problem Statement | Does it match your assessment findings? |
| Logic Model | Does theory of change support causal claims? |
| Intervention | Does it address persona pain points? |
| Implementation | Is the Agile plan realistic for available resources? |
| Budget | Do line items support planned activities? |
| Evaluation | Do indicators map to logic model? |

Misalignment reveals gaps in thinking. Fix them before presenting.
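The consistency check lends itself to a simple pass/fail walkthrough. A sketch, mirroring the alignment table (the True/False answers below are hypothetical), flags the components to revisit:

```python
# Each proposal component answers its alignment question True/False.
# These answers are hypothetical examples.
alignment = {
    "problem_statement": True,   # matches assessment findings
    "logic_model":       True,   # theory of change supports causal claims
    "intervention":      True,   # addresses persona pain points
    "implementation":    False,  # Agile plan exceeds available staff time
    "budget":            True,   # line items support planned activities
    "evaluation":        True,   # indicators map to logic model
}

# Anything False is a gap to fix before presenting.
misaligned = [component for component, ok in alignment.items() if not ok]
print(misaligned)
```

Running the walkthrough as a team, with each answer justified aloud, surfaces the gaps in thinking before a reviewer does.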

The Narrative Thread

Your proposal tells a story:

Act 1 (Weeks 1-3): The problem is real, urgent, and solvable. We understand it deeply.

Act 2 (Weeks 4-5): Our solution is evidence-based, theory-driven, and community-designed.

Act 3 (Weeks 6-8): We can implement effectively, sustain over time, and prove our impact.

Every section should advance this narrative.

The Final Pitch

Simulation of Reality

The capstone pitch simulates real-world pressure: a fixed time limit, a skeptical audience, and a specific ask.

The skills you develop serve you throughout your career.

Pitch Structure (10 minutes)

Opening Hook (30 seconds):

The Problem (2 minutes):

The Solution (3 minutes):

The Plan (2 minutes):

The Ask (1 minute):

Q&A (remaining time):

Handling Tough Questions

Budget questions: Know your numbers cold. "Our total ask is $X, with personnel at $Y..."

Evidence questions: Acknowledge limitations honestly. "The evidence is strongest for... We're still learning about..."

Sustainability questions: Show long-term thinking. "We've identified three potential pathways to sustainability..."

Scale questions: Be realistic. "We're starting with [scope] because... If successful, we could expand to..."

Common Pitch Mistakes

Too Much Detail: You have 10 minutes, not 60. Hit highlights; they'll ask if they want more.

Reading Slides: The audience can read. Add value beyond what's on screen.

Defensive Responses: Questions aren't attacks. "That's a great question" is almost always appropriate.

No Clear Ask: End with a specific request. "We're asking for $X to accomplish Y."

The Transformation Complete

What You've Become

Eight weeks ago, you were learning to plan programs. Now you are:

A Systems Thinker: Seeing connections, feedback loops, and leverage points

An AI-Augmented Analyst: Using technology to deepen and accelerate assessment

A Human-Centered Designer: Starting with empathy, designing with communities

A Logic Model Architect: Building clear pathways from inputs to impact

An Agile Implementer: Adapting to reality while maintaining direction

A Resource Mobilizer: Understanding money, sustainability, and advocacy

An Evaluator and Storyteller: Proving worth and sharing impact

This is modern public health practice.

The Ongoing Practice

Program planning isn't a course—it's a career. You'll use these skills throughout it.

The frameworks become second nature. The thinking becomes reflex.

The Impact Imperative

Remember why this matters:

Communities are waiting for solutions to problems that harm them daily. Policymakers need evidence to make better decisions. Resources are limited and must be allocated wisely.

Your ability to plan, implement, and evaluate effective programs—then tell their story compellingly—directly affects whether health improves.

This is the privilege and responsibility of public health practice.

Now go make health happen.


Continue Your Learning

This article is part of an 8-week course on Adaptive Program Planning in the Digital Age. Learn systems thinking, AI-augmented assessment, Human-Centered Design, and Agile implementation for modern public health practice.

Watch the Full Course on YouTube