The Data
A new diabetes prevention program costs more than standard care but improves health outcomes. The base-case analysis suggests it is cost-effective, but every input value carries uncertainty. (Data are simulated for illustration.)
Base-Case Cost-Effectiveness
The base case looks favorable.
But how sensitive is this conclusion to our assumptions? Which parameters matter most?
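Before probing sensitivity, it helps to see how the base-case ICER itself is computed. The sketch below uses hypothetical cost and QALY values (the lab's actual inputs are not shown here); the ICER is simply incremental cost divided by incremental effectiveness:

```python
# Hypothetical base-case inputs (illustrative values, not the lab's data)
cost_program = 62_000    # lifetime cost with the prevention program ($)
cost_standard = 50_000   # lifetime cost with standard care ($)
qaly_program = 10.35     # quality-adjusted life years with the program
qaly_standard = 10.15    # QALYs with standard care

delta_cost = cost_program - cost_standard  # incremental cost
delta_qaly = qaly_program - qaly_standard  # incremental effectiveness

icer = delta_cost / delta_qaly             # cost per QALY gained
print(f"ICER = ${icer:,.0f}/QALY")         # below a $100,000/QALY threshold
```

With these illustrative numbers the ICER lands at $60,000/QALY, under the $100,000/QALY threshold, which is what "the base case looks favorable" means in practice.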
Tornado Diagram
One-way sensitivity analysis varies each parameter individually while holding others at base-case values. The tornado diagram shows which parameters have the largest impact on the ICER. Longer bars mean greater influence on the decision.
The vertical dashed line marks the $100,000/QALY threshold. Bars crossing this line indicate parameters that could flip the decision.
Some parameters could flip the decision on their own.
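The procedure behind a tornado diagram can be sketched in a few lines: swing each parameter between its low and high bound while holding the others at base case, then sort by bar width. The parameter names, base-case values, and bounds below are hypothetical:

```python
# One-way sensitivity sketch (all numbers hypothetical, for illustration)
base = {"delta_cost": 12_000, "delta_qaly": 0.20}

def icer(p):
    return p["delta_cost"] / p["delta_qaly"]

ranges = {                        # (low, high) bounds for each parameter
    "delta_cost": (8_000, 16_000),
    "delta_qaly": (0.10, 0.30),
}

bars = {}
for name, (lo, hi) in ranges.items():
    # Vary one parameter at a time; hold the rest at base case
    icers = [icer(dict(base, **{name: v})) for v in (lo, hi)]
    bars[name] = (min(icers), max(icers))

# Widest bar first — the tornado ordering
for name, (lo, hi) in sorted(bars.items(), key=lambda kv: kv[1][0] - kv[1][1]):
    flag = "  <- crosses $100k/QALY threshold" if lo <= 100_000 <= hi else ""
    print(f"{name:>10}: ${lo:,.0f} to ${hi:,.0f}{flag}")
```

In this toy example the QALY-gain parameter produces the widest bar and is the only one whose range crosses the threshold, so it alone could flip the decision.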
But in reality, multiple parameters might be wrong simultaneously. How do we account for that?
Probabilistic Sensitivity Analysis
One-way analysis varies parameters individually. Probabilistic sensitivity analysis (PSA) varies all uncertain parameters simultaneously, drawing from probability distributions to generate thousands of possible scenarios.
What Is Probabilistic Sensitivity Analysis?
PSA acknowledges that all parameters are uncertain at once. Instead of varying one parameter while holding others fixed, PSA:
- Assigns a probability distribution to each uncertain parameter
- Draws random values from each distribution simultaneously
- Recalculates the ICER for each combination (typically 1,000-10,000 times)
- Reports the probability that the intervention is cost-effective
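The steps above reduce to a short simulation loop. The normal distributions and their parameters below are assumptions for illustration (real analyses often use gamma distributions for costs and beta or normal for effects); net monetary benefit (NMB = WTP × ΔQALY − ΔCost) is scored instead of the ICER because it avoids dividing by a near-zero QALY gain:

```python
# PSA sketch: draw all uncertain parameters jointly, many times
# (distributions and values are hypothetical, chosen only to show the loop)
import random

random.seed(42)
N = 1_000            # number of simulations
WTP = 100_000        # willingness-to-pay threshold ($/QALY)

cost_effective = 0
for _ in range(N):
    # Joint draw: every parameter varies simultaneously
    delta_cost = random.gauss(12_000, 2_000)   # incremental cost ($)
    delta_qaly = random.gauss(0.20, 0.05)      # incremental QALYs
    nmb = WTP * delta_qaly - delta_cost        # net monetary benefit
    if nmb > 0:
        cost_effective += 1

prob = cost_effective / N
print(f"Probability cost-effective at ${WTP:,}/QALY: {prob:.1%}")
```

The output is the quantity the last step calls for: not a single ICER, but the fraction of simulated scenarios in which the intervention is cost-effective.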
One-Way Analysis (OWSA)
- Vary one parameter at a time
- Hold all other parameters at base case
- Shows which single parameters matter most
- Cannot capture parameter interactions
Probabilistic Analysis (PSA)
- Vary all parameters simultaneously
- Draw from probability distributions
- Captures realistic joint uncertainty
- Quantifies decision confidence
PSA gives us a probability, not just a point estimate.
How confident should we be in our cost-effectiveness conclusion? Let's run a simulation.
Interactive PSA
Run a probabilistic sensitivity analysis with 1,000 simulations. Each dot represents one possible scenario based on random draws from parameter distributions. Adjust the uncertainty ranges to see how they affect the probability of cost-effectiveness.
The probability gives decision-makers realistic expectations.
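One common way to report that probability across thresholds is a cost-effectiveness acceptability curve (CEAC): the same simulated draws are re-scored at several willingness-to-pay values. The distributions below are hypothetical, as in the interactive tool:

```python
# CEAC sketch: one set of joint PSA draws, scored at several WTP thresholds
# (parameter distributions are hypothetical, for illustration)
import random

random.seed(0)
# Each draw is a (incremental cost, incremental QALYs) pair
draws = [(random.gauss(12_000, 2_000), random.gauss(0.20, 0.05))
         for _ in range(1_000)]

probs = {}
for wtp in (50_000, 100_000, 150_000):     # $/QALY thresholds
    # Cost-effective whenever net monetary benefit is positive
    probs[wtp] = sum(wtp * dq - dc > 0 for dc, dq in draws) / len(draws)
    print(f"${wtp:>7,}/QALY: {probs[wtp]:.0%} probability cost-effective")
```

The probability rises with the threshold; how steeply it rises near the decision-relevant threshold tells a decision-maker how fragile the conclusion is.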
What should we conclude from this analysis?
Key Insight
Sensitivity analysis transforms cost-effectiveness from a yes/no answer into a probability statement about decision confidence.
Key Takeaway
Reporting a single ICER without sensitivity analysis is like reporting a poll result without a margin of error. Decision-makers need to know not just whether an intervention appears cost-effective, but how confident they should be in that conclusion. Sensitivity analysis identifies which parameters could flip the decision and quantifies the probability of making the right choice under uncertainty. This is what economists mean by "characterizing decision uncertainty."
Concepts Demonstrated in This Lab
Questions for Reflection
A program has a 70% probability of being cost-effective. Is that high enough to recommend adoption? What factors might influence this decision?
The tornado diagram shows treatment effect is the most influential parameter. What additional research might reduce this uncertainty?
How might a decision-maker weigh a highly uncertain but promising intervention against a well-established but modest one?
If the probability of cost-effectiveness changes dramatically with different willingness-to-pay thresholds, what does this tell us about the decision?