Bob Lewis
Columnist

Evidence-based decision-making: A framework for IT

Opinion
Sep 18, 2025 | 5 mins
Business IT Alignment | IT Governance Frameworks | Risk Management

‘Evidence-based decision-making’ sounds like a great idea. Before trying to build it into your IT culture, you’ll first need to figure out what it means.


Solicit a proposal from your average consultant and you’re likely to unearth in the ensuing PowerPoint a slide that extols the virtues of the consulting firm’s “frameworks and methodologies.”

You — and, for that matter, the consultants as well — probably chalked up the phrase to the realm of redundancy-driven concept amplification, where saying the same thing twice using different words lends a certain profundity to the proceedings.

But frameworks and methodologies aren’t the same thing. They’re both important when you’re trying to organize a change effort, but they’re fundamentally different.

A framework shows how the relevant bits and pieces fit together. A technical architecture framework, for example, explains how applications, platforms, infrastructure, and so on can be assembled into a functioning technical environment.

A methodology describes the work that must get done to assemble the framework's various components into a working whole.

Fail to settle on a framework and you'll have the equivalent of a pile of lumber and other construction stuff. Lack a methodology and nobody knows which 2×4 to pick up, when to pick it up, or where to nail it into place.

A rational approach to IT decisions

Which takes us to your desire to incorporate evidence-based decision-making into your organization’s business culture, and the need you didn’t know you had for a framework to facilitate it.

Fortunately for you, your tax dollars (okay, my tax dollars) have led Minnesota Management and Budget to develop a framework for defining evidence in a practical way.

While developed to help Minnesota government agencies make evidence-based policy and service-design decisions, you and your organization might find it useful, too. To get you started, here’s a quick summary of MMB’s six-level framework:

  • Proven Effective: A service or practice that’s been Proven Effective offers a high level of research on effectiveness for at least one outcome of interest. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.
  • Promising: A Promising service or practice has some research demonstrating effectiveness for at least one outcome of interest. Qualifying evaluations use rigorously implemented experimental or quasi-experimental designs.
  • Theory Based: A Theory Based service or practice has either no research on effectiveness or research designs that do not meet standards. These services and practices may have a well-constructed logic model or theory of change but, at the risk of being redundant, aren’t supported by evidence.
  • Mixed Effects: A Mixed Effects service or practice offers a high level of research on the effectiveness of multiple outcomes. However, the outcomes have contradictory effects.
  • No Effect: A service or practice rated No Effect has no impact on the measured outcome or outcomes of interest.
  • Proven Harmful: A service or practice that’s Proven Harmful offers a high level of research that shows program participation adversely affects outcomes of interest.

Weighing risk vs. reward

A framework like MMB’s can be immensely valuable, especially when contrasted with meaningless catch-phrases like “best practice” or advocacy based on a key decision-maker “trusting their gut.”

But it does have a serious limitation: While decision-makers can be confident that approaches rated Proven Effective or Promising will result in a positive outcome, that doesn’t mean they’ll deliver as good an outcome as you might get from one or more untried alternatives.

And in fact, MMB’s framework could, due to its chicken-and-egg-ish nature, stifle promising alternatives that have never been tried.

So if your goal is to minimize the risk of failure, limit your choices to approaches rated Proven Effective or Promising. And certainly, avoid those that are Proven Harmful.

But limiting your choices like this also means filtering out alternatives that might turn out to be spectacular successes: those where there’s reason to think the value returned from an untried, theory-based approach exceeds the risk of failure.
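For the programmers in the audience, the six evidence levels and this risk-vs-reward filtering can be sketched in a few lines of Python. This is purely illustrative: the level names come from MMB's framework, but the numeric ordering, the `viable_options` function, and the sample candidates are assumptions of mine, not anything MMB publishes.

```python
from enum import IntEnum

# The six MMB evidence levels, ordered (by my own assumption) from
# strongest positive evidence down to proven harm.
class EvidenceLevel(IntEnum):
    PROVEN_EFFECTIVE = 6
    PROMISING = 5
    THEORY_BASED = 4
    MIXED_EFFECTS = 3
    NO_EFFECT = 2
    PROVEN_HARMFUL = 1

def viable_options(options, risk_tolerant=False):
    """Filter candidate (name, level) pairs.

    Risk-averse mode keeps only Proven Effective and Promising options.
    Risk-tolerant mode also admits Theory Based options, for cases where
    a convincing logic model justifies trying something unproven.
    Proven Harmful options are always excluded.
    """
    allowed = {EvidenceLevel.PROVEN_EFFECTIVE, EvidenceLevel.PROMISING}
    if risk_tolerant:
        allowed.add(EvidenceLevel.THEORY_BASED)
    return [name for name, level in options if level in allowed]

# Hypothetical candidates, with made-up ratings for illustration only.
candidates = [
    ("pair programming", EvidenceLevel.PROMISING),
    ("untried AI triage", EvidenceLevel.THEORY_BASED),
    ("mandatory overtime", EvidenceLevel.PROVEN_HARMFUL),
]

print(viable_options(candidates))                       # risk-averse
print(viable_options(candidates, risk_tolerant=True))   # accepts theory-based bets
```

The single `risk_tolerant` flag is the whole point: the framework doesn't make the risk-vs-reward call for you; it just tells you which bucket each option sits in before you decide how adventurous to be.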

Because (okay, call me Captain Obvious) until you and a bunch of others have tried a bunch of stuff that hasn’t yet been proved effective, it can never be proved effective.

A theory-based conclusion

The moral of this story is that choosing a course of action based on evidence-based decision-making should, for most business managers, be the default way of making decisions. But that doesn’t mean you shouldn’t ever try alternatives that might not work out.

Especially if you find the theory supporting a course of action convincing and you can manage the risk that comes with trying something untried.

And that’s where progress comes from.

P.S. I know you’re wondering if MMB’s framework is, in fact, proven, or even promising.

Me too. So I’m counting it as a Theory-Based approach, which probably makes it as good as any alternative you’re likely to encounter.
