In my earlier work, I applied systems modeling to business problems using agent-based and system dynamics models. That background in systems thinking prompts me to question every causal claim we make about data with many moving and interacting parts, as in most business analytics problems.
In business, when we look for cause-and-effect relationships, we typically measure the impact of a single action, and the stakes are measured in financial terms, not human lives. Even if we get it wrong and attribute an increase in sales to a new ad rather than to a competitor’s price hike, the worst we can do is lose money. But if we fail to identify the causes of one accident, we may fail to prevent the next one, and the cost is human life.
So with accidents, we need a different framework for answering causal modeling questions, including the deadliest of them: why did the plane crash? We can no longer downplay interactions, interference, interdependencies, and feedback loops, as we tend to do when measuring the causal effect of a new ad or coupon. Systems thinking should be part of the modeling process.
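To make the ad-versus-price-hike example concrete, here is a minimal simulation sketch. Everything in it is invented for illustration (the numbers, the variable names, the data-generating process); it only shows how a naive before/after comparison credits the wrong cause when two candidate causes move together.

```python
# Illustrative sketch only: invented numbers and variable names.
# Our ad launches the same week a competitor raises prices, so a naive
# before/after comparison credits the ad with a lift it did not cause.
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(52)
ad_on = weeks >= 26              # we launch the ad mid-year
competitor_hike = weeks >= 26    # competitor raises prices the same week

# True data-generating process: the ad adds nothing; the price hike adds ~100 units/week.
sales = 1000 + 0 * ad_on + 100 * competitor_hike + rng.normal(0, 20, len(weeks))

naive_ad_effect = sales[ad_on].mean() - sales[~ad_on].mean()
print(f"Naive estimate of the ad's effect: {naive_ad_effect:+.1f} units/week")
# Prints roughly +100 even though the true ad effect is 0:
# the two candidate causes are perfectly entangled in time.
```

Disentangling the two requires either a better design (say, a staggered rollout of the ad) or a model of the system that includes the competitor, which is exactly the structure a single-effect estimate leaves out.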
Causal Analysis based on System Theory (CAST) is an accident analysis framework that aims to maximize causal learning from accidents and incidents. This is clearly a difficult and vital task. Here’s a nice summary of CAST (credit to Joel Parker):
- Causal: Don’t assume accidents are due to one “root cause” or a few “probable causes”, because it turns out that most accidents are actually due to many interacting causes.
- Analysis: Don’t blame people, because it turns out you learn more by doing a blame-free examination of why and how a loss occurred, i.e. “ask why and how, not who”.
- System: Don’t fix just the one thing that broke, because it turns out it’s smarter to discover multiple causes, then consider multiple ways to improve the whole system.
- Theory: Don’t wait until something breaks, because it turns out it’s wiser to plan ahead by using scientific control theory and process model theory.
One of the basic tenets of CAST is that human error is a symptom of a system that needs to be fixed: redesigned or reengineered. Indeed, this is often the case. It’s easier to blame a person for pushing or not pushing a button, but why was that button needed in the first place? Well, that’s a systems question.
Root cause analysis and other approaches that are motivated by finding a single cause (e.g., average treatment effect of X) tend to miss the bigger picture by oversimplifying, and lead to a false sense of accomplishment in “solving the problem”. Here’s an example from the CAST handbook by Nancy G. Leveson:
In the crash of an American Airlines DC-10 at Chicago O’Hare Airport in 1979, the U.S. National Transportation Safety Board (NTSB) blamed only a “maintenance-induced crack” and not also a design error that allowed the slats to retract if the wing was punctured. Because of this omission, McDonnell Douglas was not required to change the design, leading to future accidents related to the same design error.
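A deliberately crude toy model can show why stopping at a single “root cause” misleads. All probabilities below are invented, and the real accident sequence is compressed into two binary conditions; the point is only that when an accident requires both a puncture and the design flaw, fixing the maintenance source of punctures makes the numbers look solved while the design flaw waits for the next puncture from another source.

```python
# Toy model, not the actual accident data: all probabilities are invented.
# An "accident" here requires BOTH a wing puncture AND the design flaw
# that lets the slats retract when the wing is punctured.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000  # simulated flights

def accident_rate(p_crack_from_maintenance, design_flaw_present):
    # Punctures can come from the maintenance issue or from other sources (debris, birds, ...).
    puncture = (rng.random(n) < p_crack_from_maintenance) | (rng.random(n) < 0.001)
    accident = puncture & design_flaw_present
    return accident.mean()

print("Before any fix:           ", accident_rate(0.01, True))
print("Fix maintenance only:     ", accident_rate(0.00, True))   # much lower -- looks "solved"
print("Also fix the design flaw: ", accident_rate(0.00, False))  # only now does it reach zero
```

The sharp drop after the first fix is exactly the kind of result that produces a false sense of accomplishment, while the systemic cause is still in place.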
Takeaway?
Every time we think we’ve identified and estimated a causal effect, we may be better off pausing for a moment and taking a systems view. A systems mindset will lead to more questions than answers, but asking those questions is better than jumping to conclusions. This is a way to increase our chances of actually “solving” the problem.