In hindsight, maybe I shouldn’t have said that my career was a Ponzi scheme.
First, oops. Second, for the sake of my career prospects after I return from spending more time with my family,1 a counterpoint: Analytics is not a Ponzi scheme! That post was some moron trying to go after us with false rumors. Analytics is fine. We have a long history of creating shareholder value, and that remains true today. And I’d love it, fellow optimists, if we could work together for the ecosystem.
Details, of my new idea:
—
In his frequently bought but rarely read opus Thinking, Fast and Slow, Daniel Kahneman famously argues that everyone has two systems for thinking:
System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
System 1 is reactive, and makes decisions instinctively. While this system can be trained—it includes “learned skills such as reading and understanding nuances of social situations,” which are hard when we’re young but eventually become reflexive—it isn’t driven by the careful study of a particular situation. Instead, it responds automatically, using assumptions that we develop about how the world works.
When we think about thinking, Kahneman says, “we identify with System 2, the conscious reasoning self that has beliefs, makes choices, and decides what to think about and what to do.” Though we might assume that this second system—our brain’s logical processor—constructs the mental models that System 1 uses, the opposite is true: Much of System 2 is built by System 1. The impressions and feelings that originate from System 1 “are the main sources of the explicit beliefs and deliberate choices of System 2.” According to Kahneman, in the study of how we make decisions, we often focus on the analytical brain—but the real action is in the autonomous brain.2
Organizations, I’d say, think in an analogous way.3 Some decisions are made relatively slowly, with intention, and based on reasoned analysis. But most decisions are made quickly and instinctively, based on heuristics, experience, and some rough estimation of how the company works. For example, a business that struggles to bring in new customers with Facebook ads will eventually decide that online advertising doesn’t work for their market—and that will become an unspoken operational assumption that everyone “just knows.” If a CEO talks loudly about their product’s usability issues, people will begin to assume that product quality is one of their organizational weaknesses. If an executive spends too much time on Hacker News, they’ll make decisions based on the principle that they should do things that don’t scale, or that a company’s market matters more than their product, or that the best leadership teams are the ones that fight.4
When we talk about helping a company make better decisions—and particularly when data teams talk about it—we’re naturally drawn to improving how well System 2 works, and how it can counteract the impulsive nature of System 1. We need to rely less on intuition and more on insight, we say; we need everyone to be more comfortable reasoning about data. We need to invite more analysts into the rooms in which decisions get made. In many ways, this is how analysts define our jobs—to be System 2.
But I think that’s misguided, and it’s why a lot of analysts are frustrated by their position on the corporate seating chart. One of the key arguments in Thinking, Fast and Slow is that we can neither outrun nor out-reason System 1’s unconscious engine. The same is true for how companies think: They have to make too many decisions to analyze all of them. No matter how fast we are or how good our tools get, most decisions will be made using System 1; we can’t whack-a-mole them away by putting an analyst in every room or a BI tool on every laptop.
Nor can we ever fully override System 1. Once executives have theories about how the world works, developed by their professional experience and confirmed by their professional success, a bean counter with a disagreeable spreadsheet is unlikely to change their mind.5 If we are the company’s second system, we’ll always be second chair.
Still, this framework suggests that there’s a third option for data teams: Stop trying to work against System 1 thinking, and start working with it.
Drill, ream, countersink, spot face, hone and bore
I have no idea what’s been going on at Boeing over the last twenty years, but it sounds like the story went something like this:
A while ago, some Boeing executives developed a belief that Boeing planes were good, that their manufacturing standards were reliable, and that a variety of their quality assurance programs were more expensive than they needed to be. I imagine they asked some analysts to run the numbers: “If we cut some corners, will the doors start blowing off of our airplanes?”
“It depends,” the analysts probably responded. “Today, we employ 300 door latch mechanics, and the probability of a door blowing off of an airplane this year is 0.456 percent. If we fire half of our mechanics, the probability of a door blowing off goes up to 1.661 percent. Here are various precise figures and statistics that say this might be dangerous.”
“Eh,” the executives must’ve said, “1.661 percent still sounds very low. It sounds like our general theory is right; we can still make good planes with good door latches even if we cut our safety program’s budget. Plus, if we fire half of our mechanics, the probability of us saving 20 million dollars is 100 percent, which is a lot higher than 1.661 percent. George, fire the mechanics.”
And then Boeing fired the mechanics, and then the doors started blowing off of Boeing airplanes, and then Boeing was worth 40 billion dollars less than it was before the doors blew off its airplanes.
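To be fair to the analysts, the executives’ comparison (a 100 percent chance of saving 20 million dollars versus a 1.661 percent chance of disaster) wasn’t even the right arithmetic. If you take the story’s obviously made-up numbers at face value, a rough expected-value sketch looks something like this:

```python
# A back-of-the-envelope expected-value check, using the story's made-up
# numbers: the right comparison is expected cost versus savings, not
# probability versus probability.

savings = 20_000_000              # guaranteed savings from firing the mechanics
p_before = 0.00456                # P(a door blows off) with all 300 mechanics
p_after = 0.01661                 # P(a door blows off) with half of them
cost_of_failure = 40_000_000_000  # market value lost when a door blows off

added_expected_loss = (p_after - p_before) * cost_of_failure

print(f"Guaranteed savings:  ${savings:,.0f}")
print(f"Added expected loss: ${added_expected_loss:,.0f}")
# Prints an added expected loss of about $482,000,000, roughly 24 times
# the savings.
```

Not that it would’ve mattered; as the rest of this story suggests, this is exactly the sort of System 2 arithmetic that wouldn’t have changed anyone’s mind anyway.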
So the analysts crunched more numbers. “If another door blows off of another airplane,” they estimated, “the FAA will require us to ground 12 percent of our fleet. Revenue will decline by between 15 and 18 percent over the next two quarters. Our stock will lose another 40 to 50 billion dollars. We will all get fired.” A report was produced that contained many alarming charts.
“We should rehire some of those mechanics,” the (new) executives concluded. Job listings were posted.6 They did not specifically mention doors, but they’re probably about the doors.
One way to look at this story7 is that executives made several data-driven decisions, based on the careful research of their analytics team. But that’s almost certainly not what really happened. Though System 2 was present, it was overwhelmed by the executives’ System 1 assumptions about how Boeing works—first, that Boeing had highly reliable but overly expensive quality assurance programs, and then, all of a sudden, that product quality is an existential problem and that Boeing should put “safety and quality at the forefront of everything that we do.”
Obviously, Boeing’s executives didn’t change their mind because of some analyst’s report; they changed their mind because the doors started blowing off of their airplanes. Years ago, analytical warnings from engineers didn’t change any executives’ minds about the need to invest in quality and safety; now, I suspect, no amount of analysis could change their minds that anything matters more than quality and safety. And that’s the point—when companies make decisions, analysis will almost always be the lower-order bit. The mental model dominates.
Theory over action
So does this mean analysts are useless? No—it just means that we’ve probably been doing the wrong thing.
Today, if you ask an analyst what their job is, the Very Smart answer is that they help people make better decisions by providing them with actionable insight. We’re at our best, we say, when we can study a strategic problem and make concrete recommendations about how to address it.
But these sorts of recommendations are individual fish. Best case, they’ll feed a company for a day.8 The much more powerful thing that we could do is teach a company a theory—or as Abhi put it, our job shouldn’t be to look for insights, but to mutate mental models.
In other words, we shouldn’t work against an organization’s System 1 thinking; we should work through it. Don’t veto biased thinking; update the bias. Don’t protect companies from making decisions with rough heuristics; make sure the heuristics are still current.
Which, easier said than done—it took several massive tragedies for Boeing’s executives to update theirs. Still, for those of us who work on more trivial problems than keeping planes in the air, I think there are a couple of relatively simple things we could all do to nudge ourselves in that direction.
First, we should try to explicitly identify the assumptions that people have about how stuff works. Bobby Pinero describes a potential approach:
I would keep a list of every comment I overheard in any discussion that sounded like an opinion, a thought, or a disagreement that I thought might possibly be proved, debunked, or clarified with some form of analysis. I would source these comments in meetings, water-cooler chats, over lunch, happy hour, or any other forum of discussion. Tune your ear, and you’d be quite surprised at the number of conversations with no resolution, or worse yet, that decisions are made from unfounded, unproven opinions.
These unproven opinions are the edges of an organization’s System 1 brain. Just as Kahneman’s exploration of our cognitive biases was the first step in learning how to work with them, Bobby’s observational search for hidden mental models is our first step in figuring out how to mutate them.
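To make Bobby’s approach concrete, here’s a minimal sketch of what such a list could look like as a running log. The structure is my own invention, not Bobby’s; the fields and statuses are one hypothetical way to organize it:

```python
# A minimal sketch of an "assumptions log," in the spirit of Bobby Pinero's
# list. The fields and statuses here are hypothetical, not a prescribed format.
from dataclasses import dataclass
from datetime import date

@dataclass
class Assumption:
    claim: str                 # the overheard opinion, as close to verbatim as possible
    source: str                # meeting, water-cooler chat, lunch, happy hour
    heard_on: date
    status: str = "untested"   # untested | supported | debunked | unclear
    analysis: str = ""         # link to whatever analysis eventually tests it

log = [
    Assumption(
        claim="Online advertising doesn't work for our market",
        source="pipeline review",
        heard_on=date.today(),
    ),
]
```

The format doesn’t matter; what matters is that writing the opinions down turns them from ambient noise into something an analyst can actually go test.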
Second, we probably shouldn’t focus on delivering “actionable insight”; we should deliver theories. If we conclude some analysis with a recommendation to take this particular action in this particular scenario, it’s easy to view that result in isolation. It might violate some System 1 framework, but it doesn’t challenge it. It can be comfortably dismissed as an exception.
Theories, by contrast, ask people to see the world differently. There’s something inherently seductive about theories, from orange peel theory to loud budgeting to theories about the relationship between Boeing’s quality assurance engineers and their airplanes’ doors. They compel us to stop and reflect, in ways that facts and insights don’t. And if we change them, the effects echo, not through one decision, but through thousands.
Or that’s my theory, at least until I can get another Ponzi scheme up and running.
1. I mean, no, good grief, this is a joke. I wasn’t fired because of that post. I left to pursue other interests.
2. These quotes are from page 21 of Thinking, Fast and Slow, because I bought it for my Zoom background and obviously haven’t read anything past chapter 2.
3. Well, I wouldn’t say this. Abhi Sivasailam said this, to me, at a conference last week. It was a very good analogy, and I like analogies, so I stole it.
4. If there’s an earthquake in New York on the same day that Olivia Rodrigo plays at Madison Square Garden and Griff drops a new EP, I will assume that it’s a sign from the cosmos and will spend the rest of my life trying to figure out What It All Means.
5. As analysts, we often get upset about this, but we do the same thing. Imagine, for example, that a high school student is asking you for career advice. Think about the various fortune cookie aphorisms you might tell them, like “find a great manager, because people don’t quit their jobs, they quit their bosses.” Now imagine that I show up with some detailed analysis that says, well actually, that’s not quite right. Would you immediately change your mind? The next time you were looking for a job, would you pay less attention to who your potential manager would be because my charts said it doesn’t matter as much as you think? Or would you discount my analysis, or try to contort it to fit into your existing worldview? Of course you would—and probably should—do the latter! Once we develop these sorts of general theories about how the world works, it’s really hard to reject them.
6. This job posting is wild. The job duties are basically to build an entire airplane? You will be asked to “assemble aircraft and/or spacecraft structures or aircraft and/or spacecraft support equipment structures.” You will also need to “install press fit bushings and force fit bushings. File, fit, hone, ream, drill, tap, saw, burr and adjust.” Dodge, duck, dip, dive, and dodge!
7. In which all the specifics are quite obviously made up.
8. Worst (and more likely) case is that they’re deemed “interesting” and dismissed, much like the warnings were at Boeing.