Does data make us cowards?
The thin line between being analytical and being afraid.
Let's talk about a hard decision: Choosing a movie on Netflix. You and your roommates are looking for something to watch on a Friday night. After burning an hour watching trailers and wading through page after page of aggressively early holiday rom-coms and remastered director's cuts of The Guns of Navarone, you find two movies that everyone likes. The data is promising: Netflix says they're both popular; Rotten Tomatoes says they're both good. With half the room preferring one and the other half preferring the other, you check how long both of them are. Would you rather both be a reasonable length, or—truly, be honest—would you rather one of them be way too long?
Let's talk about a hard decision: Planning a product roadmap. You're running a company that serves two audiences, and it’s pulling your product in opposite directions.1 You know you can't split the difference; you have to commit to one path. Three months of analysis show that each option is promising. Your company's leadership is divided on what's best. You hire a consulting firm to conduct their own research on the two markets you could potentially serve. Do you want the study to say that both options are extremely promising, or—truly, be honest—do you hope that it says one is good and the other is bad?
Let's talk about a hard decision: Settling on a pricing model. Your company is launching a new product, and you're responsible for deciding how to price it. You and the data team have been working for weeks trying to figure out the different implications of various options. The results are stubbornly inconclusive. No model is clearly better than the others. You're pretty sure a handful of proposals are equally good, though each has its own tradeoffs and internal advocates on the executive team. You've got a week left before the decision is due. Do you go ahead and pick one, or—truly, be honest—do you keep plugging away in hopes of finding some small new detail that finally breaks the tie?
Let's talk about a hard decision: Picking a restaurant. It's Monday night, and you're planning a surprise dinner for several out-of-town friends. They’re visiting from Barcelona, and, like anyone who’s been to Barcelona, love to remind you that they’ve been to Barcelona. Determined to show them that San Jose is a cool city too, you find two places that are well-reviewed on Yelp. One was Michelin-reviewed and is known for its scallops and ambiance; the other was a top pick in The Infatuation and serves cocktails in tiny top hats. Unable to decide, you check OpenTable. Do you want both to be available, or—truly, be honest—do you hope that only one is?
Let's talk about a hard decision: An internal promotion. You run a growing team full of star performers, and need to promote one of them to help lead it. You know two people, both of whom you've worked with for years, would make great managers. Both deserve the opportunity, and both want the job. To remove any appearance of bias, you decide to interview them, with the help of several other department heads. After the interviews, you convene the panel to get their feedback. Do you want them to confirm that both candidates are outstanding options, or—truly, be honest—do you hope that one candidate stumbled and there’s now a clear favorite?
Let's talk about a hard decision: Figuring out your holiday plans.2 A pandemic is finally ending. You haven’t seen your extended family, who live in a different state, in two years. But neither has your significant other, and their family lives in a third state. You check plane ticket prices; you ask who else will be in town; you see how each potential trip affects your other travel obligations; you ask how you'll take care of pets and house plants. In asking these questions, do you want to make sure both trips are possible so that you can make the best decision, or—truly, be honest—are you looking for something to make the choice easier?
Personally,3 professionally,4 and financially,5 I’m deeply invested in the value of data and the analysis of that data. Data, I truly believe, captures complex and abstract realities about the worlds around us that we wouldn’t be able to detect otherwise. Analysis makes us better equipped to react to and understand those realities. We should, in the shorthand of our time, listen to data and believe science.
But often, we turn to data for reasons other than education. In some instances, data is a rhetorical club and dishonest cover, “a weapon for prosecuting your point, and a defense for protecting yourself as reasonable and impartial.” While this sort of lying isn’t always easy to see, there are at least lots of people looking for it.
Decision makers, from CEOs to the person on the couch with the remote, also use data to serve a more subtle purpose: As a substitute for courage. In these cases, we don’t use data to lie to others; we use it to lie to ourselves.
In all of the hard decisions above, the concluding piece of analysis wasn’t commissioned to find some novel insight, or to avoid a bad choice. It was a lifeboat, an escape from the paradox of choice, meant to carry us away from a tough decision. In each scenario, we’re looking for something to tilt the scales in something’s—anything’s—favor. Our analysis isn’t meant to take on the problem; it’s meant to dodge it.
As an analyst turned executive (turned disgruntled blogger6), I relate to this temptation. In countless instances, leaders are asked to break ties in high-stakes decisions for which the analysis is inconclusive and other people’s opinions are split. Owning that decision is hard. You have to tell people that they won’t get their way and you’re the reason why. And if the decision doesn’t work out, you have to own that result too.
Data offers an incredible out. If some bit of analysis can make the choice appear obvious, you can tell people who disagree with you that the decision wasn’t yours, not really—you were just listening to the data; your hands were tied; you want to be data-driven, right? Later, if the decision works out, you can claim credit for your cleverness. If it doesn’t, you can say you just did what any reasonable person would’ve done.
Analysis like this, that’s chasing a conclusive result for its own sake, is a cover for surreptitiously handing decisions over to chance. It’s an abdication of responsibility, motivated not by data or principled reasoning, but by fear. We do this when we’re afraid to commit to a choice, or when we don’t want to admit to the option we’d prefer.
It’s also tempting because, frankly, you can usually get away with it. But it’s ineffective leadership. Leaders aren’t promoted or elected because they’re better at running cost-benefit analyses than everyone else. They’re in charge because they don’t shirk their responsibilities when that calculus is inconclusive.7 They’re in power not to be calculators, but to be courageous.
Making the leap
When people go fishing for courage, analysts can get caught in the crossfire, asked to be unwitting quantitative therapists for indecisive leaders. We can only do so much to resist this, though keeping an eye on how long it takes to make a decision can at least make us more mindful of when it's happening.
Still, these moments are instructive, not in what they teach us about the jobs we have today, but in what they reveal about the jobs we could have tomorrow.
Despite my apparent inclinations to the contrary, I'm long on analysts and their career prospects. They’re generally nimble and crisp thinkers;8 they’re comfortable zooming in and out in their work, moving from tactical problems to abstract strategy; they work across different business units, giving them a wide perspective on how companies work. These are good attributes to have on executive teams, in boardrooms, and walking through legislative halls.
But, we often have one fatal flaw: We hide in our data. To whatever extent that today’s executives are willing—even unknowingly—to dodge hard decisions with data, analysts would surely be worse.
It's not exactly our fault—it's what we're trained to do. We're taught to turn ambiguous problems into mathematical models; we’re encouraged to be rigorous and objective; we’re told to be unrelenting in our pursuit of the unknown; and we're promoted when we do these things well. It's easy to assume, in a paraphrased version of the Peter principle, that this is what we should keep doing, all the way to the top.
For better or for worse, it's not. Not every problem is quantifiable; not every choice will have a clear consensus; not every decision can be data-driven. Sometimes, the most we can say is, “We don’t know, and we’re not going to know.” When that happens, someone still has to make a call—and they need to own that call, and its consequences, as their own.
For some of us, that’s deeply discomforting. That’s ok—presidents need advisors who don’t carry the same weight that they do. Those advisors, however, have to recognize that in some moments, their advice—their numbers and their charts—isn't what’s needed.
In those moments, when the path ahead is foggy and people’s opinions are divided, it doesn't matter how smart we are if we're at the head of the table. When the room turns to us, they aren’t looking for a final insight to nudge one option ahead of the other. They need to know our opinion. They need to see our conviction. They need us to be courageous.
Without that courage, we’re just clever puppets, dancing to whatever tune our data sings to us, hoping nobody sees the wires holding us up. To be real leaders, we have to prove ourselves brave enough to know when to walk on our own.
On the discourse
Last week's post on measuring analytical work started some fights. First, fair. Second, being too long-winded for Twitter (and Slack, but that's a different rant for a different day), I wanted to give a proper response to those who objected.9
Though the details differed, nearly every objection took issue with focusing on speed “without qualification or caveat.” This approach, the arguments go, is simplistic and dangerous. Analysis is complex, and we need multidimensional frameworks for assessing it.
On one level, sure; analysis is complex, and there will always be exceptions to general rules. But, as far as guiding principles go, most analysts don’t need to be reminded to be more cautious. Our bias (as this week’s main piece argues) usually runs in the other direction. If, as an industry, we find ourselves barreling through warning signs—or outright misleading people for the sake of lowering a “time to decision” metric—I’d absolutely agree that this approach is misguided. But in our current context, in which we’re much more inclined to “well ackshually” a proposal than blindly agree to it, qualified descriptions of success only encourage that tendency. That—both in our analysis and in posts like last week’s—is exactly the problem with nuance: More than clarifying a precise point, nuance often muddies the idea, and gives people room to interpret it through whatever lens they find most comfortable. Given the choice between recommending that analysts focus on speed, full stop, and recommending they use a carefully balanced framework of competing and caveated measures, I enthusiastically choose the former.
Beyond that general objection, I’d group people’s challenges to the post into three buckets:
The rubber stamp objection: People have already made a decision, and it’s a bad one. They just want your analysis to prove them right. If we optimize for speed, we become sycophants.
This is a valid concern, especially on the margins. If we know what someone wants to do, we might be less inclined to take a serious look at the other alternatives. To correct for it, it’s worth adding an addendum to the original principle: Start by convincing yourself of what’s right, and then convince other people as quickly as you can.
The one-way door objection: We need to get big, irreversible decisions right. In these cases, speed is a bad metric.
This is true, though as both Tristan and Sven Balnojan say, we tend to treat a lot of two-way doors as one-way doors. (We do this, I think, because going back through the door might require another hard decision. A risky hire, for example, is a two-way door that emotionally feels like a one-way door.)
Even for true one-way doors, however, I don’t think it’s bad for analysts to still optimize for speed. The decision maker shouldn’t necessarily do that; their bar for what it takes to be convinced should be higher. But analysts should still be racing to get over that bar.
The ROI objection: One big decision that takes a long time can be way more valuable than a bunch of small decisions that are made quickly. This is related to the “it’s just wrong” objection: Analysts aren’t good because they’re fast; they’re good if they understand the problem and inform the decision.
Both of these things are also true, though I think they misrepresent the original argument (or, more likely, the original argument misrepresented what was in my head). The point isn’t for analysts to make decisions quickly in an absolute sense; it’s to help people make decisions quickly given the decisions that need to be made. Implicit in Kobe’s regimen of making 2,000 shots a day was that the shots were real, not dropped from a ladder above the rim. Similarly, implicit in any measure of analytical success is that we continue to work on real problems, and that we don’t game the principle by declining to take on anything that might slow us down.
Based on a true story. Either way, I promise I’ll be home for Christmas.
I’ve been tracking some “quantified self” nonsense in two spreadsheets for more than 900 straight days.
I have a weekly blog for yelling about data.
Launching soon, www.bennthoughts.gov.
“Building consensus” follows a similar pattern. In the face of difficult decisions, some leaders act as though their job is to convince everyone to get on board. While alignment is useful, consensus can also be a stand-in for courage. It’s a way to share the burden of the decision, and to deflect blame—“this faction objected, so I couldn’t do it.” This, too, is wrong. Leaders are in charge to make decisions in the absence of consensus. Declining to do so is conflating what you can do with what you can comfortably do.
I really appreciate all the comments. If you’re here for a real conversation, consider this an invitation to continue it. And if you read this and are like, “I just wanted to shitpost, we’re all talking about Taylor’s re-release of Red, move on already,” all good too, no hard feelings. You can stop reading; looking forward to the next one.