The insight industrial complex
Most people just want to know what's going on.
At some point over the last decade, every company became a platform, leeching all meaning out of the word:
Peloton, which sells indoor exercise bikes, calls itself an “interactive fitness platform.” Casper, which sells mattresses, is a “platform built for better sleep.” Beyond Meat, which sells faux-burgers that taste like beef, pork and poultry, insists that these are actually “three core plant-based product platforms.”
Companies are stale and corporate. Tools are lifeless antiques. Services are valued at low revenue multiples. But platforms—platforms are dynamic and modern. They have viral flywheels, and network effects, and economies of scale. They don’t sell to customers; they champion their communities. Since 2020, 1,926 companies have been accepted to Y Combinator. Thirty-four percent, or 655 startups, market themselves as a platform.
Fortunately, styling ourselves as platforms is a mostly victimless crime; it might make some smug little Substackers roll their eyes, but forty percent of YC’s top companies are platforms, so it also might make you successful.2 Unfortunately, oversaturating the definition of a platform isn’t the worst act of rhetorical strip mining that we routinely engage in. And our other grammatical grift—bludgeoning the word “insight” to death—has more meaningful costs.
A platform for insight
As data teams, we’ve painted ourselves into a corner. On one hand, no data team wants to be a help desk or dashboard factory, resolving Jira requests for data pulls or cranking out ghosted dashboards. On the other hand, as much as we might resent it, this is some of the most important work we do. Optimistically, we’re victims of our boring successes; cynically, our egos are bigger than our abilities.
Data vendors—analyst advocates, community champions, etc, etc—have found a solution: We should build some dashboards, yes, but our higher calling is to find insights.
MicroStrategy, a BI tool (and “platform built for disruption”), promises to “bring actionable data insights to everyone.” PowerBI, a BI tool (and “analytics platform”), will “quickly find meaningful insights.” Looker, a BI tool (and “data analytics platform”), can unify and empower your teams with “integrated insights.” Sigma, a BI tool (and “cloud analytics platform”), tells us we “need insights for the enterprise.” ThoughtSpot, a BI tool (and “developer-friendly platform”), can “get insights faster from your cloud data.” BusinessObjects, a BI tool (and “single, scalable platform”), provides “access to real-time insights.” Sisense, a BI tool (and “leading cloud analytics platform”), will “uncover powerful insights.” Qlik, a BI tool (and the “only cloud platform built for Active Intelligence”), modernizes analytics “for deeper insights and action.” Mode, a BI tool (and “single platform” for data teams), will “clear the path from data to insights.” Hex, a hosted notebook (and “magical, modern platform”), is for “generating insight.” Snowflake, a data warehouse (and “platform that powers the data cloud”), unlocks “previously unimaginable insights.” Teradata, a data warehouse (and “flexible data and analytics platform”), is used to “gain new insights.” Databricks, a data warehouse (and “simple platform to handle all your data, analytics and AI use cases”), helps people “derive new insights.” And that’s just on our homepages.3
What are these ubiquitous insights? We’re never really told,4 which is part of the point. Insights are grandiose enough to sound valuable, and amorphous enough to avoid actually saying what that value is. They promise buyers the world without promising anything at all. They reassure us that our job isn’t to build dashboards, without telling us what we should do instead.
Left to figure this out on our own, it’s easy to romanticize “finding insights” into eureka moments and dramatic presentations. Hollywood shows us what these look like: They’re the late-night discoveries that stop us in our tracks, and rescue the firm.5 They’re the whiteboards that upend an industry. They’re the needles that are found in haystacks of financial data and pieced together into a trading position that returns 489 percent. They’re the cracked codes that save the world.
Though our work is less cinematic—we send bulleted summaries on Slack; we don’t give presentations in the Oval Office—it’s hard not to picture ourselves somewhere in these characters.6 We’re storytellers, yes, but we don’t want to be just reporters; we want to be investigative journalists, more detective than narrator. We want to find the clues that others don’t see; to crack cases; to gather everyone around and tell them that things aren’t what they seem. I still remember an old tag line from Periscope Data’s website7 because of how well it resonated with how I saw myself as an analyst: “We love our customers, overpriced avocado toast, and that moment when a blip in the data makes you say, ‘wait a minute…’”
It’s not an outright fiction—these blips happen, and we all have our moments of theatrical glory. But they’re exaggerated, stretched, “inspired by” a true story. In reality, data teams can provide insights—they just don’t look like the photoshopped versions in our ads.
An insight by any other name
Gong, a sales call recording service (and “reality platform”), says that it can deliver sales teams “the insights they need to close more deals.” Chorus, another sales call recording service (and “conversation intelligence platform”), markets the same kind of “actionable deal insights.” Both products claim to use AI to mine your customer calls for patterns that can identify which deals are at risk, figure out the types of pitches that resonate, and provide automatic coaching tips to sales reps.
Sounds great—but it’s not, apparently, why a lot of customers buy these products. I’ve heard from people at both companies that the most useful thing both products do is transcribe sales calls. Sales managers can use those transcriptions to quickly catch up on a deal, or to run simple searches to see how often different competitors get mentioned on calls. The fancy insights—deals that engage a VP in the negotiation stage are forty percent more likely to close! The highest performing reps talk twenty percent less than the lowest performing ones!—are neat, at best. But the real insights, the ones that people are paying for and are consistently valuable, are simple descriptions of fact. They tell sales teams what’s happening, and leave it to the sales teams to figure out what to do about it.
I’ve heard analogous stories about other “smart” products and services. Silicon Valley regularly cranks out healthcare startups that promise to use data about people’s vitals, the medicines they take, and their genetic history to preemptively diagnose problems and prescribe interventions. A couple of doctors I know dismiss this as fantasy, but say that an app that accurately tracks when people take their medicine and makes that information available to healthcare providers would be almost revolutionary. Similarly, a number of my friends bought nutrition and fitness apps for the insights the products would supposedly uncover about what they should eat, when they should exercise, and what time they should go to bed. In the end, they liked the products, not for the pseudo-scientific calculations that they were sold, but because the apps had simple monitors that tracked how many hours they slept every night.
The insight (ha) from these stories is that experts—whether they’re sales leaders, doctors, or anyone trying to figure out why they feel tired in the morning—aren’t looking for complicated and fancy analyses; they usually just want to know what’s happening. The value a data team provides is making that information available. We rarely need to figure anything else out, or to obscure these basic facts behind a Daily Readiness Score. More often than not, that isn’t insight; it’s distortion.
The same applies to internal applications of data. Just as doctors want to know when their patients took their medicine more than they want to know the composite reading on the Mental, Cardiovascular, Dietary, Respiratory, Emotional And Medicinal Yardstick (MCDREAMY™ by ThriveHealth.ai, YC W24), customer success teams often just want to know what their customers are doing more than they want to see an aggregate health score.
To each according to his effort
All of this raises a rather obvious final question: Why do we build the health scores? Why do we do the complicated work and in-depth analysis when the seemingly easier thing—just some basic reporting—is also the more valuable thing?
My guess is that it’s the result of two illusions that amplify each other. The first is an extension of the locksmith paradox. In principle, people should be willing to pay a locksmith who picks their lock in a minute more than one who picks it in an hour—the former is a strictly better service. However, people will actually pay the latter more, because, the hypothesis goes, people pay for effort and fairness as much as utility.
The same applies to the work of data teams. For people who are buying some form of insight, whether that’s a sales team evaluating Gong or an executive wanting to understand their business, complicated work seems more valuable than simple work. Whereas people might pay tens of thousands of dollars for a Revenue Intelligence Platform, a simple transcription bot feels like it should be free. Internal data teams are encouraged to do the same thing through social “payments.” Analysis and ostensibly incisive insights are celebrated; building dashboards and reporting is seen as us doing the bare minimum.
Data teams, however, aren’t like locksmiths in one important way: Basic reporting isn’t actually easy. For many SaaS businesses, accurately and consistently calculating revenue is harder than far more advanced-looking analysis. Reporting is tedious, fragile, and examined under a very precise microscope; analysis is fun, freewheeling, and rarely picked apart by the people who consume it.8
Put together, these two things trap us in a bad equilibrium, in which data teams aren’t rewarded for the hard work of simply sharing what’s happening. The solution, I suspect, is similar to what’s required to fix any sort of market failure—government intervention to subsidize what’s undervalued. In this case, the authority is an executive team, and the subsidy is praise and promotions. And until we celebrate and reward the “simple” work of creating basic reports and of working with experts around the business, data teams and data vendors will keep chasing the false idol of insight.
2. It’s sound analysis like this that gives me the confidence to be a smug little Substacker.
Part of the problem here is that there’s a general assumption that data teams can’t really talk about their work. And in practice, that’s true—there aren’t many blogs out there that talk about the specific problems data teams are trying to solve. But maybe there could be? Every data vendor has dozens of case studies on their websites that do exactly this. If someone rewrote all of these case studies so that they highlighted the business problem the company had rather than the tool they used to solve it, we’d very much be talking about the work.
8. For example, if you build a dataset of customers and how much they pay you every month, people will comb through it looking for surprises or inconsistencies. If you put together some analysis based on the same data, people might prod at your conclusions, but they’ll rarely pick through the source data.