When a startup launches a new product, the second worst thing that can happen is that nobody buys it. The worst thing that can happen is that five people do.
Five customers could be the beginning of something big. Five customers could be a sign of much more to come. Five customers could be, in the eyes of a founding team that's hoping to validate their exciting new idea, evidence that they're right. Five customers could be a wedge, a foothold, a sharp edge cutting into a huge market. Five customers rounds up to ten customers, and ten customers is a milestone.
But five customers could also be a false start. Five customers could be a few eager early adopters that aren't representative of anything bigger than themselves. Five customers could be, in the eyes of a founding team that's hoping to validate their exciting new idea, confirmation bias. Five customers could be a cul-de-sac, a dead end, a commanding market share among a tiny group of idiosyncratic buyers. Five customers could be dogecoin: appealing to a passionate niche, useless to everyone else, and ngmi.
Five customers could be forever, or go down in flames.
Over time, reality has a way of revealing itself. The durable successes—the products with real product-market fit—keep selling out. Five customers turns into fifty, and five hundred. Companies no longer have to fight for every sale; they cross the chasm;1 the uphill struggle flattens out and turns into a downhill run.
When products don’t fit their markets, however, the climb keeps getting steeper. Initial customers show up for unique reasons, and their interests aren’t representative of what other people want. The further the company gets from that group, the more it struggles to sell.
The problem is that it takes time to tell which path a company is on. Did the early excitement around the iPhone foreshadow a global revolution, or was it brand loyalists getting overhyped about a product that "will be passé within 3 months"? Were conference calls the next huge innovation in social media, or was Clubhouse just the perfect product for grandstanding VCs and crypto grifters in the middle of a global pandemic? Is an exclusive and minimalist email client with dozens of snappy keyboard shortcuts the future of productive digital communication, or is Superhuman just a status symbol for startup CEOs?
In the heat of the moment, at the beginning of what feels like the vertical leg of the exponential growth curve, these aren’t easy questions to answer—but it is difficult to get a man to take the pessimistic view when his stock options depend on him not taking it. So we often choose to believe that our initial idea was a good one; that five customers is just the beginning; that we can build and iterate and educate our way up and over the mountaintop; that, if demand is slowing or sales take more selling, the market just needs a bit more annealing.
In some cases, the company is right. Its relentless effort works, and it pounds the product and the market into one another. But in other cases, the initial pop of interest was actually the peak, and the company dies with the hammer still in its hand.
Sold, not bought
Over the last several months, the data punditry has shifted from talking about the tools to talking about the work—and specifically, how we work with teams outside of our own. This is important, we say, because we don’t yet have the influence within our companies that we know we could have.
Sure—this is a useful discussion, and a version of a podcast that I’d listen to. But here’s the thing: These aren’t new conversations. We’ve been hammering on this problem for years. Katie Bauer’s recent post about data teams being left on the sidelines is excellent—and depressingly evergreen. Two years ago, Erik Bernhardsson wrote a story about navigating the same problems of being misunderstood, left out, and asked to do the wrong work. The frustrations resonated then as much as now: “It’s so realistic;” “very relatable;” “this is so spot on;” “OMG - its my life!”
Evidence of data teams’ persistent struggle is everywhere. This advice from 2015 to business stakeholders on how to work with analysts could’ve been written today—and in fact was, by David Jayatillake, just two weeks ago. For as long as we've had analytics and BI teams, we've tried to create good processes for people to ask them questions. We’ve shared intake forms. We’ve built products. And yet, most data teams still can’t convince their business partners to regularly use them, and our most common ticket management system is still Slack DMs. We’ve put self-serve interfaces between us and everyone else, and declared it a disaster—no, critical—no, a lie. And we’ve punted on efforts to measure our value to the point that it’s become an inside joke.
In this context, imagine that the data community is the company, and the product that company makes is data teams that operate in the manner that’s popular today—embedded in a business, designed to help people make better decisions. Our business partners who choose to work with us are our customers.2
If we had product-market fit—if stakeholders were enthusiastic buyers of the services we offer—would we still be kept out of the influential rooms we want to be in? Would we have to fight to be heard? Would we have to constantly remind people why our work is valuable?
No. Products that have product-market fit are bought, not sold. And as people’s responses to posts like those from Katie and Erik show, most data teams still have to do an awful lot of selling.
Shepard tones
There are, of course, data teams that don’t have these problems. We have our famous idols, and there are surely scores of other teams who’ve been successful outside of the limelight.
These companies, though, might be our first five customers: the peculiar businesses for whom the current model of a data team works. They might have data that is particularly valuable. They might have executive teams who bought the hype, and now have blind faith that data teams are essential. They might be companies that hired a uniquely talented set of analysts who can make good on promises that most of us can’t.
However, none of these cases are necessarily representative of the market writ large. If we’re only useful to companies with especially useful data, we’re like Superhuman: good for a small and specialized audience; overpriced and unnecessary for everyone else.3 If we need people to be true believers, we’re like the Apple its skeptics imagined: fueled by brand loyalty and not product utility. And if we need our users to be extremely talented, we’re like Clubhouse: great only for people with the right resumes.4
In these cases, user education and a few more features won’t fit the product into the market. The mismatch is more fundamental than that. Incremental iterations may seem like progress, but the motion is a mirage, a Shepard tone that never reaches where it feels like it’s going.
Though I’m not convinced that data teams are that far from finding their way, I think it’s worth assuming, if just for a moment, that they are. What if we’re not prophets in the wilderness, but salespeople selling a lemon? What if the problem isn’t that the market doesn’t understand what we’re offering—it’s that they do, and they don’t want it?
One good idea and four bad ones
The first thing we should do is talk to some product and marketing managers. They’re paid to figure this stuff out, and they can certainly outline a better plan than we can.
But I was a product manager for a minute (it didn’t go well), ran a marketing team once (it went worse), don’t have the creative courage to end a blog post so abruptly (respect), and am a white guy (I’m not qualified, but I got this), so here are four ideas fired off from the hip.
Pay attention to what’s working
In a comment to last week’s post, Kendall Willets pointed out the big X I’ve been standing on but never saw: different types of data work are treated differently. When marketers want to optimize their ad spend, there’s a ton of pull for our services. We can pout about that—“I want a strategic oompa loompa!”—or we can ask why that’s the product of ours that people want to buy.
My suspicion is that it’s because it meets an actual customer demand that we’re uniquely suited to fill. As much as we may hate it, people need data pulls. They don’t need junior strategic advisors armed with spreadsheets and an attitude. If we want to be in the room where it happens, we shouldn’t spend our time trying to sell an unnecessary service to a reluctant buyer; we should spend it figuring out what we can do that would make it necessary for us to be there.
Pay attention to what’s broken
In asking myself if data teams have product-market fit, I kept getting hung up on how we handle work requests. This is a simple task and a solved problem. Huge engineering teams do it for far more complicated projects than we work on. Support teams do it for hundreds of thousands of daily tickets. Why have five-person data teams still not figured it out?
I don’t know.5 But I suspect there’s something interesting about our relationship with our customers in this answer. Just as actual vendors can learn a lot from churned customers and lost opportunities, we can learn a lot from our unexpected failures.
Don’t assume, ask
The corollary to both of these points is that we have to talk to our customers. We have to research them; understand them; put ourselves in their shoes and figure out why they do what they do. When they push us aside, we shouldn’t assume that we’re offering something valuable—strategic advice! Metrics and alignment! Experimentation and the scientific method!—and that our job is to sell it; we should instead ask them why they don’t want it.
The answers may surprise us. Our advice might be bad. Metrics might not be that useful. We might create more disruption and doubt than we do agreement and alignment. If there’s a case to be made for hiring a data PM, this is it—to do discovery, and figure out what people actually want from a data team.
Read the history books
Finally, my least favorite idea: We should do our historical research. Though our technology is new, our job titles are new, and many of us are, in the scheme of things, new, the organizational problems we’re trying to solve are old. We can choose to solve them again, or we can try to learn from our ancestors, and not repeat their mistakes.
Do I know what those mistakes are? Of course not. But I’m sure that prior generations of data teams, BI developers, and IT professionals have tried to sell their services to hesitant customers. And they probably have stories to share.
That sinking feeling
When writing these blog posts or putting together presentations, I usually start with a bunch of mushy waypoints through a loosely formed idea. These aren't bullets of Takeaways,6 but the stories I want to tell, articles I want to reference, petty grievances I want to air—and sometimes, substantive points I want to make. In a post's or presentation's early stages, I'm usually shuffling around paragraphs or slides to see if there’s a coherent way to fit them together.
It doesn’t always take. The story doesn't fit; the slides refuse to give up the fight; I read it over and over and still can’t stand it.7 A sinking feeling of disappointment sets in, and you know what you have to do: concede defeat, kill your darlings, and start over.
For the better part of a decade, we’ve been trying to manifest a narrative about how data teams are critical pillars in modern companies. We sell our ideas to our employers, get frustrated when they don’t buy them, and blame ourselves for not having the influence we feel like we should have.
I’m not sure that’s the right reaction. The problem might not be our sales pitch, but what we’ve been taught to sell. We may be extrapolating too much from our first few customers, and trying to force their story onto a broader market that never wanted it. In which case—let’s shake it off, and reinvent ourselves.
At the risk of turning this blog into a snarky design critique site, what on earth is going on with the signup box at the end of this post? Why is it tilted by one degree? One degree?!? It’s subtle enough that you can’t quite tell what’s going on, other than you’re slowly getting dizzy. It’s the online equivalent of being in a stopped train, looking out of the window at another stopped train, seeing something move, not being able to tell if it was you or them, and then feeling nauseous for the next ten minutes. Except someone did this on purpose.
Just what we need, yet another definition of data as a product to be added to the lexicon.
As it’s now trendy to say, when interest rates were low and money was cheap, it was easy to lose sight of what cost more than it was worth.
Importantly, product-market fit also requires the product to be accessible. People may want to buy a self-driving car, but if it costs fifty million dollars, it doesn’t work in the market it needs to be sold to. Similarly, a surgery that cures cancer would clearly be in high demand, but it doesn’t have product-market fit if it’s so complicated that it can only be performed by a few doctors.
Though I have theories. Perhaps it's because data team work is often seen as "just pulling a number," so creating a ticket seems like more overhead than it’s worth. Perhaps it's because these asks are often offhand curiosities that come up in meetings and not formal requests. Perhaps it's because a lot of our work comes from questions we ask ourselves, and we’re the lazy ones who don’t stamp our timesheets. Perhaps it's because we do a lousy job of showing the value of having a paper trail, so nobody creates one (but shoutout to Caitlin Hudon for breaking this cycle).
My five (for real, not a joke) rules for giving an effective presentation: 1. Think in prose. 2. Go fast. 3. Spend most of your time on transitions. 4. Rhyme off the beat. 5. Don’t listen to other people’s rules for giving effective presentations.
I think the problem is hiding in the idea of an outsourced ‘data team’, separate and independent from ‘the business’. Businesses with a lot of data have for decades had technology teams managing data and tech stacks, and trusted advisors and analysts in their own teams. This model worked well (but variably, as it often depended on an individual) and depended on the tech stack not changing too much (perhaps upgrading the version of Oracle). But these advisors built trust through their relationship with key business figures. They didn’t use a ticketing system, because they knew what was important to their leaders. Their leaders held the information in high regard, because it was delivered by someone who they felt truly understood their problem (and was probably regularly slipped into different projects, outside their area of expertise, e.g., business requirements for rewards programs, or delivering a new market segmentation including staffing sales channels).
We used to deliver customised insights to help an individual make a decision, because we knew how they made decisions. Now we use best practice, design frameworks, the latest shiny tools (where we lose a whole bunch of our time and effort) to deliver insights for a “business”.
The big consulting firms understand this, and while they have some great data teams, results from those are often repackaged in PowerPoint for the senior leaders that hired them. They build relationships with individuals.
Thought-provoking. Thanks for writing this.
From what I can tell, there are some priors you're bringing that you're not being explicit about. It might be useful to see if we agree on what those are before drawing conclusions about the overall post.
1. I think you're largely talking about the concept of a "data team" at a digital native business. The entire conversation about "what is the appropriate role of / interface to the 'data team'?" isn't really being had inside of most enterprises. The enterprises that I talk to have a very well-understood org structure (understood inside the organization, anyway) and I don't perceive any lack of clarity around who is "buying" their services and why. There is no lack of PMF for data in the enterprises I talk to; rather, it's the opposite. That's not to say that enterprise data is some shining beacon on a hill (they have their own problems for sure)...they just don't experience this particular problem in this particular way.
2. I think you're really talking about the "analysis / insights / strategy" function of a data team, not the "pipelining / modeling" function of a data team. I don't think you're making the case that companies don't care about _data_, rather, that companies aren't buying the "data team as strategic partner" model.
If those two things are true, then I agree with you. If not, we can fight about that in some other forum :)
And if those two things are true, then it sharpens the question. The question becomes (I think):
> Why should we expect data professionals, who by definition specialize in technical--as opposed to functional--skills, to know more about strategy in a given functional area than their peers who have actually built many-years-long careers inside of that functional area?
Sure, data people do and should build competencies in the functional areas that they are partnered with. But after spending 7 years as a marketer, I can tell you that it will be very unlikely that a data analyst (however skilled) is going to understand the marketing data they're looking at better than I do. Maybe if they're working in a sufficiently narrow and well-defined problem domain. But if you all of a sudden observe a drop-off in a particular conversion rate, the instincts that will get you closer to an answer _fast_ are marketing instincts, not data instincts. Having access to data to answer this question is critical (and so pipelines and modeling and metrics are critical), but the specific expertise required to diagnose and fix this problem is marketing expertise.
The hardest part about data in a digital native business is to figure out how to get two skillsets--functional and data (technical)--sharing a headspace. Long-lived embeds of data people on specific product teams are how many of the most successful companies have learned to do this. But this team structure is unusual...perhaps because it's more expensive than most companies can afford. These companies do it because their scale and value of data more than justifies it, and because the atomic group--the two pizza engineering/product team--is very receptive to the addition of this person.
IMO there is not a single right answer to this, but there are wrong ones. If you're expecting a really small data team to support all of the functional areas of a business and to be able to offer "strategic insights," that's likely just not a real thing unless you have some really unusually exceptional data people or unless you focus them really narrowly. But IMO that's fine. That just means that data teams need to be really clear about their mandate given the particular context of their business. And analysis / insights / strategy may or may not be a part of it. And that's ok.