“Am I the Jared Kushner?”
Why analysts should do less analysis. Also, content warning: A picture of Jared Kushner.
I don't use Reddit. But every once in a while, something leaks out of its fermenting grounds into one of the clearance bins of cheap content that I frequent. Usually, it’s an “NBA” “highlight”, some meme that Elon Musk stole, or—relatedly—the question, "Am I the Asshole?"
r/AmITheAsshole, or AITA, is a subreddit where people explain some interpersonal conflict that they’re currently dealing with, and ask if they’re the person who’s being a jerk. As best I can tell, the canonical form of this is a man asking the internet to tell him that the terrible thing he’s doing to his girlfriend isn’t terrible.
To the best of my knowledge, Jared Kushner has never posted on AITA. There is no need for him to; he is, rather obviously, an asshole. He is, however, one of a particularly refined form: Superficially educated, supremely confident of his own middling abilities, and willing to inject himself into any problem because, he assumes, his intellect and level head will cut through the petty emotional arguments of those on the ground. He’s Clark from Good Will Hunting, with worse hair. His purest form1 is a man who reads two dozen books on the Middle East,2 and sets off to solve a fifty-year-old conflict by, first and foremost, ignoring history.
It’s a pretty lousy person to be: A drive-by expert, armed with cursory research, data, and “first principles,” here to tell everyone else that they’re too irrational to understand. Don’t be, I think we can all agree, the Jared Kushner.
But for those of us who work as analysts, are we so sure we aren’t Jared Kushner? And more acutely, are we so sure our entire job—deploying ourselves into other people’s problems, armed with an objective eye and empirical coolness—isn’t to be a (nicer, kinder) Jared Kushner?
Take, for example, Katie Bauer’s excellent post on how data teams can elbow their way into positions of influence. It’s good, practical advice, from a qualified and compassionate source. Yet, it isn’t that far from recommending we all act as (nicer, kinder) Jared Kushners, showing up in conversations we weren’t invited into, on the belief our skills and our spreadsheets bring something that was heretofore missing.
To be clear, this isn’t an indictment of the post or of Katie; she’s one of the best out there, and her advice is as close to canon as we data people have. Instead, it’s an indictment of the nature of our job (or at least, flagging that nature as a person of interest). Because if we anonymize the hypothesis—instead of Jared Kushner showing up with research and quantitative reasoning to help make a decision, it’s someone more banal—I’m not sure that we wouldn’t say that that intervention is Right, Good, and, in fact, the very thing we are paid to do.3
So, are we all analysts, or are we all the Jared Kushner? What do we do with this new bit of data team existentialism?
An article a day keeps Jared away
Like AITA, AITJK prods at a fuzzy line. There is value in new perspectives. Experts aren’t always right; people get trapped by groupthink, and, perhaps even worse, get enamored with their own ideas.4 The New York Times’ Spelling Bee game lets you shuffle the letters for a reason5—sometimes, the best ideas come from fresh eyes and a new perspective.
On one hand, then, perhaps being a (nicer, kinder) Jared Kushner isn’t actually such a bad thing. Out-of-the-box ideas—and a bit of bedside manner—could go a long way.
On the other hand, I think it’s worth thinking a bit more deeply about why we flinch at Jared Kushner skydiving into various problems that have been worked on by experts for years (or centuries). Because if I’m honest with myself, reading 24 books on something does feel like a pretty comprehensive crash course. Graduate students read about two hundred pages a week; if the average book is three hundred pages, 24 books is 36 weeks of classes. That’s not so bad!
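For the curious, the back-of-envelope math checks out. This is a minimal sketch of it; every number is one of the paragraph’s own assumed round figures, not real data:

```python
# Back-of-envelope check of the "24 books ≈ a graduate course load" claim.
# All constants are assumed round figures from the paragraph above.
PAGES_PER_BOOK = 300        # assumed average book length
BOOKS_READ = 24             # "two dozen books on the Middle East"
GRAD_PAGES_PER_WEEK = 200   # rough graduate-student reading load

total_pages = PAGES_PER_BOOK * BOOKS_READ            # 7,200 pages
equivalent_weeks = total_pages / GRAD_PAGES_PER_WEEK

print(equivalent_weeks)  # 36.0 — roughly two semesters of coursework
```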
What does seem bad, however, is how it’s consumed. To understand something as textured as the Middle East, you can’t shotgun a bunch of textbooks; you have to steep yourself in it. You have to digest it, day in and day out. A crash course, no matter how intense and thorough, would be a poor substitute for just reading the news for twenty years. Unstructured immersion, I’d argue, is a far better teacher than a rigorous sprint.6
This same principle can be applied to data teams and their roles within their organizations. We often act as though the most impactful thing we can do is sit alongside other departments and support them in the decisions they have to make. Dashboards and reporting are busywork; strategic research is our highest calling.
This work might feel helpful. It might even feel heroic. People may even be politely thankful and gracious for our support. But when they don’t listen or don’t act on our recommendations, it may be for the same reason that career diplomats roll their eyes at Jared Kushner: Our involvement just isn’t that useful, and we’re mostly saying things they already know.
Fortunately, this potentially suggests a better way forward. Rather than trying to help with explicit decisions, what if we focused most of our efforts on making our companies as informed as possible? What if our mission was to make everyone as aware of the shape of the business—of how it’s growing, of where it’s struggling, of what this user segment thinks about that product, of how the market is shifting—as the CEO? I’d argue that a well-informed company, defined by this kind of atmospheric awareness, would make far better decisions than one supported by even the sharpest consultants.7
Decision-making dies in darkness
Importantly, this isn’t a suggestion to build more reports and dashboards. Dashboards lack narratives, and those narratives are often more important than the metrics themselves. Memorizing employment numbers tells you a bit about the economy; reading monthly commentary on those numbers tells you a lot.
Moreover, reports and dashboards are often targeted and pulled based on need, rather than broad and pushed out to passive readers. A product manager might look up how some feature is used by different user personas, but a sales rep likely won’t. Leaning on dashboards to keep people informed is like leaning on Google to understand the news: It can be very effective for the questions people ask, but only the questions they ask.
So what do we do instead? Stephen Bailey gave us the solution a year ago: an actual newspaper. Instead of being researchers and scientists who make the news, data teams could be journalists and editors who report on it. Though they would still work alongside other teams to make decisions, their primary role in that process would be to publish what happened. It would be to keep people informed about how the finance team is currently modeling churn, about an unexpected dip in an operational KPI, or about an interesting finding that a CSM uncovered when researching one of their customers. And each update could be put in context with the others, and with the trends across the company.
In one uncomfortable twist, this sort of data team would probably have to focus on the quantity of what they published, and not the quality. Reading the news works in the same way that pointillism works. No single element matters, but viewing them in their totality, even if lots are missing, paints a very clear picture.8 Keeping people informed about a business is likely best done in the same way: Constant and steady updates—that are stories, and not just readouts of the same metrics—delivered with the assumption that nobody reads everything.
This type of data team would also solve two other nagging problems. First, it works better in a post-AI world. If ChatGPT is the final form of self-serve BI—which I think it will be; more later—people won’t need a (nicer, kinder) Jared Kushner because they’ll be able to do their own research. But, one downside of replacing both dashboards and analysts with chatbots is that people will get even more isolated in their news bubbles, and what they know will become even more dependent on what they ask. Without curation, publication, and distribution, this could, counterintuitively, make a company less informed. The data news organization solves this.
Second, the mission of creating a well-informed company comes with—at long last—an easily quantifiable way to measure success: a news quiz. Data teams could publish a weekly quiz about recent updates, about the current trends of key metrics, and about the various operating assumptions that exist around the company. The goal of the team is, above all else, to improve quiz scores.
Would it work? I’m not sure—but if I had to bet on a race between a company that was broadly uninformed but well-researched on a few key decisions, and one that was well-read about its business but had to make most of its decisions on that awareness alone, I know which one I’d take. And if nothing else, the second company would have a lot fewer Jared Kushners.
AITJK, industry edition
One could argue, without too much strain, that the data industry itself has gone full Jared Kushner recently. Over the last ten years or so, we've simplified so much data technology that becoming an analyst or data engineer no longer requires any technical abilities beyond a basic working knowledge of SQL. This opened the industry’s doors to a lot of people—me very much included—who could work their way up their career ladders while completely ignoring the history of what came before them.
That’s not necessarily a bad thing; new perspectives, even if they come from vantage points that can’t see the past, can bring a lot of new and interesting ideas. But it’s an interesting dynamic, and one of the reasons we keep papering over old concepts with new branding and calling ourselves inventors.
AITJK, personal edition
One could also argue, with even less strain, that I am a Jared Kushner, hectoring the internet about subjects I’ve Googled about twice and read about once.
To which my defense is…uh, ok, never mind, that’s fair.
Which, honestly, I doubt. He probably listened to 25 YouTube talks about books on the Middle East while riding a Peloton.
For example: In 2014, we hired our first marketer at Mode, who’d been thinking about building and marketing products for premier brands for nearly ten years. I, on the other hand, had been working on Mode for nine months, most of which I’d spent writing decidedly unprofessional articles for our corporate blog. My knowledge of marketing was entirely as a consumer of consumer ads: I liked some Nike ads and that first iPhone launch seemed good.
When she joined, she told me that the future of marketing was content marketing. I was, as analysts are often taught to be, skeptical. Wouldn't people see through it? Didn't we need to sell the product more directly? Couldn't we just launch Mode on a stage in Cupertino and then make cool ads with Tiger Woods? How would we measure the success of content campaigns that were meant to increase people’s ever-immeasurable awareness of our brand? (In 1944, the spy organization that eventually became the CIA published a guide for how to destroy organizations from the inside, called the Simple Sabotage Field Manual. If the CIA were to create one for data teams, constantly demanding that we need a better way to measure success would be on the first page. Also, this is apparently an entry in the actual guide: “Talk as frequently as possible and at great length. Illustrate your ‘points’ by long anecdotes and accounts of personal experiences.” Anyway, back to my 300-word footnote about something that happened nine years ago.)
I saw these questions as my job. In hindsight, I’m not sure if they were or not. But they didn’t make Mode any better at marketing, didn’t uncover anything our head of marketing didn’t already know, and, perhaps most of all, definitely didn’t make her life any easier.
Like, you know, people who write blog posts that mostly link to their own blog posts.
This, I think, will be one of the problems with ChatGPT and “SynthAI.” Reading the CliffsNotes is a poor substitute for reading the book, even if the CliffsNotes have all the ostensibly important points.
There’s actually another problem with outside experts, which is that they can often make bad ideas sound like good ideas. Analysis can be like debate, where the style of an argument matters more than its substance. Quantitative rhetoric like this can be quite convincing whereas our gut feelings are not, even if the latter is often rooted in something more “real” than the former.
I’ve seen this work in corporate settings. When I was working at Yammer the company, we used Yammer the product—which was basically Facebook at Work before Facebook at Work existed—for all of our internal communication. It was the most well-informed company I’ve ever seen. All work conversations happened in a handful of feeds, which people regularly browsed. Because they were threaded by default (unlike, say, Slack), you could easily scan the feed; because popular threads bubbled to the top of the feed, you were much more likely to see important stuff. It created the same awareness about the company that Facebook did about your friends in the early 2010s: Somehow, without even trying, you knew who had recently gotten married, or had a kid, or had transformed into an internet lunatic who was paranoid about chemtrails.