The bigger Ponzi scheme seems to be the data platforms that are making millions (billions?) by enabling all these analytics teams, no?
It's on the analytics teams when they fall for the same sales pitch again. And again. And again. No tooling is going to solve tech debt, lack of documentation, and legacy workarounds.
I think this gets at what makes this difficult, which is that "data" isn't quite the same as "analytics," but the two tend to get jumbled up. (I almost called this "disband the data team," and felt like that was too broad.) I'd say that reporting and operational data work is very useful, and a decent amount of the spend on data tools is meant to support that. But not all, and I'd wonder how much people over-engineer their reporting stacks just in case, because they want it to do double duty with reporting and various analytics things.
Yeah, I have to say I found your definition of analytics pretty niche and inconsistent with the bulk of analytics work (data analysis work) that I see done. I feel like the target here is really executive dashboards?
I'm not sure I follow? Broadly, I'm saying you could split data work into three things:
1. Dashboards
2. Largely automated operational stuff
3. Analysis to make decisions
Though my gripe is with the third one, I'm not saying that it's the biggest. It probably is pretty niche?
Oftentimes there are too many people involved to blame Joe/Jill in analytics for getting a new tool.
That is to say, there isn't a need for all of these tools at every org; a lot of this is resume-driven tool acquisition.
Two thoughts while reading this:
1. Similar to your point around the bet, imagine you take analytics engineers and make them CEOs. How would they organize the company? Would they keep things the same or do something different? How different?
2. A while back I listened to this podcast around data at Ramp (https://roundup.getdbt.com/p/ep-47-ramps-8-billion-data-strategy) and it resonated. The big point is to embed data into product teams (Product + Engineering + Design -> Product + Engineering + Design + Data), where you're thinking about the data you're capturing, how you're going to capture it, what you're going to do with it, etc. at the start of the project.
Both of these are getting at the idea that data/analytics is too far downstream and is often too reactive. There's value in the data and the analysis, but it needs to be an actual part of the business vs a supporting function.
Yeah, I'm increasingly becoming a believer in the embedded model, though I might go even further, and say, just build the central platform function or whatever, and leave it up to teams to hire their own people. It's not even embedded; it's just headcount for those teams.
On your first question, I'm curious what you think would happen? My answer is probably kind of boring, which is that they become like every other CEO. Like, from my (limited) experience in similar-ish problems, once you're in that seat, you realize why the prior people in that seat made the decisions they did. It's very hard, when you've got all these other problems to deal with, not to sacrifice your "principles" to get things done. And next thing you know, you've forgotten where you came from and you're just a generic CEO.
Makes sense and I suspect a lot of companies are doing that already. At a large enough scale you have the data platform team but the data engineers still sit adjacent to that. At some point they should be pushed further out and properly embedded.
You're probably right about the CEO behavior, as disappointing as it is. I think you also realize it's not a priority, so you just maintain the status quo.
"imagine you take analytics engineers and make them CEOs"
I think they would do things differently and that the data ship would be more likely to run on time.
That being said, I think the analytics engineers, business analysts, data product managers, and data scientists would each have different takes.
AEs likely more focused on specific metrics & steady-state optimization, BAs more focused on causal data stories, DPMs more focused on local empowerment of decision-makers, & DS more focused on predictions & analytical rigor.
Honestly, I don't know.
Or to put it another way, I have seen analytics-driven decisions & companies that are systematically good at this. That type of model is tied to elite consultancies, but also shows up in day-to-day processes like Industrial Engineering & FP&A. It is part of the secret sauce of certain firms.
But I really don't know how to evaluate your approach. As in, if analytics is a sham, then all of business is a game of blind luck, suggesting that there is no "goal". There are just tasks, stupidly decided, that win on sheer luck.
Just a follow-up, is it also possible that analytics struggles due to being overly tied to data-technology, and that many companies are at atrophying levels of competence?
Ah, yeah, these are good questions. I think this is how I see it:
1. I'd make a distinction between (1) basic reporting and BI, and the various operational stuff that goes into industrial engineering, operations research, FP&A, and algorithmic stuff like Google ads, and (2) "analysis to make decisions."
2. I think (1) can clearly be very valuable. It's better at some companies than others, but I wouldn't argue generally against any of that.
3. (2) *can* be valuable, if you have the right problems to apply it to and the right talent to do it. But those things are pretty hard to come by.
4. But, because some companies are good at (2) and we all want to be like those companies; because (2) is adjacent to (1) and we kind of muddle them together; and because new tools made (2) a lot easier, lots of companies started getting into it. And that hasn't gone so great.
Ok, I feel like "basic reporting & BI" is really (1), and IE, Operations Research, & FP&A may be closer to (2), but potentially also a (3) that reflects a more DS-centered view. "Algorithmic stuff" in ads may either be (1) "I just pick top leads per click", or (2) "I run an ads strategy based upon data", or even (3) "I build a custom algorithm for my purposes".
I think (1) definitely can be very valuable. I think the value gets overstated at times, but a lot of it focuses on "What is going wrong?".
I think (2) is actually where most of the value is, but it is also beset by multiple problems, including talent, problem selection, and organizational politics. Solved problems tend to be owned by the team solving them. Unsolved problems may disrupt multiple ownership lines.
I agree with the final statement. And it is a large problem. And I don't quite know how to solve it. I expect Business Analysts are supposed to be the solvers, but if they're mediocre at business, or data, or even communication, or even facing unhealthy politics, the entire effort can be waylaid. I think where FP&A has tended to be successful is because they are intrinsically connected with a critical function that requires judgment.
I think (1) is very hard, but probably the easiest as it is the most factual. (2) is even harder as it is multi-disciplinary, and can be challenged with politics. (3) is also very hard as it is potentially more multi-disciplinary, and can be challenged with aligning with business value.
Hopefully this is giving some clarity?
Yeah, that makes sense, and I think I generally agree? I agree with you that (2) could be really valuable, but fundamentally, it's just hard to do. Which makes sense: it's the least well-defined problem, you have to have all these inputs, and all of that. It's maybe akin to being a doctor - someone walks in, says here are my problems, and you have to piece together a bunch of deep domain expertise, analysis of the situation, and some interpersonal stuff to figure out what to investigate, much less to figure out what to do. That's just...hard? But it sounds so appealing, so we say let's all go do it. And then many of us just can't.
I think I am on the same page, and I think it is weirder for us today because analytics as a practice was founded by consulting.
Frederick Taylor (Industrial Engineering) & James McKinsey (Managerial Accounting) were the two people who really did the most to expand the "data" scope.
The issue is that most of us are not consultants, but consultancy is what established the beachhead, and the technical discoveries since have just tried to systematize it. As in, SQL is not new, but it's definitely newer than Shewhart control charts & t-statistics.
Maybe that's ultimately the weird thing here actually - analytics as it is today seems to trace its lineage through data and IT worlds, and not consulting and operations research worlds. Even though the latter is probably where the real roots are.
Wonderful insights presented in a wonderful way, as always. I think a lot of companies centralize advanced analytics for too long. "Draining" their centers of excellence can unlock a lot of value. The International Institute for Analytics wrote about this 3 years ago: https://iianalytics.com/community/blog/aspire-to-decrease-the-size-of-your-central-analytics-org
Thanks! And I might go even further, and say just drop the whole idea, and let teams hire analytics skills as they need them. The hub and spoke model is still an analytics team to me; it's just one where analysts are each dedicated to one department. Which, maybe, but I'm not convinced we even need that? I'd almost argue for having the centralized team do the reporting and infrastructure work, and letting other teams decide how they want to hire to make use of that on their own.
Love this and it's a super important take right now. I'm completely onboard with "less numbers" and FWIW I believe there's a real case to be made here:
Exact numbers are the way to optimize machine decisions in ML. As you point out, human decisions aren't number-based. I'd wager they're story-based (good gossip is a good story). To survive / thrive, analytics has to become the story, meaning-making apparatus. Less dashboards, more engaging content. A clear narrative is more important than precise numbers.
This is the path to deliver on our promises without magic or messiah. Good stories create alignment, clarity, confidence. Let go of the pretense of empiricism and take responsibility for qualitative data (customer interviews) for good measure.
So this comment gave me the idea for the post I just put up about this on LinkedIn (link below, v cringe). Which is basically: I think we always talk about the importance of stories and all that, and I've always been a pretty big believer in it. But maybe that's not that useful? Like, rather than trying to be persuasive (which is hard), maybe it's better to define the facts about how people see the world? TikTok, for example, is far better for political persuasion than politicians, because *arguments* don't really convince people. Changing the facts that people believe are reality *does.* I feel like we've been playing the persuasion game, and we might be better off playing the fact game.
https://www.linkedin.com/feed/update/urn%3Ali%3Aactivity%3A7178063028255358976/
I have to agree. Perhaps it's semantics. To me, a story is the most efficient way to communicate information about the world between humans. It's designed to determine what "facts" we believe in.
The key is that numbers or persuasive language aren't enough on their own. However, engaging the audience with a persuasive narrative that is informed by the numbers may just work. It may be naive, but it may also be a more efficient channel for information to flow and influence how the organization behaves.
For sure; stories are definitely good at that. I think the nuance to me is the classic thing about the best way to get someone to change their mind is for them to think it was their idea to do it. For a long time, I've thought of stories as needing to be persuasive and to advance my opinion, rather than thinking about them as presenting a reality that will lead people to believe something on their own.
The data spice must flow!
Well, right, the secret sauce needs to be frameworking + application of logic.
Well I just feel depressed now
I will say though, there is an upside to this, which I meant to include but forgot. Which is, I don't think that all analysis is useless; I just think it's really hard to do it well, so it doesn't scale that well into being a mainstream thing everywhere. But, that means if you are good at it, you can still be very valuable.
Ah... Ok, this makes me feel better. Though it feels like there may be some irony in you forgetting to include the upside... Perhaps a symptomatic issue amongst data professionals 😜
It was more that I'm usually in some panic on Friday morning to get the thing out, and the whole thing ends up crash landing at about 70% done. There are a lot more "oh oops, I forgot that whole thing" moments than I'd like to admit.
Very provocative. I do think data teams need to focus less on data and more on narratives and collective sense-making. That's more about humans and communication than tech and data.
Kinnnda? This is a new thought, but I think there's some nuance there. Where it's less about narratives and more about...reality creation? So just like, facts?
This came up in this thread: https://benn.substack.com/p/disband-the-analytics-team/comment/52444178
So I kind of love the idea of adversarial analytics. It's like red teaming your decision making process. If the CEO directs the data team to lie to marketing about their numbers for the month, what would the marketing team do? Do they freak out and go panic mode and it triggers a ton of politics? Or do they fall back on what they control, assess their inputs, get curious about why, and figure it out together.
This reminds me of Chaos Monkey from Netflix (https://netflix.github.io/chaosmonkey/). Is the ideal for any team in an organization to be resilient to bad data and analytics? That would be a good thing, right? Everyone wouldn't immediately trust the charts, and they'd dive in and take a breath before going crazy...
This also reminds me that a good way to cut random spend in an org is to stochastically cancel corporate credits to see who and what complains!
We've definitely done some version of that with dashboards and stuff, where we just shut them off and see if anything happens. The results have never been terribly encouraging about the value of most of our dashboards...
I kind of like the idea of going even further though, and start introducing things that are wrong. Like, for every non-financial metric that you've got to report to the SEC or whatever, some days, you just futz with the number by a few percent. We don't tell you which days. On one hand, that's dumb and crazy. On the other hand, it might be like rounding, where it teaches people to look at trends and relative orders of magnitude, and not care that much about precision.
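To be concrete about it, here's a rough sketch of what that kind of fuzzing could look like - the metric name, the share of fuzzed days, and the ±3% band are all made-up parameters here, not a real proposal:

```python
import hashlib
import random

def fuzzed(metric_name: str, date: str, value: float,
           fuzz_probability: float = 0.2, max_pct: float = 0.03,
           secret: str = "rotate-me-quarterly") -> float:
    """On a secret, deterministic subset of days, nudge a non-financial
    metric by up to a few percent; on every other day, report it exactly."""
    # Seeding from (secret, metric, date) means the same day always fuzzes
    # the same way, but nobody downstream can guess which days those are.
    seed = hashlib.sha256(f"{secret}|{metric_name}|{date}".encode()).hexdigest()
    rng = random.Random(seed)
    if rng.random() < fuzz_probability:
        return value * (1 + rng.uniform(-max_pct, max_pct))
    return value

# Hypothetical metric: some days this is exact, some days it's ~3% off.
print(fuzzed("weekly_active_users", "2024-03-29", 18240.0))
```

The point of seeding it that way is that the fuzz is reproducible internally (so you could always unwind it), while staying unpredictable to anyone reading the chart.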
For 20 years I have thought data quality as a solved problem is right over that next hill. Rinse and repeat ♻️. Great post.
Thanks! And that was very much the original spark that inspired this post. I saw one article about data quality, and another about data teams getting a seat at the table, and was like, haven't we been saying this forever? At what point do we start asking what happens if this never gets better?
There is a challenge in that in certain places data quality is asked for by regulators:
https://en.wikipedia.org/wiki/BCBS_239
Finance is a clear example, but there are other areas where it is supposed to matter a lot. (But I know "supposed to" is very hypothetical, and all of this is less about adopting the latest tech.)
Thanks Benn, I always love your thoughts. In this case, and I'm just spitballing here, you could focus on creating a mature analytics function within your organization that actually makes a difference. We created Leading in Analytics to help you do just that. Go ahead and search for it... Scroll through the sponsored ads, of course.
Thanks, and thanks for sharing! And yeah, that's something that I may have under-emphasized a bit too much here: I do think it's possible to get very good at this if you invest in it, have talented people, etc. It's not to say that it's all impossible. It's just very hard, and I think most people give up before they get there. Like, we all probably *could* become good pilots, and then we could have lots of small Zipcar-like private planes that people could use to go places, and it would be way more convenient (and maybe environmentally friendly) than us driving everywhere. But flying is just too hard for that to be realistic.
Great writing ✍️ 🙌
Thanks!
THIS 👉 "Because after flirting with a life of crime, we might need to start keeping our backstory a secret, and learn to blend in with everyone else."
I have spent time as an undercover data agent in marketing, operations, logistics, product... Now that I'm spending most of my time (for the first time) leading with just the "data" play... I'm seeing the gaps more clearly.
Specifically in SaaS, I would love to see more startups start with the use-cases first. Seems obvious, but most people don't seem to start there...
I will say, this does seem like it's becoming a bit more of a trend. I know a handful of data people involved in startups who are currently working on ideas like "Mixpanel for X," where they want to build an opinionated data tool for a particular vertical. No idea if they'll work, but, that's probably another constant pendulum - bundling to unbundling and back; generic to specific and back.
A few years from now people will complain about too many tools and the need to consolidate. Then they will trend back toward generic un-opinionated.
We're a fickle people.
Love your articles, usually read them twice, something unusual in a world drowning in content.
Would the following classification make sense:
- operational data work: real-time (or near real-time) automated systems based on data. Usually containing some ML component, but I would include a simple counter as well. Here we see a lot of value (ads etc).
- tactical data work: human-facing presentation of metrics (in e.g. dashboards or reports) at a slower cadence. Basically checking the "known unknowns". I have seen a lot of value in catching and root-causing regressions, or just aligning people around KRs.
- strategic data work: diving into the "unknown unknowns" by coming up with "insights" that fundamentally change the whole business model. This is hard, and it is almost impossible to predict any outcome. Therefore, the value density per working hour is pretty low.
All three pillars are usually covered by the same team, mostly because they share some tooling and maybe also thinking. My understanding is that everybody wants to work on the important "strategic" stuff, and therefore the third pillar raises a lot of expectations. At some point I went looking for strategic "insight" stories with real impact published online, but could not find much, just a lot of corporate noise.
Thanks, glad you like it!
And yeah, that's pretty much how I'd put it. The only nitpick is I don't think operational work is necessarily real time or uses ML. I think the only really important thing here is that it's automated. A marketing drip campaign that sends people emails periodically about what's in their shopping cart would fall into this category for me, even though it's neither ML nor real-time.
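In code, that kind of operational work can be as unglamorous as a scheduled batch job. A minimal sketch, with a hypothetical `carts` table and a stubbed-out email sender:

```python
import sqlite3
from datetime import datetime, timedelta

def send_reminder(email: str, item: str) -> None:
    # Stand-in for a real email service; here it just prints.
    print(f"Reminding {email} about {item}")

def abandoned_cart_job(conn: sqlite3.Connection, older_than_hours: int = 24) -> None:
    """Run periodically (e.g. by cron): no ML, not real-time, just automated."""
    cutoff = (datetime.utcnow() - timedelta(hours=older_than_hours)).isoformat()
    rows = conn.execute(
        "SELECT user_email, item_name FROM carts "
        "WHERE updated_at < ? AND reminded = 0",
        (cutoff,),
    ).fetchall()
    for email, item in rows:
        send_reminder(email, item)
    conn.execute("UPDATE carts SET reminded = 1 WHERE updated_at < ?", (cutoff,))
    conn.commit()
```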
But yeah, I think you're exactly right - the same team usually does all three (kind of by default, though it's not strictly necessary), and usually wants to do the last one. But as you say, there's a lot of noise in that pillar, it's really hard to get right, and it's rarely where the team delivers the most value.
Absolutely, I guess "either real-time or batch, whenever small decisions need to be made fast or at scale" is the better classification for operational.
LLMs have mastered the craft of creating SQL queries, Pandas scripts, and graphing analyses.
I work at a company where Data plays a service role. In that capacity, they're far too distant from the problem to come up with unique angles or ideas on what to dig into. PMs, Ops, and Revenue can now take on 90% of those responsibilities with a little additional effort.
In the near future, this skillset should be table stakes for any position hired at a technology company. Data will be relegated to a cost center like IT, where they conduct tasks like setting up experiments, sharing reports, etc. To prevent that, BA needs to be the closest to the problems to ask the right questions.
I don't know that I agree with the premise (in my experience, LLMs don't write good SQL at all), but I roughly agree with the conclusion, that people who specialize in data analysis might not be all that valuable on their own. It seems very possible that the gap they have to make up in not knowing the context of the problem is just too big (and other people are often good enough analysts) to justify the cost.
I've been toying with the idea of an analytics sous-vide model for a bit and I wonder what you would think about it. A largely distributed model with a 3ish month rotation into the "hub" / COE so you can stay up to date on larger company context / methods and hopefully break down some of the tribalism that I feel like builds up. Maybe the scale you need to achieve for this to work makes it impossible, but I definitely agree that treating this like an independent function and not a good-to-have skill is aging poorly.
I'd be skeptical, to be honest? A lot of centralized data teams say that one of the things that they're good at is connecting dots across the business. But in practice, that doesn't really seem to happen.
More generally, I guess the question is, would a rotational program be the best of both worlds, where people develop some domain expertise while staying true as "analysts," or the worst of both worlds, where they never get deep enough in the department to be useful? I could see it going either way, though if I had to bet, I'd probably bet on the latter.
Yep that’s totally fair. I think organizations should be prepared to encounter tribalism and cross purposes at a certain stage, and have a plan for how to navigate those situations. It just feels like identifying conflicting incentives is easier and more enlightening than conflicting SQL logic.
I think the ultra-distributed approach likely requires excellent department-level leadership, and analytics folks would rather have control over their own destruction than have it dictated to them lol
Yeah, that's a good point - it doesn't really matter what the team structure is if you can't hire people for that team, which creates some very real constraints, regardless of what structure might be theoretically ideal.
20-some-odd years of working in data, and it has been only in the last several that I've extricated myself from the church of "data is everything and all things" to realise that at least 90% of everything in this world comes down to the messiness and complexity of people. (Not an exact figure, but that's the point?) I think this is something that a few voices in the field, or data scene, or whatever we want to call this collective, have touched upon, but it doesn't get a loud enough voice. On the one hand, I think it would save us a lot of trouble, as there's no longer the need to quibble around ridiculous levels of precision; but on the other, it's having to chip away and break free from decades of data rhetoric we've been taught to believe.
Yeah, I very much agree. I think the current trend is to say stuff like "it's not about tools! It's about people!", which, true, but most of those critiques seem to be saying we can't just throw dashboards at people; we have to persuade them with data instead.
But I think you're right: it's deeper than that. The problem is the rhetorical nature of data itself. We see it as an argument-ender. There's been this kind of cultural capture by data, where it's put on an unassailable pedestal. Which, sure, yes, science is good and all of that, but we aren't really doing science. We're instead just arguing with numbers. And if everyone does that, it's not useful; it's an arms race to who can sound the most quantitative.