When you think about it, it's not too different from roads, bridges, electricity, trash collection, etc. Those things are far more valuable than other, more "interesting" things in our lives, but we rarely stop to think about how awesome and important they are.
I agree that people want to see pretty basic numbers to answer the question "what is going on?" But often that is a super complex question to answer, as you noted above. It takes serious investment to keep those answers clear and correct. Think about how much money is spent by public companies on their financial reporting. They have to answer pretty basic questions like how much money they make, where that money comes from, and what they spend it on. The answers have to be clear and correct because it's the law.
I'd also add that while the "ah-ha" moments and "insights" do happen, they will happen more often if there is significant investment in making sure the fundamental boring questions are answered correctly.
For sure. I'd originally planned on having a whole section in this about the difference between easy and complex questions (it ended up not really fitting, so I cut it). To me, the thing that's often most valuable is making it simple to answer questions that are easy to understand but complicated to answer. For example...
- How much money do we make every day? It's very easy to understand what that means, but it's complex to figure out (there's a rough sketch of that complexity just after this list). If we can make it easy to answer, we've probably done something really valuable.
- What's our aggregate customer health score? That question isn't actually very easy to understand, which typically makes it a lot less useful to answer.
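To make the first bullet concrete, here's a rough sketch of the kinds of decisions hiding inside "how much money do we make every day" - which timezone counts as the day, how refunds are treated, how currencies get converted. This is hypothetical pandas code with made-up column names and rates, not anything from a real pipeline:

```python
import pandas as pd

# Hypothetical orders table; column names and values are made up for illustration.
orders = pd.DataFrame({
    "ordered_at": pd.to_datetime(["2023-03-01 23:30", "2023-03-02 01:10"], utc=True),
    "amount": [120.0, -40.0],          # negative rows are refunds
    "currency": ["USD", "EUR"],
})

# Decision 1: whose "day"? Convert UTC timestamps to a reporting timezone.
orders["day"] = orders["ordered_at"].dt.tz_convert("America/New_York").dt.date

# Decision 2: do refunds reduce the day they happen, or the original order's day?
# Decision 3: convert everything to one currency (and rates themselves change daily).
fx_to_usd = {"USD": 1.0, "EUR": 1.08}  # placeholder rates
orders["amount_usd"] = orders["amount"] * orders["currency"].map(fx_to_usd)

daily_revenue = orders.groupby("day")["amount_usd"].sum()
print(daily_revenue)
```

Every one of those decisions is easy to get "correct" in multiple incompatible ways, which is exactly what makes the question hard even though it's easy to ask.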
Anyway, there's not quite a clear and coherent idea in there, which is why I cut it from the post.
As a data scientist at a large and fairly old company, I can say the data reporting team is much larger and more respected. They build the reports that business stakeholders look at every day and make decisions from. Most data scientists would dream of having that much influence and "impact." But backwards thinking leads us to see more well-established data tools as less innovative, and therefore less valuable.
The most common insight I get from data is that it isn't as good or useful as I thought. As one dumb example, I checked a couple columns for missing data and found none. Great! But then I dug one layer deeper and found that 30% of the values were the empty string "". It can be rough being the first person to really look at data that everyone believes is a goldmine.
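For anyone curious, the check itself is trivial once you know to look for it - something like the following, with a made-up pandas column standing in for the real data:

```python
import pandas as pd

# Made-up example: ten values, no true NULLs, three empty strings.
df = pd.DataFrame({"referrer": ["google", "", "ads", "email", "", "google",
                                "ads", "", "email", "direct"]})

# The naive "missing data" check comes back clean...
print(df["referrer"].isna().mean())   # 0.0 - looks perfect

# ...but counting blanks tells a different story.
blank = df["referrer"].str.strip().eq("")
print(blank.mean())                   # 0.3 - 30% effectively missing
```

The hard part isn't the code; it's knowing that "no nulls" and "no missing data" are not the same claim.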
The "more respected" part is what's most interesting to me. It feels like in Silicon Valley, people appreciate this kind of reliable reporting, but in the way they appreciate a good long snapper in football - there's never any glory in it, and doing a perfect job is seen as doing a passing job, and doing a passing job is seen as failure.
As a fellow data practitioner turning into a leader, I wholeheartedly agree that all the hard work that goes into establishing accurate and resilient reporting needs to receive more praise from sponsors. Part of the problem is that data teams themselves often see this as table stakes and opt for sexier things to pitch as their value prop for the business. However, reliable reporting gives that analytics maturity pyramid a solid foundation that enables building more advanced analyses: including insights, predictions, and perhaps prescriptive actions (one day we'll trust them?).
If a data team is candid with themselves about where they sit on that pyramid, they should have a much better shot at adjusting their priorities and delivering the right message to their clients. I'd be happy to see every part of the climb up the analytics maturity ladder receive enough attention across the org, but it does require work on both ends.
Those actionable/proactive/(3rd buzzword) insights, sales forecasts, and health scores are great, so long as they are not built on the quicksand of faulty data and reporting.
So I generally agree, and used to think exactly the same thing, but I think there's somewhat of a trap hidden in that mindset. If we put reporting and "telling people what's happening" at the base of a pyramid with something above it, we inevitably start seeing reporting as a stepping stone, as lower value work, or something to get done so we can get to the more important thing. It creates this sense that we're working towards something else, rather than reporting being the job. And if we do that, I think we end up in the place where we are today, where that work isn't seen as valuable as it should be. It's something we want to get past, not something we want to do.
Fair point, Benn. I like how a pyramid allocates progressively less space (time, resources, impact?) for each level as it goes from the basics (applicable for everyone) to things that matter for a much smaller group. What I don't like is the inherent message that things at the top are better / more valuable compared to stuff at the bottom.
Perhaps a flywheel model would do a better job here: solid data/reporting at its core, enabling and amplifying more advanced analytics, which informs better reporting. Rinse and repeat.
I echo that for most businesses, maintaining and enhancing the analytics core brings the most value to the table and so it should be recognized as a key contribution of the data team. The big question is how to make this message resonate with people who are prone to fall prey to all those sexy ML demos.
Yeah, there will always be that, I imagine. Someone will always give the crisp demo about how all of your self-serve needs are satisfied with AI, and how you can use this auto-forecasting ML tool to proactively make decisions, and so on.
Twyman's law: Any statistic that appears interesting is almost certainly a mistake.
I feel like a lot of data work is a random search for surprises, performatively using the most complicated tools available, in an organization that isn't prepared to evaluate them once found.
Lately I've been reading Kohavi's book on experimentation, which is actually a deep dive into how hard it is to make data-driven decisions. There's a whole hierarchy of evidence and different types of metrics. The org has to be prepared to discard projects that don't test out, for instance, and most Aha! moments have to be followed by months of experimentation and analysis.
What you're probably talking about with reporting is guardrail metrics, which mainly tell us if things are working properly. My 90/10 rule is that 90% of a data system is baseline reports like this, to keep things on track for the 10% which is actionable insights or just higher-level analysis like ML.
There's an interesting question in there to me, about how we want to find interesting things when we do explorations, deep dives, etc. Most of those projects are, in some senses, fishing expeditions for interesting things.
Which sounds bad, though as I type that out, I'm not so sure it is? Interesting and unexpected things can often be useful, even if they aren't the thing we set out to find. In other words, fishing expeditions sound kind of like a form of p-hacking, where we're always looking for the significant result rather than actually testing a hypothesis. But if that interesting thing is real (ie, it's not just noise), that could still be a good thing to know, even if it wasn't what we were initially hunting.
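As a toy illustration of the "is it real or just noise" part: if you fish through enough pure-noise metrics, a few will always look significant. This is just the standard multiple-comparisons problem, simulated with hypothetical numbers:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 100 made-up "metrics" that are pure noise: no real difference between groups.
hits = 0
for _ in range(100):
    a = rng.normal(size=200)          # group A
    b = rng.normal(size=200)          # group B, drawn from the same distribution
    _, p = stats.ttest_ind(a, b)
    hits += p < 0.05

print(hits)  # typically around 5 "interesting findings" that are pure chance
```

Which is why the interesting thing only becomes useful once someone actually confirms it's real, rather than the best-looking of a hundred random slices.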
That's a bunch of underbaked ideas, but might be something worth thinking about more.
Cheers to the data team members making insights possible! They are highly valued everywhere I've been. Insights & better decisions are why you have a data team in the first place, but they don't have to all come from the data team.
I have a sense this is directed at those early in their data journey, or where a data culture is still low or still being built. What the data team focuses on shifts based on the context of your individual company and where your needs are, which will change over time. When people can't answer the easy questions, they usually value the basics. As they mature, they can answer those on their own, quickly. And they want you to help answer the harder questions, to make them easy again :)
Investment happens to drive insights and to ensure the company is making the best decisions more often. The more you can connect your data team to the strategy, with someone (or a few people) from the team having frequent discussions with the CEO/CFO/CMO, the more valued the whole team will be.
The maturity question is interesting. I agree that it *seems* like this changes a lot as organizations mature, though I'm not sure it actually does. I think there's a sense that as organizations get more mature, the basic reporting gets more solidified and there's more "insight discovery" type work. The question I'd have with that, though, is whether that data team insight discovery work is actually all that valuable. I'm not sure that it is; plus, if it is, it seems like it'd be much more valuable for less mature companies, when a lot less about the business is known.
I agree with the prescription, but I'm not entirely sure about the last solution regarding "praise and promotion" given by the executive team to the data person.
A more effective alternative, I suppose, would be to encourage the data person to go to the market with the unique "insights" that were rejected by the team, and prove there's value for them in the market.
Both the executive and the person who found the insight would then be following customer value. By punishing a "company" that just sits on its golden egg, we could force the company to listen more, and hence act more, because of it.
Yeah, I certainly don't think that we should punish people for looking for "insights," or not celebrate those things. They can be really valuable. But I think we should give the same rewards (be it social, financial, whatever) for people who do the more mundane seeming work of just reporting on what's happening.
Like, right now, to be on a data team, there's nothing good or fun about the reporting work. It feels like drudgery, and is treated like drudgery. My argument is that we should just treat it better.
People don't want "actionable insights"; they want "actions" (by insight or not). I also argue they want "measurably correct actions", which of course requires the evaluation of one's actions.
In short, people want to see a graph going up and to the right, with you pointing at it saying "I caused this."
I write about this problem of last-mile analytics more here: https://alexpetralia.com/2023/01/19/working-with-data-from-start-to-finish/
Ouch! We are currently revising our mission statement, and had just settled on one using the word “insight” :facepalm:
But I actually live-tweeted your article into our internal Slack, so maybe you’ll give us the, um, “insight” on how to do better...
You'll notice I didn't offer any alternatives though, because...I have no idea what would be a better alternative.
It depends on what your company values and where you are in your analytics maturity. If you are later in maturity, I hope you keep "insight."
Great article!
Ben, I may have to stop contributing to Diginomica. You're killing me. Well done. https://diginomica.com/author/neil-raden nraden@hiredbrains.com
Wait, I'm confused! I don't think I disagree with any of the posts of yours I clicked through!