20 Comments
Feb 3, 2023 · Liked by Benn Stancil

Ouch! We are currently revising our mission statement, and had just settled on one using the word “insight” :facepalm:

But I actually live-tweeted your article into our internal Slack, so maybe you’ll give us the, um, “insight” on how to do better...

Feb 3, 2023 · Liked by Benn Stancil

Great article!

When you think about it, it's not too different from roads, bridges, electricity, trash collection, etc. Those things are far more valuable than other, more "interesting" things in our lives, but we rarely stop to think about how awesome and important they are.

I agree that people want to see pretty basic numbers to answer the question "what is going on?" But often that is a super complex question to answer, as you noted above. It takes serious investment to keep those answers clear and correct. Think about how much money public companies spend on their financial reporting. They have to answer pretty basic questions like how much money they make, where that money comes from, and what they spend it on. The answers have to be clear and correct because it's the law.

I'd also add that while the "ah-ha" moments and "insights" do happen, they will happen more often if there is significant investment in making sure the fundamental boring questions are answered correctly.

Mar 18, 2023 · Liked by Benn Stancil

Benn, I may have to stop contributing to Diginomica. You're killing me. Well done. https://diginomica.com/author/neil-raden nraden@hiredbrains.com

Feb 8, 2023 · Liked by Benn Stancil

As a data scientist at a large and fairly old company, I've seen that the data reporting team is much larger and more respected than the data science team. They build the reports that business stakeholders look at every day and make decisions from. Most data scientists would dream of having that much influence and "impact." But backwards thinking leads us to see the more well-established data tools as less innovative and therefore less valuable.

The most common insight I get from data is that it isn't as good or useful as I thought. As one dumb example, I checked a couple columns for missing data and found none. Great! But then I dug one layer deeper and found that 30% of the values were the empty string "". It can be rough being the first person to really look at data that everyone believes is a goldmine.
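For anyone curious what that "one layer deeper" check looks like in practice, here is a minimal sketch using pandas; the "email" column and its values are hypothetical, invented purely for illustration rather than taken from the example above.

```python
# Minimal sketch: a column can pass a null check and still be mostly blank.
# The "email" column and its contents are hypothetical.
import pandas as pd

df = pd.DataFrame({"email": ["a@example.com", "", "", "b@example.com"]})

# Naive check: no NaNs, so the column looks complete.
print(df["email"].isna().sum())            # prints 0

# One layer deeper: count empty (or whitespace-only) strings.
blank_share = (df["email"].str.strip() == "").mean()
print(blank_share)                         # prints 0.5 -- half the values are blank
```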

Feb 4, 2023 · Liked by Benn Stancil

As a fellow data practitioner turning into a leader, I wholeheartedly agree that all the hard work that goes into establishing accurate and resilient reporting needs to receive more praise from its sponsors. Part of the problem is that data teams themselves often see this as table stakes and opt for sexier things to pitch as their value prop for the business. However, reliable reporting gives that analytics maturity pyramid a solid foundation that enables building more advanced analyses, including insights, predictions, and perhaps prescriptive actions (one day we'll trust them?).

If a data team is candid with itself about where it sits on that pyramid, it should have a much better shot at adjusting its priorities and delivering the right message to its clients. I'd be happy to see every part of the climb up the analytics maturity ladder receive enough attention across the org, but it does require work on both ends.

Those actionable/proactive/(3rd buzzword) insights, sales forecasts, and health scores are great, so long as they are not built on the quicksand of faulty data and reporting.

Feb 4, 2023 · Liked by Benn Stancil

Twyman's law: Any statistic that appears interesting is almost certainly a mistake.

I feel like a lot of data work is a random search for surprises, performatively using the most complicated tools available, in an organization that isn't prepared to evaluate them once found.

Lately I've been reading Kohavi's book on experimentation, which is actually a deep dive into how hard it is to make data-driven decisions. There's a whole hierarchy of evidence and different types of metrics. The org has to be prepared to discard projects that don't test out, for instance, and most Aha! moments have to be followed by months of experimentation and analysis.

What you're probably talking about with reporting is guardrail metrics, which mainly tell us whether things are working properly. My 90/10 rule is that 90% of a data system is baseline reports like this, to keep things on track for the 10% that is actionable insights or just higher-level analysis like ML.
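To make the "projects that don't test out" point concrete, here is a minimal sketch of that ship-or-discard decision on a simulated A/B experiment; the conversion rates, sample sizes, and 0.05 threshold are invented for illustration and aren't taken from Kohavi's book.

```python
# Minimal sketch: decide whether an "aha" feature tested out in an A/B experiment.
# All numbers here are simulated and purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.binomial(1, 0.10, size=5000)     # baseline conversion ~10%
treatment = rng.binomial(1, 0.102, size=5000)  # tiny (maybe nonexistent) lift

_, p_value = stats.ttest_ind(treatment, control)
if p_value < 0.05:
    print(f"Lift looks real (p={p_value:.3f}); keep investing.")
else:
    print(f"Didn't test out (p={p_value:.3f}); discard or iterate.")
```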

Feb 3, 2023 · Liked by Benn Stancil

Cheers to the data team members making insights possible! They are highly valued everywhere I've been. Insights & better decisions are why you have a data team in the first place, but they don't have to all come from the data team.

I have a sense this is directed at those early in their data journey, or at places where a data culture is still nascent or being built. What the data team focuses on shifts based on your individual company's context and where your needs are, and that will shift over time. When people can't answer the easy questions, they usually value the basics. As they mature, they can answer those on their own, quickly, and want you to help answer the harder questions, to make them easy again :)

Investment happens to drive insights and to ensure the company is making the best decisions more often. The more you can connect your data team to the strategy, and the more someone (or several someones) from the team has frequent discussions with the CEO/CFO/CMO, the more valued the whole team will be.


I agree with the prescription, but I'm not entirely sure about the last solution, the "praise and promotion" that the executive team gives to the data person.

A more effective alternative, I suppose, would be to encourage data people to take the unique "insights" that were rejected by their team to the market, and prove there's value for them there.

Both the executives and the person who found the insight would then be following customer value. By punishing companies that just sit on their golden egg, we could force them to listen more, and hence act more.


People don't want "actionable insights"; they want "actions" (insight-driven or not). I'd also argue they want "measurably correct actions", which of course requires evaluating one's actions.

In short, people want to see a graph going up and to the right, with you pointing at it saying "I caused this."

I write about this problem of last-mile analytics more here: https://alexpetralia.com/2023/01/19/working-with-data-from-start-to-finish/
