17 Comments
Aug 19, 2022 · Liked by Benn Stancil

In some ways what you are asking is not really about analytics, but about how to have a fact-driven / truth-seeking culture. Ray Dalio would have a field day with you on this.

The best organizations (Google, Amazon) supposedly had this early on.

Science-focused universities used to have this.

The easy answer is that this should come from leadership. But I don't think that's enough.

I think that to have a truth-seeking culture, you also need to be exclusionary and hire, for example, the kind of salespeople who prefer Truth over Fiction, regardless of how that truth emerges - even if it comes through the process of discovering errors.

Aug 19, 2022 · Liked by Benn Stancil

As usual, Benn asks the hard questions most of us would prefer not to think about…

Hellz Yeah Benn! 🤘

PS- I love rounded numbers; precision down to the last penny is nonsense! Round it to the nearest thousand for me. 👍
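A tiny sketch of what that kind of reporting could look like (the helper name and figures below are made up for illustration):

```python
# Round reported figures to the nearest thousand instead of the last penny.
def to_nearest_thousand(value: float) -> int:
    return int(round(value, -3))

revenue = 4_238_617.42
print(f"${revenue:,.2f}")                    # $4,238,617.42 -- false precision
print(f"${to_nearest_thousand(revenue):,}")  # $4,239,000    -- close enough
```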

Aug 22, 2022 · Liked by Benn Stancil

I wonder if Substack was previously counting fake email opens from Apple's MPP (Mail Privacy Protection).

Actually, that's something else that's potentially interesting: the email platform industry as a whole likes to report on "opens" as if they're some sort of objective truth, but the more you speak to people who understand how those systems work, the more quickly you realise how deeply flawed those metrics are. Those flaws are frequently either not communicated at all, or communicated incredibly poorly, to customers who've been taught that this number somehow matters.

How do you start getting folks to stop looking at vanity metrics when they've been taught to look at them for years? 🤔
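To make that concrete: an "open" is usually just a tracking-pixel fetch, and Apple's Mail Privacy Protection prefetches those pixels whether or not a human ever reads the email. The sketch below shows how a naive open rate counts those machine fetches; everything in it (field names, the user-agent heuristic, the numbers) is illustrative, not any platform's real API.

```python
from dataclasses import dataclass

@dataclass
class OpenEvent:
    recipient: str
    user_agent: str

# Heuristic only: MPP prefetches typically arrive through Apple's proxy with a
# generic Apple WebKit user agent; real detection is messier than this.
APPLE_PROXY_HINT = "AppleWebKit/605.1.15 (KHTML, like Gecko)"

def naive_open_rate(events: list[OpenEvent], delivered: int) -> float:
    # What most dashboards report: every pixel fetch counts as an "open".
    return len({e.recipient for e in events}) / delivered

def filtered_open_rate(events: list[OpenEvent], delivered: int) -> float:
    # Discard likely machine opens before computing the rate.
    humans = {e.recipient for e in events if APPLE_PROXY_HINT not in e.user_agent}
    return len(humans) / delivered

events = [
    OpenEvent("a@example.com",
              "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko)"),
    OpenEvent("b@example.com",
              "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/115.0 Safari/537.36"),
]
print(naive_open_rate(events, delivered=10))     # 0.2 -- counts the prefetch
print(filtered_open_rate(events, delivered=10))  # 0.1 -- only the likely-human open
```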

Aug 22, 2022 · Liked by Benn Stancil

+1 for data = confidence game. Nevertheless, the stickiness issue has more to do with our tendency to hide behind data (or so-called experts) instead of acknowledging uncertainty.

In the 2032 version, we can do a few things when the numbers turn out to be wrong: a) comment on the impact of decisions already made on the basis of that data (if we can ever measure it), and recalculate the current and future ones; b) check whether the corrected data confirms or rejects our biases/learnings; c) comment on the revised targets or measurements that need to be made on the new, corrected base.

Sometimes the targets are literally basis points (50 bps for an acquisition marketing project in the mobile channel), which is why I think estimates won't do.
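A rough back-of-the-envelope sketch of that tension (all figures invented):

```python
# When the target is a 50 bps lift, a number that's only trusted to the nearest
# few thousand can't even resolve the change you're being measured on.
sessions = 500_000
baseline_conversions = 10_000          # a 2.00% conversion rate
target_lift_bps = 50                   # the goal: +0.50 percentage points

target_delta = sessions * target_lift_bps / 10_000
print(target_delta)                    # 2500.0 conversions -- the entire target

# If the pipeline routinely restates conversions by a few thousand (late events,
# dedupe fixes, corrected joins), "did we hit 50 bps?" is unanswerable.
typical_restatement = 3_000
print(typical_restatement > target_delta)  # True
```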

Accountability for opinions based on data may be how future CDOs or directors are measured. I think we'll have chief analytics officers by then. Currently, the blame-finding is a mess across modern-data-stack roles.

You’re a really good writer.

Aug 20, 2022 · edited Aug 20, 2022 · Liked by Benn Stancil

At my company (and my previous two companies), I've been the annoying person in the room demanding that we define our metrics the same way across the organization, that we have a data dictionary (or, if we have one, that it's up to date), and that we require our clients to provide us THEIR definitions of their fields. I've been known to spend days tracking down the reason for a mistake. I've been reprimanded for spending too much time on data prep and exploration. People hate it. They respond in a professional manner, but they don't really want to talk about it. It's like everyone wants to pretend the numbers are right until we get caught with them being wrong. I like your rounding idea, and I also want to help stakeholders align their expectations with reality and educate them about what real data is really like.
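For what it's worth, a minimal sketch of the kind of shared definition being argued for here; the field names and the example metric are illustrative inventions, not any standard:

```python
# One agreed-upon entry per metric, versioned alongside the code that computes it.
METRIC_DEFINITIONS = {
    "active_customer": {
        "owner": "analytics",
        "definition": "A customer with at least one paid order in the trailing 90 days.",
        "source_table": "warehouse.orders",
        "filters": ["status = 'paid'", "order_date >= current_date - 90"],
        "last_reviewed": "2022-08-01",
    },
}

def describe(metric: str) -> str:
    """Return the agreed definition, or fail loudly instead of letting people guess."""
    entry = METRIC_DEFINITIONS.get(metric)
    if entry is None:
        raise KeyError(f"No shared definition for '{metric}' -- add one before reporting it.")
    return f"{metric}: {entry['definition']} (source: {entry['source_table']})"

print(describe("active_customer"))
```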
