24 Comments
Feb 10, 2023 · Liked by Benn Stancil

Benn, how did you break into my brain and steal the article I'm currently writing?

I think this is such an important insight about data, and one I've personally struggled with. The action our data inspires is as important as, and sometimes more important than, the truths and insights it reveals. We can sigh and say a dashboard is objectively worthless all we want, but if it makes people happy and focused, it's having a material impact.

author

Ooh, in that case, I look forward to reading a much better version of this one.

And yeah, that's why ultimately the cynic in me feels like we do it for self-serving (or I guess self-soothing is probably more accurate) reasons. We could have so much positive impact, and in a way that's not "lying" - it's what basically every other function does - but we really recoil at it. It seems like such a missed opportunity, if only we'd take on a slightly different character.

Feb 20, 2023 · Liked by Benn Stancil

Great post, Benn; it definitely hits on some key points and issues with value-add. There may be some areas where we slightly disagree in terms of terminology and characterization. As someone who has worked on both the quantitative and qualitative sides of analysis, I've come to appreciate the value of each tool set, and I think that's where some of the issues you describe lie.

I think the main issue you are highlighting is that some data teams are too strict in their adherence to what the data says (both in input and output) and to perceived objectivity, while ignoring or discounting context, intuition, and opinion/judgment on the grounds that they are some sort of *bias*. We forget that data isn't everything, and there is value in qualitative analysis and analyst judgment: things like argumentation, pragmatism, intuition, and context. At times, some data teams may lean too heavily on their data tools and discount the value of qualitative analysis. I'd argue it's more "lean into the squishy stuff" than "lean into bias".

I don't think we lose our role as truth seekers, but we should not lose sight of the fact that we are part of the rest of the company, aligned with the strategy and policy put forth by leadership. The data and analysis we produce should support that strategy or policy, while always being honest about its performance.

Not everything can be explained via clean data or the results of a model, but there is value in analytical judgment. Data teams develop a deep intuitive understanding of the underlying subject matter that comes from wading knee-deep in the data day in and day out, and they should lean into that more. There is deep and useful knowledge in the minds of the data team that can be synthesized qualitatively, with analytical judgment, in conjunction with leadership and company strategists.

In terms of Hinkie's strategy, I don't believe the data team should be "selling" it, per se, but rather providing analytical support to help decision-makers understand its progress. This requires being selective and using analytical judgment to determine what information is most relevant to share. We need to be analytical entities working in conjunction with decision-makers to provide them with the truth, or as close to it as possible, to ensure success.

While we may discount initial reports showing a decline in performance, we should still provide decision-makers with the analytical caveat that this is expected in the initial stages of the strategy, and that it may take time before we start seeing actual signals in the data.

author

Thanks for the thoughts. And yeah, I think it's probably fair to say that most of this is about "leaning into the squishy stuff," though I think some people can still take that to mean things like "write good stories around your analysis" and the like. Which isn't to say that isn't important, but it sorta dodges the question of whether or not the goal should be complete objectivity.

On the question of selling Hinkie's strategy, I'd generally agree that, to Hinkie, you should be objective and all of that. But there are two big questions there to me:

1. Does that mean you should be objective to everyone else? That seems like a much harder argument to make. This already happens for lots of people anyway; execs make the case for some strategy and sell it all the time, so it's not like data teams would be doing anything substantively different there.

2. A messier question - should data teams actually be objective to Hinkie? On one hand, it's his job to make decisions, and he needs to see all the facts. On the other hand, the bridges have been burned, and Hinkie needs to be committed to the strategy as much as anyone else. Like, I think that there are probably ethical issues with "selling" him on his own strategy to keep him committed to it, but I'm not that convinced that the outcome wouldn't actually be better that way.

Feb 14, 2023 · Liked by Benn Stancil

Benn, great content, great subject. Some remarks:

"... corporate instruments for making money" is a bit of a shortcut for: we are hired to deliver what managers ask. That may just as well mean they want us to ensure continuity at the cost of making money (today), and data scientists can also be working for non-commercial organisations.

Indeed, the laws of aerodynamics will keep a plane in the air, not faith. Case in point: one can invent a new way to harness the laws of aerodynamics, and implementing it will require faith. In short, faith comes in wherever the future is designed differently than the past. That is what salespeople thrive on.


I've rarely done an analysis where there were no qualifiers, or where the answer was so clear it needed no explanation. To me, most of the job for data teams is getting information into people's hands and hopefully helping them understand it. Generally you'll be confirming existing impressions but sometimes you might show something they didn't expect.

author

The qualifiers thing is interesting to me, because that's where this whole issue gets tricky. On one hand, explaining those qualifiers and making sure people see the uncertainty and doubt in what we're doing seems right, both logically and morally. On the other hand, those qualifiers are exactly the kinds of things that put cracks in people's commitment, and can make success less likely.

Feb 11, 2023 · Liked by Benn Stancil

The real world has thousands of variables, and we can measure one at a time with A/B testing. We can confirm a few ideas, but creative work will always reach beyond, and intuition should not be blocked by analytics.

Stats is like a handgun: a tool for settling petty arguments.

author

I like that analogy, because it also suggests another thing about the power dynamic it creates. Being data driven or whatever isn't really a great equalizer; it just favors people who wield it well. (It's similar to the point in this video: https://www.tiktok.com/@bookbearexpress/video/7196810079386029317)

Feb 11, 2023 · Liked by Benn Stancil

Always amazed how someone can completely destroy the nature of this profession and give it respect, all in one essay. It also reminds me of an essay I wrote a few years back: https://segah.me/post/after-looking-at-data-of-80-tech-companies-what-have-i-learned-part-1/

author

There's something kind of sinister in this point: "The problem is that data is often just an excuse to transfer decision making from those who are not as capable of interpreting data."

It's still a mushy idea for me, but I suspect data has facilitated a kind of long-term organizational capture by the quants, where we've made the language of data the language that wins arguments (and conversely, the language of emotion, etc., the language that loses them). And I'm not sure if that makes us as well off as we might like to think.

Feb 13, 2023 · Liked by Benn Stancil

Good for the incomes of middle managers and ICs in quantitative roles; most certainly negative for executives, founders, and investors. The best and most accurate arguments should win, not the most quantitative ones. To hide behind sophistication is an old trick.

author

It's not just data though. You can hide behind plenty of debate-style sophistication too. Which is kind of a solution-less problem, since any type of argument can get gamed.


For some reason, the line 'nor do they care that they don’t' in your footnote #4 bothers me tremendously. Not because I suspect it to be false, but maybe because it rings so true. And perhaps especially because it calls into question what folks' true motives are.

If most start with their 'preferred outcome' and work backwards into a data argument from there, how exactly do they arrive at that desired result *without* using data and logic? It would seem that, if their preferred outcome were aligned with what is best for the organization, there would be no need to presuppose one -- they could instead determine it from the best data and logic we can bring to bear on the question.

Your argument paints people as so Machiavellian and self-serving. Is that really how you think most people in power operate? I hate to begrudgingly agree, but after over 20 years working in organizations of all shapes and sizes I struggle to locate any objective data or logic to refute it. :-(

author

Ahh, so your comment and another one from Twitter are helping me put this together a bit better. I think we often apply a Machiavellian connotation to not using data and working backwards from a desired outcome, as though there's an honest way of working, and then there's being political, self-serving, and so on.

But maybe their cherry-picking and narrative crafting isn't about being self-serving, but is actually them doing what will make the desired outcome most likely? Sure, other people have incentives to put their fingers on the scale, and get frustrated with us when we're like, you're wrong, here's some numerical nuance. But couldn't they have the same reaction to us? They aren't dumb or evil; they want to nudge the scale for a reason. If the goal is to make the company successful, maybe *we're* wrong - and being dogmatic for our own benefit - by *not* doing it.

(And true, in some theoretical world, their desires and the data should line up. I don't think that's that realistic though. A lot of "business" is probably more art than anything, and people's unquantified instincts are just as good as what the data says, just like how a director's instincts will often make a better movie than the data team at Netflix.)


Appreciate your thoughtful reply, Benn. I'm willing to agree that there are likely limits to the power of logic and data (even when it's good) in helping us make the best decisions. However, I find it quite disingenuous that someone would obfuscate a deeply-held conviction or instinct by wrapping it inside some artful "data" language. But, then again, maybe they feel they have no choice, given our "data"-obsessed business culture -- or perhaps that is your point? ;-)

author

Hmmm, so that's an interesting question. I think people have a mostly bad habit of not telling people how they feel about stuff, and instead trying to justify it as though it's entirely logical and rational. Most of the time, I suspect we'd actually be better off if we just said "I want this thing, and here are some arguments for it." So it's wrapped in "data" language, but not in a deceitful way.

The counterargument to that is that the rational arguments might not be as persuasive if people know you want something. But, having said that, I think the opposite might be true - everyone assumes we all do this anyway, so exposing your bias might not hurt.


... or, even better, maybe we could work on reducing that self-serving bias a bit, so we can be transparent about what we want and why -- and we can figure out the best way forward for the organization *together* -- because it truly won't just benefit ourselves :-)

(i know, I'm dreaming here... but, as Lennon said... "imagine.")

Feb 14, 2023 · edited Feb 14, 2023 · Liked by Benn Stancil

I recommend the arduous work of the historian, who (maybe more akin to the data scientist than the latter wants to accept) is marred by incomplete data and trying hard to stay away from bias, knowing that is virtually impossible. The toolkit used is old (almost two centuries) and seasoned: https://en.wikipedia.org/wiki/Source_criticism

Feb 10, 2023 · Liked by Benn Stancil

Brilliant as always. This is why I think “data science” is the wrong framing to use for the value we provide. I prefer the term “Decisioneering.”

Though it IS important to distinguish between the accurate info we need BEFORE a decision and the actionable data we need AFTER...

author

Yeah, that's why the emphasis on decisions feels off to me. I suspect that the decision we make is a much smaller part of what makes something successful than we assume.


One thing I always tell academics when I interview or discuss industry jobs is that we're not in the business of seeking out Truth with the big T, just the small t. We just have to provide arguments and viewpoints that will lead a group of decision makers to make better decisions. That usually means a reduced level of rigor, but it also implies we do other weird things. If that means doing stuff to keep people sticking to the plan, and contributing analysis to making The Plan work out in the end, that's how it rolls. There are plenty of ways to do that without actively misleading people or doing anything particularly sketchy (which is what people sometimes think reduced rigor implies).

author

Yeah, I think that last point is what's often misrepresented. Using data for anything other than clear-eyed fact finding often has this connotation of manipulation and dishonesty, and I don't think that's at all necessary. There's nothing capital W Wrong about using data to advance a point, in the same way there's nothing Wrong about using persuasive language to do it. (And as I type that, I like this analogy even more, because we also realize that, with language, there's no absolute ground truth in what we say. We have to editorialize, even if that editorial choice is to say things in as plain a language as we can. Sharing analysis is the same - there's no exact ground truth, it's full of editorializing, and there's a lot of space between persuasion and lying.)

Feb 13, 2023 · Liked by Benn Stancil

"there's no exact ground truth, it's full of editorializing, and there's a lot of space between persuasion and lying." - And I think this is why we like to throw around terms such as "data storytelling" and "narrative" as those lend themselves to something more fluid, and less demanding of rigor as Randy mentions.
