24 Comments
Feb 10, 2023 · Liked by Benn Stancil

Benn, how did you break into my brain and steal the article I'm currently writing?

I think this is such an important insight about data, and one I've personally struggled with. The action our data inspires is as important as, and sometimes more important than, the truths and insights it reveals. We can sigh and say a dashboard is objectively worthless all we want, but if it makes people happy and focused, it's having a material impact.

Feb 20, 2023 · Liked by Benn Stancil

Great post, Benn. You definitely hit on some key points and issues around value-add, though there are areas where we might slightly disagree on terminology and characterization. As someone who has worked on both the quantitative and qualitative sides of analysis, I've come to appreciate the value of each tool set, and I think therein lie some of the issues you describe.

I think the main issue you are highlighting is that some data teams are too strict in their adherence to what the data says (both in input and output) and its perceived objectivity, while ignoring or discounting context, intuition, and opinion/judgment on the principle that they are some sort of *bias*. We forget that data isn't everything, and that there is value in qualitative analysis and analyst judgment: things like argumentation, pragmatism, intuition, and context. At times, some data teams may lean too heavily on their data tools and discount the value of qualitative analysis. I'd argue it's more "lean into the squishy stuff" than "lean into bias".

I don't think we lose our role as truth seekers. Rather, we should not lose sight of the fact that we are part of the rest of the company, aligned with the strategy and policy put forth by leadership. The data and analysis we produce should support that strategy or policy, while remaining honest about its performance.

Not everything can be explained via clean data or the results of a model; there is value in analytical judgment. Data teams develop a deep, intuitive understanding of the underlying subject matter from wading knee-deep in the data day in and day out, and they should lean into that more. There is deep and useful knowledge in the minds of the data team that can be synthesized qualitatively, with analytical judgment, in conjunction with leadership and company strategists.

In terms of Hinkie's strategy, I don't believe the data team should be "selling" it, per se, but rather providing analytical support to help decision-makers understand its progress. This requires being selective and using analytical judgment to determine what information is most relevant to share. We need to be analytical entities working in conjunction with decision-makers to provide them with the truth, or as close to it as possible, to ensure success.

While we may discount initial reports showing a decline in performance, we should still deliver them, with the analytical caveat that a decline is expected in the initial stages of the strategy and that it may take time before actual signals appear in the data.

Feb 14, 2023 · Liked by Benn Stancil

Benn, Great content, great subject. Some remarks:

"... corporate instruments for making money." This is a bit of a shortcut for: we are hired to deliver what managers ask for. That may just as well mean they want us to ensure continuity at the cost of making money (today), and data scientists can also work for non-commercial organisations.

Indeed, the laws of aerodynamics will keep a plane in the air, not faith. Case in point: one can invent a new way to harness the laws of aerodynamics, and implementing it will then require faith. In short, faith is present wherever the future is designed differently than the past. That is what salespeople thrive on.


I've rarely done an analysis where there were no qualifiers, or where the answer was so clear it needed no explanation. To me, most of the job for data teams is getting information into people's hands and, hopefully, helping them understand it. Generally you'll be confirming existing impressions, but sometimes you might show something they didn't expect.

Feb 11, 2023 · Liked by Benn Stancil

The real world has thousands of variables, and we can measure one at a time with A/B testing. We can confirm a few ideas, but creative work will always reach beyond, and intuition should not be blocked by analytics.

Stats is like a handgun: a tool for settling petty arguments.

Feb 11, 2023 · Liked by Benn Stancil

Always amazed how someone can completely dissect the nature of this profession and give it respect all in one essay. Also, it reminds me of an essay I wrote a few years back: https://segah.me/post/after-looking-at-data-of-80-tech-companies-what-have-i-learned-part-1/


For some reason, the line 'nor do they care that they don’t' in your footnote #4 bothers me tremendously. Not because I suspect it to be false, but maybe because it rings so true. And perhaps especially because it calls into question what folks' true motives are.

If most start with their 'preferred outcome' and work backwards into a data argument from there, how exactly do they arrive at that desired result *without* using data and logic? It would seem that, if their preferred outcome were aligned with what is best for the organization, there would be no need to presuppose one; they could instead determine it from the best data and logic we have to bear on the question.

Your argument paints people as so Machiavellian and self-serving. Is that really how you think most people in power operate? I hate to begrudgingly agree, but after over 20 years working in organizations of all shapes and sizes I struggle to locate any objective data or logic to refute it. :-(

Feb 10, 2023 · Liked by Benn Stancil

Brilliant as always. This is why I think "data science" is the wrong framing for the value we provide. I prefer the term "Decisioneering."

Though it IS important to distinguish between the accurate info we need BEFORE a decision and the actionable data we need AFTER...


One thing I always tell academics when I interview them or discuss industry jobs is that we're not in the business of seeking out Truth with the big T, just the small t, because we just have to provide arguments and viewpoints that will lead a group of decision-makers to make better decisions. That usually means a reduced level of rigor, but it also implies we do other weird things. If that means keeping people sticking to the plan, and contributing analysis to making The Plan work out in the end, that's how it rolls. There are plenty of ways to do that without actively misleading people or doing anything particularly sketchy (which is what people sometimes think reduced rigor implies).
