Jan 20 · Liked by Benn Stancil

I think the problem is hiding in the idea of an outsourced ‘data team’, separate and independent from ‘the business’. Businesses with a lot of data have for decades had technology teams managing data and tech stacks, and trusted advisors and analysts inside their own teams. This model worked well (if variably, as it often depended on an individual) and depended on the tech stack not changing too much (perhaps upgrading the version of Oracle). But these advisors built trust through their relationships with key business figures. They didn’t use a ticketing system, because they knew what was important to their leaders. Their leaders held the information in high regard, because it was delivered by someone who they felt truly understood their problem (and who was probably regularly slipped into different projects outside their area of expertise, e.g. business requirements for rewards programs, or delivering a new market segmentation, including staffing sales channels).

We used to deliver customised insights to help an individual make a decision, because we knew how they made decisions. Now we use best practices, design frameworks, and the latest shiny tools (where we lose a whole bunch of our time and effort) to deliver insights for a “business”.

The big consulting firms understand this, and while they have some great data teams, results from those are often repackaged in PowerPoint for the senior leaders that hired them. They build relationships with individuals.

Jan 21 · Liked by Benn Stancil

Thought-provoking. Thanks for writing this.

From what I can tell, there are some priors you're bringing that you're not being explicit about. It might be useful to see if we agree on what those are before drawing conclusions about the overall post.

1. I think you're largely talking about the concept of a "data team" at a digital native business. The entire conversation about "what is the appropriate role of / interface to the 'data team'?" isn't really being had inside of most enterprises. The enterprises that I talk to have a very well-understood org structure (understood inside the organization, anyway) and I don't perceive any lack of clarity around who is "buying" their services and why. There is no lack of PMF for data in the enterprises I talk to; rather, it's the opposite. That's not to say that enterprise data is some shining beacon on a hill (they have their own problems for sure)...they just don't experience this particular problem in this particular way.

2. I think you're really talking about the "analysis / insights / strategy" function of a data team, not the "pipelining / modeling" function of a data team. I don't think you're making the case that companies don't care about _data_, rather, that companies aren't buying the "data team as strategic partner" model.

If those two things are true, then I agree with you. If not, we can fight about that in some other forum :)

And if those two things are true, then it sharpens the question. The question becomes (I think):

> Why should we expect data professionals, who by definition specialize in technical--as opposed to functional--skills, to know more about strategy in a given functional area than their peers who have actually built many-years-long careers inside of that functional area?

Sure, data people do and should build competencies in the functional areas that they are partnered with. But after spending 7 years as a marketer, I can tell you that it will be very unlikely that a data analyst (however skilled) is going to understand the marketing data they're looking at better than I do. Maybe if they're working in a sufficiently narrow and well-defined problem domain. But if you all of a sudden observe a drop-off in a particular conversion rate, the instincts that will get you closer to an answer _fast_ are marketing instincts, not data instincts. Having access to data to answer this question is critical (and so pipelines and modeling and metrics are critical), but the specific expertise required to diagnose and fix this problem is marketing expertise.

The hardest part about data in a digital native business is figuring out how to get two skillsets--functional and data (technical)--sharing a headspace. Long-lived embeds of data people on specific product teams are the model through which many of the most successful companies have learned to do this. But this team structure is unusual...perhaps because it's more expensive than most companies can afford. These companies do it because their scale and the value of their data more than justify it, and because the atomic group--the two-pizza engineering/product team--is very receptive to the addition of this person.

IMO there is not a single right answer to this, but there are wrong ones. If you're expecting a really small data team to support all of the functional areas of a business and to be able to offer "strategic insights," that's likely just not realistic unless you have some unusually exceptional data people or unless you focus them really narrowly. But IMO that's fine. That just means that data teams need to be really clear about their mandate given the particular context of their business. And analysis / insights / strategy may or may not be a part of it. And that's ok.

Jan 20 · Liked by Benn Stancil

I normally enjoy taking the left-hand path, but I'm going to break right here. Our customer base has an intense desire for "insights"; they can't articulate what these are or how they'll specifically help the business, but they believe they're out there. Data and analytics teams aren't failing these customers because we're not fulfilling a known need; plenty of ad-hoc SQL gets run, and plenty of dashboards are built. But we aren't consistently delivering "insights." To extend your metaphor, our product isn't ready for prime time.

I think our problem is less that we don't listen to our customers and more that we don't understand them, and that's a mutual issue. In much the same way that there's a great deal of intuition and minutiae in data work, other areas of the business are deceptively inaccessible. The gaps in understanding (data for the business, and the business for data teams) aren't small but significant. Like a rocket that can't quite hit escape velocity, a lot of data initiatives fall a little short of driving real business value. Almost only counts in horseshoes.

I'm going to choose to be optimistic that we can find a way to bridge the chasm. That between the business's hunger for insights and our hunger for relevance, we'll find a language that we can both speak and deliver on some of these promises.

Jan 30 · Liked by Benn Stancil

Reporting and analytics require understanding business objectives and the processes related to them, and creating data models that reflect those processes and objectives. Developing these models is typically an iterative process of mapping objectives and processes to source data, analyzing that data, and delivering it in a new structure that makes reporting and analytics possible, or at least more effective. Reports and analytics are designed to support business objectives by presenting data in a way that provides insight or information that can be acted upon.

So in reverse, a report or dashboard can be designed to support a specific business objective (profit, for example). The report containing revenue and expenses can be mapped back to specific processes like orders and the general ledger, and to the specific data created by those processes. That data is then extracted, engineered with relevant features, and delivered to a physical data model in the database using an ETL tool (e.g. dbt), where it is used by the report.
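As a rough sketch of that mapping (every table name, column, and figure here is invented purely for illustration, not taken from any real schema), the profit report's revenue and expenses can be traced back to order and ledger data:

```python
import sqlite3

# Hypothetical source tables standing in for the order system and the
# general ledger; the schemas and values are made up.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    CREATE TABLE general_ledger (entry_id INTEGER, account TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 500.0), (2, 300.0);
    INSERT INTO general_ledger VALUES (1, 'expense', 200.0), (2, 'expense', 150.0);
""")

# The "physical data model" the report reads from: revenue summarized from
# orders, expenses summarized from the ledger.
revenue, expenses = con.execute("""
    SELECT
        (SELECT SUM(amount) FROM orders) AS revenue,
        (SELECT SUM(amount) FROM general_ledger WHERE account = 'expense') AS expenses
""").fetchone()

profit = revenue - expenses
print(revenue, expenses, profit)  # 800.0 350.0 450.0
```

In a real pipeline the summarization would live in a dbt model rather than an inline query, but the idea is the same: the report's numbers are traceable, step by step, back to the processes that created them.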

This development cycle requires a mix of skills and expertise that crosses several technical disciplines, along with a deep understanding of the business, data analysis, and critical thinking. More often than not there are too many people on these teams, and their skills are too siloed for them to work effectively together. Meanwhile, they are looking for some rare gem of sophisticated tools and technical and statistical elegance that is not well understood, and they never find it.

The truth is that, very often, effective analytics is not as sexy as we want it to be. Effective analytics means we need to learn what drives the business, and the parts of the business we are responsible for, and this rarely requires sophisticated statistical models. What it requires is working with customers, both internal and external, to understand what they want to accomplish (assuming they know, or care). Once we understand the key drivers, we track the performance of those things and give them context so we can tell whether what we are looking at is good or bad. That could be historical comparison, or benchmark comparisons. From there we need to understand what we can do to influence those results. Rinse and repeat. Go back to the benchmarks. How are the good performers achieving their results? Hopefully that’s another metric on the dashboard.
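A minimal sketch of that tracking loop, assuming we already know the key drivers (the metric names, current values, and benchmark figures below are all invented for illustration):

```python
# Hypothetical driver metrics and benchmark values -- not real data.
drivers = {"conversion_rate": 0.034, "avg_order_value": 72.0}
benchmarks = {"conversion_rate": 0.030, "avg_order_value": 80.0}

def compare_to_benchmark(actuals, targets):
    """Give each metric context: how far above or below its benchmark is it?"""
    report = {}
    for metric, value in actuals.items():
        target = targets[metric]
        report[metric] = {
            "value": value,
            "benchmark": target,
            "delta_pct": round((value - target) / target * 100, 1),
        }
    return report

report = compare_to_benchmark(drivers, benchmarks)
print(report["conversion_rate"]["delta_pct"])  # 13.3  (above benchmark)
print(report["avg_order_value"]["delta_pct"])  # -10.0 (below benchmark)
```

Nothing statistically sophisticated is happening here, which is the point: the hard part is choosing the right drivers and benchmarks, not the arithmetic.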

All this, however, is hard work, tedious and doesn’t come from a tool. Tools make this more efficient, maybe even make this work possible, but they can’t replace the hard work of thinking and understanding the business, and doing the work of developing the analytics that can help improve the business.


Fantastic post! Love your writing style and the points are spot on. Consider me a new fan. :)

Jan 23 · Liked by Benn Stancil

The interesting thing to me is that the direction data analytics is moving towards - business outcomes and decisions - is what Finance journals have been talking about for decades (more if I'm including my Finance/Acct textbooks with 1st editions from the 50s). The data community has focused A LOT on technology, talked AT LENGTH about specific problems (like marketing attribution, et al), and in LIMITED CIRCLES have talked about the business (credit due to those who have).

If data people want to have a bigger impact, I think the answer is that you should spend more time with your Finance people. I'd even go so far as to say that with the conversation shifting toward decision support, those people should probably report directly to a CFO. All businesses are different, and they're run by different people with different skills, and as a result you have different ways they're organized - so no recommendation is universal. Still, in an overwhelming majority of businesses the CFO is the one being asked by the check-writers how resources are being utilized, whether the business is performing, how much it will grow, and what else can be done to accelerate it. The CEO is there too, but the CFO is the one who feels the pressure. If you can bring the CFO answers to aid in decisions on capital allocation anywhere within the business, you will be listened to and folded into the process of where decisions are made, which is often in small groups behind closed doors.

Jan 20 · Liked by Benn Stancil

Delightfully provocative, as always!

I’m going to one-up you on cynicism: I wonder if the very concept of “data team” is the problem. In fact, an oxymoron!

I’ve been thinking a lot about status roles, and in particular how my own status blindness is (correctly!) perceived as implicitly claiming I am high status enough to not have to worry about status.

The problem with data is that it is literally the antithesis of status. What if the real “crime” of data teams is trying to give data practitioners status and agency, without realizing that was a direct threat to those in power? And that old-school analysts fit well within the organization precisely *because* they had no status or agency -- but were fine with it.

What if our prototypical data teams worked well only because they served high-ego managers or status-blind engineers who didn’t feel threatened, which made them the exception rather than the rule?

What if the “right data product” is (as you’ve danced around earlier) actually a “service:”data teams as independent agencies acting as confidential partners advising clients -- rather than employees second-guessing their bosses?

A bitter pill if true, but I’d argue the likelihood I’m right is probably north of 40%. Still, the optimistic take is that we actually ARE right to push for taking data seriously, but we are currently stuck in an obsolete paradigm.

The hopeful take is that the modern data team is actually the forerunner of a new kind of human organization, as different from 1950s corporations as they were from the British East India Company. And that this new form will inevitably outcompete the old ones.

Here’s hoping we can stay solvent longer than the market stays irrational...
