17 Comments

I think the problem is hiding in the idea of an outsourced ‘data team’, separate and independent from ‘the business’. Businesses with a lot of data have for decades had technology teams managing data and tech stacks, and trusted advisors and analysts within their own teams. This model worked well (though variably, as it often depended on an individual) and depended on the tech stack not changing too much (perhaps upgrading the version of Oracle). But these advisors built trust through their relationships with key business figures. They didn’t use a ticketing system, because they knew what was important to their leaders. Their leaders held the information in high regard, because it was delivered by someone who they felt truly understood their problem (and who was probably regularly slipped into different projects outside their area of expertise, e.g. business requirements for rewards programs, or delivering a new market segmentation including staffing sales channels).

We used to deliver customised insights to help an individual make a decision, because we knew how they made decisions. Now we use best practices, design frameworks, and the latest shiny tools (where we lose a whole bunch of our time and effort) to deliver insights for a “business”.

The big consulting firms understand this, and while they have some great data teams, results from those are often repackaged in PowerPoint for the senior leaders that hired them. They build relationships with individuals.


I've got a draft somewhere that asks why Excel has been so sticky, but this makes me think that's actually the wrong question. What we should be asking is: why is PowerPoint (/Google Slides) still around?

I think there's something interesting in the fact that, as much as we want to do fancy stuff and build data apps and all that, decisions still get made through slides. A simple, readable story is far more powerful than anything else we can make. Though I think we have some intuition that that's the case, the durability of PowerPoint — *PowerPoint!* — really makes me feel it.


That’s a powerful insight. Data outside of a trusted relationship is noise.


Thought-provoking. Thanks for writing this.

From what I can tell, there are some priors you're bringing that you're not being explicit about. It might be useful to see if we agree on what those are before drawing conclusions about the overall post.

1. I think you're largely talking about the concept of a "data team" at a digital native business. The entire conversation about "what is the appropriate role of / interface to the 'data team'?" isn't really being had inside of most enterprises. The enterprises that I talk to have a very well-understood org structure (understood inside the organization, anyway) and I don't perceive any lack of clarity around who is "buying" their services and why. There is no lack of PMF for data in the enterprises I talk to; rather, it's the opposite. That's not to say that enterprise data is some shining beacon on a hill (they have their own problems for sure)...they just don't experience this particular problem in this particular way.

2. I think you're really talking about the "analysis / insights / strategy" function of a data team, not the "pipelining / modeling" function of a data team. I don't think you're making the case that companies don't care about _data_, rather, that companies aren't buying the "data team as strategic partner" model.

If those two things are true, then I agree with you. If not, we can fight about that in some other forum :)

And if those two things are true, then it sharpens the question. The question becomes (I think):

> Why should we expect data professionals, who by definition specialize in technical--as opposed to functional--skills, to know more about strategy in a given functional area than their peers who have actually built many-years-long careers inside of that functional area?

Sure, data people do and should build competencies in the functional areas they are partnered with. But after spending 7 years as a marketer, I can tell you that it will be very unlikely that a data analyst (however skilled) is going to understand the marketing data they're looking at better than I do. Maybe if they're working in a sufficiently narrow and well-defined problem domain. But if you all of a sudden observe a drop-off in a particular conversion rate, the instincts that will get you closer to an answer _fast_ are marketing instincts, not data instincts. Having access to data to answer this question is critical (and so pipelines and modeling and metrics are critical), but the specific expertise required to diagnose and fix this problem is marketing expertise.

The hardest part about data in a digital native business is figuring out how to get two skillsets--functional and data (technical)--sharing a headspace. Long-lived embeds of data people on specific product teams are the way that many of the most successful companies have learned to do this. But this team structure is unusual...perhaps because it's more expensive than most companies can afford. These companies do it because their scale and the value of their data more than justify it, and because the atomic group--the two-pizza engineering/product team--is very receptive to the addition of this person.

IMO there is not a single right answer to this, but there are wrong ones. If you're expecting a really small data team to support all of the functional areas of a business and to be able to offer "strategic insights", that's likely just not a real thing unless you have some unusually exceptional data people or you focus them really narrowly. But IMO that's fine. That just means that data teams need to be really clear about their mandate given the particular context of their business. And analysis / insights / strategy may or may not be a part of it. And that's ok.


On the two assumptions, I think that's mostly right, though I don't think that narrows the question quite as much as you're suggesting. Specifically, what are those data pipelines for? Part of it is to feed the "analysis / insights / strategy" set, so if the latter's not valuable, the former wouldn't (entirely) be either.

To split it another way, would it be fair to say that we can divide data teams into the "analysis" half and the "reporting / operational functions (eg, email the right people a promotion)" half? The latter half has pretty clear value, which 1) is what enterprises focus on, and 2) still needs data modeling.

I would agree with that, though it does raise a bit of a question for me about whether we overdo the modeling work to support what may be overvalued analytical work. (Ie, would analytics engineers do things differently if the only thing on the other end of their work was some dashboards?) But that's a question of degree rather than kind when it comes to data modeling.

To your sharper question, I think you're right - we can't expect them to be good at that. Which is where I think it gets interesting. Rather than casting it out as overvalued, is there a different way to, as you say, get data in the headspace of the operational folks?

I take your point that the only way to get there is the long-lived embedded model, though it's interesting that most companies (except enterprises, it seems) don't fully commit to that. There are a lot of specialized analysts who are still part of a data team, but they're more forward deployed than hired by the functional team itself. So maybe part of this is to go all out in that direction: have a centralized data / analytics engineering team, and any other data person just comes out of the headcount of the functional team itself.

But most of all, this just proves the Godwin's law of data: https://twitter.com/bennstancil/status/1426276715383468033


I normally enjoy taking the left-hand path, but I'm going to break right here. Our customer base has an intense desire for "insights"; they can't articulate what these are or how they'll specifically help the business, but they believe they're out there. Data and analytics teams aren't failing these customers because we're not fulfilling a known need; plenty of ad-hoc SQL gets run, and plenty of dashboards are built. But we aren't consistently delivering "insights." To extend your metaphor, our product isn't ready for prime time.

I think our problem is less that we don't listen to our customers and more that we don't understand them, and that's a mutual issue. In much the same way that there's a great deal of intuition and minutiae in data work, other areas of the business are deceptively inaccessible. The gaps in understanding (data for the business, and the business for data teams) aren't small; they're significant. Like a rocket that can't quite hit escape velocity, a lot of data initiatives fall a little short of driving real business value. Almost only counts in horseshoes.

I'm going to choose to be optimistic that we can find a way to bridge the chasm. That between the business's hunger for insights and our hunger for relevance, we'll find a language that we can both speak and deliver on some of these promises.


I don't disagree with that, but I think it raises the question of how accessible delivering insights actually is. If we could consistently do that, for sure, problem solved. But we can't now, and I'm not convinced that better tools, or processes, or whatever, meaningfully makes that easier.

Which is the trap, I think. We see the clear possibility, and keep hammering away at trying to make it affordable and accessible. But at some point, we gotta ask if we're actually gonna get there or not.


Reporting and analytics requires understanding business objectives and the processes related to them, and creating data models that reflect those processes and objectives. Developing these models is typically an iterative process of mapping objectives and processes to source data, analyzing the source data, and delivering the source data in a new structure that makes reporting and analytics possible, or more effective. Reports and analytics are designed to support business objectives by presenting data in a way that provides insight or information that can be acted upon.

So in reverse, a report or dashboard can be designed to support a specific business objective (profit, for example). The report containing revenue and expenses can be mapped back to specific processes like orders and the general ledger, and to the specific data created by those processes. That data is then extracted, engineered with relevant features, and delivered to a physical data model in the database using ETL (dbt), where it is used by the report.
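
(A rough sketch of that mapping, not the commenter's actual pipeline: the table and column names below are hypothetical, and in a dbt project the same shape would normally live in a SQL model rather than a script. Python/pandas just makes the mapping easy to see.)

```python
import pandas as pd

# Hypothetical source data created by the processes mentioned above:
# orders feed revenue, the general ledger feeds expenses.
orders = pd.DataFrame({
    "order_month": ["2023-01", "2023-01", "2023-02"],
    "order_total": [1200.0, 800.0, 1500.0],
})
general_ledger = pd.DataFrame({
    "posting_month": ["2023-01", "2023-02"],
    "account_type": ["expense", "expense"],
    "amount": [900.0, 1100.0],
})

# Map each source onto the concept the report needs.
revenue = (
    orders.groupby("order_month", as_index=False)["order_total"].sum()
    .rename(columns={"order_month": "month", "order_total": "revenue"})
)
expenses = (
    general_ledger[general_ledger["account_type"] == "expense"]
    .groupby("posting_month", as_index=False)["amount"].sum()
    .rename(columns={"posting_month": "month", "amount": "expenses"})
)

# The "physical data model" a profit report could sit on top of.
profit_model = revenue.merge(expenses, on="month", how="outer").fillna(0.0)
profit_model["profit"] = profit_model["revenue"] - profit_model["expenses"]
print(profit_model)
```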

This development cycle requires a mix of skills and expertise that crosses several technical disciplines, plus a deep understanding of the business, data analysis, and critical thinking. More often than not there are too many people on these teams, and their skills are too siloed to work effectively together. Meanwhile, they are looking for some rare gem of sophisticated tools and technical and statistical elegance that is not well understood, and they never find it.

The truth is that, more often than not, effective analytics is not as sexy as we want it to be. Effective analytics means we need to learn what drives the business and the parts of the business we are responsible for, and this rarely requires sophisticated statistical models. What it requires is working with customers, both internal and external, to understand what they want to accomplish (assuming they want to, or know, or care). Once we understand the key drivers, we track the performance of those things and give them context so we can tell whether what we are looking at is good or bad. That could be a historical comparison, or benchmark comparisons. From there we need to understand what we can do to influence those results. Rinse and repeat. Go back to the benchmarks. How are the good performers achieving their results? Hopefully that’s another metric on the dashboard.
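
(To make the "track the key drivers and give them context" loop concrete, here's a minimal sketch; the metric, numbers, and threshold are all hypothetical and only show the shape of the historical and benchmark comparisons.)

```python
# One key driver (say, a conversion rate), compared against our own history
# and an external benchmark. All figures are made up for illustration.
history = [0.031, 0.029, 0.033, 0.030]   # last four periods
benchmark = 0.035                        # e.g. an industry or top-performer figure
current = 0.027

historical_avg = sum(history) / len(history)
vs_history = (current - historical_avg) / historical_avg
vs_benchmark = (current - benchmark) / benchmark

print(f"Current: {current:.1%}")
print(f"vs. our own history: {vs_history:+.1%}")
print(f"vs. benchmark: {vs_benchmark:+.1%}")

# The "is this good or bad?" context: flag meaningful gaps so someone can go
# ask what the good performers are doing differently.
if vs_benchmark < -0.10:
    print("Well below benchmark: worth digging into how the good performers get there.")
```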

All this, however, is hard work, tedious and doesn’t come from a tool. Tools make this more efficient, maybe even make this work possible, but they can’t replace the hard work of thinking and understanding the business, and doing the work of developing the analytics that can help improve the business.


I think that gets at the crux of this, which is that a lot of data teams probably oversold their promise (maybe). It's popular to say that reporting is a stepping stone to bigger and better things, which...may not be true. Reporting may be the bigger and better thing that we can do, and for most of us, the "more impactful" work that we're trying to get to is actually considerably less useful than the thing we're trying to get away from.

That said, my big question about that is, is that because reporting is actually the best thing we can do, or is it because we got enamored with "insight" and stopped looking for other things to do instead? Sure, in the battle between reporting and insight, we may want insight to win, but it's really reporting that is better. But does that mean that we should just do reporting, or that we need to go find a better fighter than "insight?"


Fantastic post! Love your writing style and the points are spot on. Consider me a new fan. :)


Thanks, welcome! Glad you like it.


The interesting thing to me is that the direction data analytics is moving towards - business outcomes and decisions - is what Finance journals have been talking about for decades (more if I'm including my Finance/Acct textbooks with 1st editions from the 50s). The data community has focused A LOT on technology, talked AT LENGTH about specific problems (like marketing attribution, et al), and only in LIMITED CIRCLES has talked about the business (credit due to those who have).

If data people want to have a bigger impact, I think the answer is that you should spend more time with your Finance people. I'd even go so far as to say that with the conversation shifting toward decision support, those people should probably report directly to a CFO. All businesses are different, and they're run by different people with different skills, and as a result you have different ways they're organized - so no recommendation is universal. Still, in an overwhelming majority of businesses the CFO is the one being asked by the check-writers how resources are being utilized, whether the business is performing, how much it will grow, and what else can be done to accelerate it. The CEO is there too, but the CFO is the one who feels the pressure. If you can bring the CFO answers to aid in decisions on capital allocation anywhere within the business, you will be listened to and folded into the process of where decisions are made, which is often in small groups behind closed doors.


I'd half agree with that. I think there's real value to the CFO mindset, which I agree is pretty manically focused on business performance and whatnot. However, I think there's still a big problem with reporting to the CFO, because they still play a bit of a drive-by role in trying to solve problems. They view the business through a very particular lens, and can still sometimes operate as an external consultant (though in their case, more like an external PE overlord). I don't think that's necessarily a great way to do this, because it's very cost focused and it often lacks the full perspective of the business unit itself.

What I think would be better is data folks being more embedded inside of departments, where they have some of the tendencies of a CFO to really hammer on asking whether this business program is working. But they 1) do it more as teammates to that business than as auditors, and 2) are embedded enough to think about the problem as a marketing / sales / product etc. problem rather than as a finance problem within the product org.


I'm very supportive of functions having embedded analytics in their team. I'm advocating for another, central team of technical analytics folks, who are customer minded but also economically minded at the firm level. My key point of differentiation is that the team reports into the CFO, though, as that's the office a lot of high-level and impactful decisions are run through anyhow - which addresses the premise of the article, "how can analytics teams be more impactful". I'm raising it in the comments because one possible solution I see missing from the article's list is "org structure".

These folks should be dedicated to partnering with the business to solve their problems, always be available (not drive-by), and advise the CFO/Executive team in decision making. The team maintains an independence that's valuable, and by working with functional analysts and decision makers it strikes a balance between the self-motivated forces of functional analytics and proximity to the problem.

I mostly raise this because it's not something I see advocated for very frequently and I'd love to see it enter the conversation. I feel it's impeded though because Analytics=sexy, Finance=boring. Maybe analytics should be a little more boring.


Yeah, I've got no problem with boring. And I don't mind this org structure, provided that the team is a little bit sheltered from some of the usual ways that finance teams operate. That's not because it's boring; it's that finance tends to have a very particular frame for thinking about problems - very cost focused, often wants to tie every action directly to revenue, etc. Which is obviously important (and something that should be on everyone's mind at some level), but I think it's a bit of a narrow way to look at things. Some stuff is hard to measure, or can't always be tied to revenue (eg, investments in brand). So as long as data teams have a little flexibility to measure work like that without it always coming from a finance perspective, I don't mind data teams being under a CFO.


Delightfully provocative, as always!

I’m going to one-up you on cynicism: I wonder if the very concept of “data team” is the problem. In fact, an oxymoron!

I’ve been thinking a lot about status roles, and in particular how my own status blindness is (correctly!) perceived as implicitly claiming I am high status enough to not have to worry about status.

The problem with data is that it is literally the antithesis of status. What if the real “crime” of data teams is trying to give data practitioners status and agency, without realizing that was a direct threat to those in power? And that old-school analysts fit well within the organization precisely *because* they had no status or agency -- but were fine with it.

What if our prototypical data teams worked well because they served high-ego managers or status-blind engineers who didn’t feel threatened, which made them the exception rather than the rule?

What if the “right data product” is (as you’ve danced around earlier) actually a “service”: data teams as independent agencies acting as confidential partners advising clients -- rather than employees second-guessing their bosses?

A bitter pill if true, but I’d argue the likelihood I’m right is probably north of 40%. Still, the optimistic take is that we actually ARE right to push for taking data seriously, but we are currently stuck in an obsolete paradigm.

The hopeful take is that the modern data team is actually the forerunner of a new kind of human organization, as different from 1950s corporations as they were from the British East India Company. And that this new form will inevitably outcompete the old ones.

Here’s hoping we can stay solvent longer than the market stays irrational...


I kinda could see that? On the edges, for sure, I think there are places where data is seen as a threat to the powers that be (in sports, for instance, old school coaches often scoff at "the analytics"). But if data lived up to its promise, they could just as easily use it to entrench themselves further as be dislodged by it.

Take finance for instance. I suspect plenty of the pre-quant old guard finance people actually did just fine in the quant world—they were in charge, and once quant finance proved to be valuable, they started using it themselves.

Business leaders seem the same to me. Rather than assuming they're irrational, we should assume that they are rational. And they've made the decision that they don't need data to maintain their power. To me, that says more about the data than it does about them.
