The crazy part is this can apply WITHIN products… try doubling down on Power BI/Fabric… the paradox of choice will get you, given the vast array of ways of doing things within that “product” alone… not to mention cloud means you can’t hold back the tide by staying on an old version… you just come in to work one day and the product has changed.
The proliferation of tools and patterns has screwed a lot of people career-wise as well… Imagine you’re 40, just out of a 15-year gig at a conservative company that did everything on-prem in SQL Server, SSAS and maybe this newfangled Power BI thing… then you’re interviewed by a 26-year-old head of analytics with 3 years’ experience who has the hots for their flavour of the MDS, half of which you’ve never heard of…
It's always sort of weird to me how tools serve as that sort of signaling. It's like applying for a job with an @aol.com email address or something. I don't think it actually means much, but people definitely make a bunch of judgements about it.
Well, I feel seen, as the one in the room with too many miles of data career behind them. The funny thing about doubling down within products is that I think what ends up happening is a lot of "just because you can, doesn't mean you should." It's arguable whether it actually gives anything a competitive edge.
My favorite Deming quote: “Ninety-five percent of changes made by management today make no improvement.”
Does this apply to BI and other tooling? Ya, probably so.
Rather than actually being tied into long-term contracts - I think trying to make decisions for 10 years might be a good step. I don’t want to be locked in for 10 years, but I should be asking: does this decision still work 10 years from now?
Eh, that to me feels like it works in the other direction though. I think it's more like, "95% of what we do doesn't matter, so let's not worry about it too much, make a decision, and try to make that decision work." I'd be worried that if you ask things like "will this still be the right thing in 10 years?" you'll become even more paralyzed.
Interesting - ya - I was thinking of it from the angle that nobody can possibly know 10 years from now - so just decide and really stick to it until there is a truly compelling reason not to.
Ah, yeah, fair. If you think of it that way, I think that makes sense.
Reminds me a bit of the Bill Gross argument that your product needs to be 10x better for anyone to consider switching. As products/spaces mature it's impossible to be 10x better barring an industry shift (AI anyone?) so then you're in essence locked in.
Yeah, though I'm not sure that applies to BI as much. I was just talking about this with someone today - BI (and all end-user apps, really) is such a preference-based thing, I don't think tools actually need to be that much better. They just need to be more in line with someone's preference. It's like buying a new car. You probably don't switch from a Ford F150 to a Toyota Tacoma because the Tacoma is 10x better; I bet you switch because you like the style of the new Tacoma. I think that's why the BI market is constantly churning out new products - you can always create something in a different style, and some subset of the market will probably like it.
I get the argument but don't like the comparison to cars since those have a natural upgrade cycle and it's a much more personal decision vs thinking on behalf of a company. I used to think BI was a commodity and still do to an extent but heard a pretty good argument for why there are so many BI tools which adds to your point:
1. Budget owners understand what they're buying. Senior folks understand BI and use it, so they're willing to spend money on it vs some arcane tech wanted by a data engineering team.
2. Lots of users. People like the idea of everyone being data driven, and BI tools will generally be used by anyone at a company who wants to, vs the buried tech tools.
3. Multiple BI tools in one company. Sure there might be a primary tool but different teams/people will end up having their own.
BI tools these days are trying to get more and more lock-in though - both by trying to absorb the semantic layer and by embedding themselves in the org with dashboards/permissions/etc that end up being annoying to replace.
I think I disagree with most of that, actually. I'm not convinced budget owners understand what they're buying at all. At big companies, it's a mix of checkbox Gartner MQ stuff, and of people buying something that they have often barely used in production. In my experience, most enterprise BI deals are sold by sales pitches, not terribly informed buyers.
That's part of why I think preference (and to some extent, a sense of upgrade cycle) is the underlying issue here. To me, the process goes like this:
1. IT gets sold the primary BI tool, but it's not really vetted by the people who use it (partly because it can't be, because, to your point, that's a very big group).
2. Some subgroup doesn't like it, and instead wants the thing that fits their preferred flavor. They bring in a new tool.
3. Repeat step 2 a bunch with other groups.
4. Every BI vendor sees this, tries to expand their footprint, do more stuff, etc. The tool starts to feel bloated and old.
5. So IT decides that they want to reevaluate, because they've had stuff for a while and there's a kind of natural upgrade cycle to enterprise SaaS software, where people sign 2-3 year deals and often do some tire kicking around the end of the deals.
6. Go back to step 1, repeat forever.
That's so depressing! I do like the framework though. Step 6-1 has a lower cycle frequency than 2-3-4 and maybe at some point you get so big that you can't actually justify the 6-1 move so you're stuck with fragmentation caused by 2-3-4.
I don't know how it works at the enterprise level, but at my former company (~600 people) the budget owner was usually the manager of the team and would either be using the tools directly or managing the team that used the tools, so there was more incentive alignment than at other companies.
On the other hand, we did try to run a company-wide process to figure out a "project management" tool, and it turned out every team had their own rating rubric and some teams ended up being unhappy with the result.
That's fair, that's probably how Qlik is still a $500m business or whatever.
In my experience with Mode, that size (500-1000 people) is about when it gets messy. Less than 500, and it seemed like most data teams owned the decision, or were at least very involved in it. More than 1000, though, and it becomes this kind of decentralized IT thing, where every process goes kind of like the project management thing. The exception to that seemed to be when a data team was really well respected and pretty firm on their opinions, and they could wrangle the cats to keep things organized. But usually, it would become more a political choice as much as a product choice at that point.
I could have very easily written footnote 8 myself, less eloquently naturally, but this is generally my take on such things. Anytime companies "agree" to things being "universal," it's usually in the form of some performative collaboration, when really they are trying to edge each other out for advantage, even if it's scraps of NPS or conversion rates. What's interesting about SQL is that every company does its own flavour of it to a certain extent, but overall it's pretty universal. So if I need to write a query in any of the various database platforms, there may be slight differences (which to be fair can be maddening, in a micro-aggressive sort of way), but I generally don't have to start from scratch with a whole new skill set and a new language to learn.
I was thinking about SQL during the week too. I remember being annoyed by switching from PL/SQL to T-SQL to psql as a software dev… but it is nothing compared to the pain of constant relearning I’ve had to do in Data over the last 15 years. Some enduring core standards sure would be nice.
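To illustrate the kind of dialect differences I mean, here's a made-up example (hypothetical orders table with first_name, last_name, and amount columns): the same "top 10 customers by revenue" query in SQL Server's T-SQL and in Postgres. The shape of the query carries over; mostly it's the concatenation operator and the row-limiting syntax that change.

```sql
-- SQL Server (T-SQL): + for string concatenation, TOP for row limiting
SELECT TOP 10
    first_name + ' ' + last_name AS customer,
    SUM(amount) AS revenue
FROM orders
GROUP BY first_name, last_name
ORDER BY revenue DESC;

-- Postgres: || for string concatenation, LIMIT for row limiting
SELECT
    first_name || ' ' || last_name AS customer,
    SUM(amount) AS revenue
FROM orders
GROUP BY first_name, last_name
ORDER BY revenue DESC
LIMIT 10;
```

Annoying, sure, but it's relearning a few operators, not relearning how to think.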
To that point, I guess if you had to choose between companies kinda sorta consolidating around something like Iceberg or them *not* doing that, the first is better, even if it's not entirely genuine. It'd be tough to create something completely different after making such a show about being on board with this open source standard, and I'd guess that makes learning it all a little easier than learning totally new things.
Yeah yeah, whatever, Benn. I'm still switching the org to Jira.
As long as you promise to do it until 2034, I'll allow it.
I agree that seeking optionality for the sake of optionality is actually negative utility. The argument I would make for optionality is that, as a buyer, I want the ability to change tools if the existing provider decides to price gouge me at renewal time. To the point you’re making, that does mean buyers should be more willing to engage in multi year deals.
Yeah, that's another reason why multi-year deals might be good for buyers. It's not just that you can contractually lock in prices; you also have a lot more negotiating power (and, like, relationship-based goodwill) if you're a loyal customer. Vendors are a lot more willing to jack up prices on people they don't know who seem flaky than on long-term customers.
Enjoyed this post!
Embrace and extend is by now quite an old strategy; I doubt if Iceberg will be much different. But it's at least a step in the right direction.
As for tooling, I find myself often going back to the painfully bureaucratic processes that large orgs use: analyzing multiple alternatives, bidding, re-qualifying tools regularly even if migration is not viable (the vendors don't know that), and careful contract and vendor management. My last gig honored these in the breach, and I watched a lot of money go up in flames.
While these ideas might seem too old school, they're no different from spending months defining OECs and running A/B tests to make one feature decision -- you need to be as data-driven with tools as with product features.
Sure, though I'd argue that a lot of companies overthink a lot of stuff that they A/B test (which might be what you're saying?), and the same would apply here. Just choosing something and moving on is probably better than trying to be really particular about what you choose.