The crazy part is this can apply WITHIN products… try doubling down on Power BI/Fabric… the paradox of choice will get you, given the vast array of ways of doing things within that “product” alone… not to mention cloud means you can’t hold back the tide by staying on an old version… you just come in to work one day and the product has changed.
The proliferation of tools and patterns has screwed a lot of people career-wise as well… Imagine you’re 40, just got out of a 15-year gig at a conservative company that did everything on-prem in SQL Server, SSAS, and maybe this newfangled Power BI thing… then you are interviewed by a 26-year-old head of analytics with 3 years' experience who has the hots for their flavour of the MDS, half of which you’ve never heard of…
It's always sort of weird to me how tools serve as that sort of signaling. It's like applying to a job with a @aol.com email address or something. I don't think it actually means much, but people definitely make a bunch of judgements about it.
Well, I feel seen, as the one with too many miles of data career behind them in the room. The funny thing about doubling down within products is that what ends up happening is a lot of "just because you can, doesn't mean you should." It's arguable whether it actually gives anything a competitive edge.
Yeah yeah, whatever, Benn. I'm still switching the org to Jira.
As long as you promise to do it until 2034, I'll allow it.
I agree that seeking optionality for the sake of optionality is actually negative utility. The argument I would make for optionality is that, as a buyer, I want the ability to change tools if the existing provider decides to price gouge me at renewal time. To the point you’re making, that does mean buyers should be more willing to engage in multi-year deals.
Yeah, that's another reason why multi-year deals might be good for buyers. It's not just that you can contractually lock in prices; you also have a lot more negotiating power (and, like, relationship-based goodwill) if you're a loyal customer. Vendors are a lot more willing to jack up prices on people they don't know who seem flaky than on long-term customers.
Enjoyed this post!
Embrace and extend is by now quite an old strategy; I doubt Iceberg will be much different. But it's at least a step in the right direction.
As for tooling, I find myself often going back to the painfully bureaucratic processes that large orgs use: analyzing multiple alternatives, bidding, re-qualifying tools regularly even if migration is not viable (the vendors don't know that), and careful contract and vendor management. My last gig honored these more in the breach than in the observance, and I watched a lot of money go up in flames.
While these ideas might seem too old school, they're no different from spending months defining OECs (overall evaluation criteria) and running A/B tests to make one feature decision -- you need to be as data-driven with tools as with product features.
Sure, though I'd argue that a lot of companies overthink a lot of stuff that they A/B test (which might be what you're saying?), and the same would apply here. Just choosing something and moving on is probably better than trying to be really particular about what you choose.
My favorite Deming quote: “Ninety-five percent of changes made by management today make no improvement.”
Does this apply to BI and other tooling? Yeah, probably so.
Rather than actually being tied into long-term contracts, I think trying to make decisions on a 10-year horizon might be a good step. I don’t want to be locked in for 10 years, but I should ask: does this decision still work 10 years from now?
Reminds me a bit of the Bill Gross argument that your product needs to be 10x better for anyone to consider switching. As products/spaces mature it's impossible to be 10x better barring an industry shift (AI anyone?) so then you're in essence locked in.
I could have very easily written footnote 8 myself (less eloquently, naturally), but this is generally my take on such things. Any time companies "agree" to things being "universal," it's usually in the form of some performative collaboration, when really they are trying to edge each other out for advantage, even if it's for scraps of NPS or conversion rates. What's interesting about SQL is that every company does its own flavour of it to a certain extent, but overall it's pretty universal. So if I need to write a query in any of the various database platforms, there may be slight differences (which, to be fair, can be maddening in a micro-aggressive sort of way), but I generally don't have to start from scratch with a whole new skill set and language.
I was thinking about SQL during the week too. I remember being annoyed by switching from pl/sql to t-sql to psql as a software dev… but it is nothing compared to the pain of constant relearning I’ve had to do in Data over the last 15 years. Some enduring core standards sure would be nice.
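To make the dialect point above concrete, here's a small illustrative sketch (not from the thread; the table and column names are hypothetical): the same "top five customers by revenue" query in three dialects. The shared core of the language survives everywhere; what moves around is the row-limiting syntax.

```python
# Illustrative only: one logical query, three SQL dialects.
# "customers", "name", and "revenue" are made-up names for the example.

queries = {
    # T-SQL (SQL Server): TOP goes before the select list
    "t-sql": "SELECT TOP 5 name FROM customers ORDER BY revenue DESC",
    # PostgreSQL (and many others): LIMIT goes at the end
    "postgres": "SELECT name FROM customers ORDER BY revenue DESC LIMIT 5",
    # Oracle (12c and later): FETCH FIRST row-limiting clause
    "oracle": "SELECT name FROM customers ORDER BY revenue DESC "
              "FETCH FIRST 5 ROWS ONLY",
}

# The common skeleton is identical across dialects; only the limit differs.
for dialect, sql in queries.items():
    assert "FROM customers" in sql and "ORDER BY revenue DESC" in sql
```

Annoying to remember, sure, but it's a far cry from relearning a language from scratch each time.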