Discussion about this post

Nathaniel:

Cool insight! Gen Patten came to a similar conclusion. I might have to start running a timer for myself now. While this is definitely a great way to reframe self-evaluation, or perhaps to compare individuals, do you think this would be a good metric for a company to measure its analytics team's performance? Could it introduce something like response-time quotas that become decoupled from realistic expectations, as sales quotas often do?

ASF:

One issue I can think of is each team's data maturity. For example, if I have a business that is just starting to analyze its data and doesn't have much reporting, the questions tend to be simpler, and there are tons of things to optimize (because nobody was looking). As the business grows in complexity, many of the simpler decisions get automated (dashboards, data cubes, etc.), and the questions become more and more complex.

If I measure how long the data team takes to reply to a question, I could be measuring the increase in business complexity instead of the data team's performance. Also, how can I measure the efficiency provided by questions that are answered automatically via dashboards, alerts, or algorithms?
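To make the metric under discussion concrete, it amounts to something like median time-to-answer over a log of questions. A minimal sketch, assuming a hypothetical log of `(asked, answered)` timestamp pairs (the data and field names here are illustrative, not from any real team):

```python
from datetime import datetime
from statistics import median

# Hypothetical question log: (asked, answered) timestamp pairs.
questions = [
    (datetime(2023, 1, 2, 9, 0), datetime(2023, 1, 2, 11, 0)),  # 2 h
    (datetime(2023, 1, 3, 9, 0), datetime(2023, 1, 3, 17, 0)),  # 8 h
    (datetime(2023, 1, 4, 9, 0), datetime(2023, 1, 5, 9, 0)),   # 24 h
]

# Median time-to-answer in hours. Note the caveat above: a rising
# median may reflect growing business complexity, not a slower team.
hours = [(done - asked).total_seconds() / 3600 for asked, done in questions]
print(median(hours))  # 8.0
```

The caveat about automation also shows up here: questions answered by a dashboard or alert never enter the log at all, so the sample silently shifts toward the hardest questions over time.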

13 more comments...
