Discussion about this post

John Wessel:

“If extracting an answer from a BI tool takes too long—i.e., minutes instead of seconds—people aren’t able to cycle through enough trials to cover meaningful ground. Rather than recognizing this as incomplete analysis, we score it as bad analysis, and put guardrails around it to protect people from making mistakes. Do the mundane stuff, we say, and data teams will do the meaningful work. Shoot your own home videos, and leave the important content to the professionals.”

That’s gold! I got just a taste of this when Tableau first came out. Once I got relatively good at Tableau, there were brief moments when I could move at the speed of thought, only to crash and burn a few minutes later, stuck trying to figure out how to do something silly like formatting an axis. I agree that LLMs have the potential to offer a more sustainable “speed of thought” interface with less crashing and burning.

Ian Thomas:

Ironically, it seems that even the world of microwaves cannot escape the lure of AI:

https://twitter.com/eumweek/status/1671934323048562710
