9 Comments
Mark Hovde:

AI is very good where your query can be answered using a region of robust tokens. It is not very good when it has to rely on thin training data. Maddeningly, it speaks equally confidently in both cases!

8Lee:

It is absolutely wild to me how many folks literally took the "AI is like an intern" mantra and never even considered test-driving that sentiment for themselves.

Turns out, AI is more than an intern, and those of us who have worked with it in any material way know that we can never, EVER, go back.

For instance: after successfully building a ~20-step build and compilation process across 8+ different software languages, one that updates libraries, reviews core documentation, and sends the binary out for digital signatures and certification (i.e. macOS signing) into a verification pipeline with contextual reviews and checks for silent failures... ... ... all in a SINGLE COMMAND, I will never go back.

And why would I? This has saved me countless hours since deploying.
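A single-command pipeline like the one described above can be sketched minimally; the step names below are hypothetical stand-ins (the commenter's actual toolchain and ~20 steps aren't shown), and `echo` is used so the sketch is runnable anywhere:

```python
import subprocess

# Hypothetical stages standing in for the real ~20-step pipeline
# (library updates, compilation, signing, verification, etc.).
STEPS = [
    ["echo", "update-libraries"],
    ["echo", "compile"],
    ["echo", "sign-binary"],
    ["echo", "verify"],
]

def run_pipeline(steps):
    """Run each step in order, capturing output so silent failures surface."""
    for step in steps:
        result = subprocess.run(step, capture_output=True, text=True)
        # A zero exit code alone isn't enough: also require the step to
        # have produced output, guarding against "silent" failures.
        if result.returncode != 0 or not result.stdout.strip():
            raise RuntimeError(f"step failed: {' '.join(step)}")
    return len(steps)

run_pipeline(STEPS)  # one command drives every stage
```

The one design point worth noting is the double check inside the loop: verifying exit codes catches loud failures, while checking for expected output is one simple way to catch the silent ones the comment mentions.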

Anyone who says "AI is like an ___[fill-in-the-blank]___" is immediately suspect to me. If you ask them how they know, they fold like a bad hand.

It's just funny, that's all.

DJ:

AI is like the HR specialist who informs you of the RIF.

8Lee:

I literally LOL'd. Thank you.

melee_warhead:

I don't feel like AI has surpassed me, but I can imagine a world where life gets hard. AI may not be smarter than me, but it is faster than me. Even if I am better at articulating the right thing for the right reason, AI (if empowered) could probably do 20 things in less than the time it takes for me to do 1 thing.

Clare Frances:

nothing comment:

my job is weird and specific so i don’t really have this experience of the tools. but i also can’t imagine my job without them now

and to be fair interns are usually smarter than the boss and do most of the work

Bat:

What does it say about people that we would rather take directions from a robot than make decisions for ourselves? Why are we doing so much work whose details we don’t actually care about? What are we accomplishing, if we don’t want to be involved in it?

If your job and life can be replaced by AI, it’s probably a signal that you should be doing higher-order work. Regardless of whether AI is like an intern or a team of genius specialists, it needs to be managed.

Tony Siewert:

Might be a good time to start checking out the English translation of the magnum opus of Günther Anders, The Obsolescence of the Human. It came out 70 years ago (!) and was finally translated into English last December.

The core thesis: we've been obsolete for quite some time now, at least since the atomic bomb, which nowadays is already pretty much "old news." Technology has taken the place of our common divinity after God was proclaimed dead. Technology is the driver of history, and it has been for quite a while.

So all AI is doing currently is raising questions that have been around for decades. It's just that now it's so obvious that humanity is obsolete that you can read about it everywhere. Except most of these thoughts and ideas still seem afraid of going deep.

If you really start thinking about it, we have been playing catch-up for a long time. And we will continue to do so, some slightly more successfully, some less so.

Jose Nilo:

Sometimes I wonder about the parallel between using an LLM and a calculator. The latter is much more capable than I am at solving simple math calculations. I 'agree' to save my brainpower by 'delegating' those tasks to the calculator: you know, square roots, net present value, those things. Calculators save me energy (thinking like an economist, in macroeconomic terms).
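The net-present-value calculation mentioned above is exactly the kind of mechanical work being delegated; a minimal sketch, with illustrative cash flows that are not from the comment:

```python
def npv(rate, cashflows):
    """Net present value: discount each cash flow back to period 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# An initial outlay of -100 followed by three inflows of 50, at a 10% rate.
print(round(npv(0.10, [-100, 50, 50, 50]), 2))
```

Nothing here is hard, which is the point: it is rote discounting that a machine does faster and more reliably than mental arithmetic.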