Discussion about this post

Drew Beaupre

Our brains aren’t very good at comprehending how much knowledge can fit into a 200k-token context window. In terms of football fields and 747s, each question to an LLM can contain nearly two complete copies of The Lord of the Rings.

https://open.substack.com/pub/drewbeaupre/p/give-the-ai-the-full-picture

So what if we embrace this? What changes if we can 10x the context size?

Wen

Using the thing we currently call AI to build things wholesale feels a lot like trying to scale with hardware instead of logic/design.

