Discussion about this post

M A

+1 for Krazam

Drew Beaupre

Our brains aren’t very good at comprehending how much knowledge can fit into a 200k-token context window. In terms of football fields and 747s, each question to an LLM can contain nearly two complete copies of The Lord of the Rings.

https://open.substack.com/pub/drewbeaupre/p/give-the-ai-the-full-picture

So what if we embrace this? What changes if we can 10x the context size?

