The scaffolding era is over. LlamaIndex, a leader in retrieval-augmented generation (RAG) frameworks, argues that context is the new moat. As developers witness the collapse of traditional scaffolding layers—indexing, query engines, and agent loops—LlamaIndex CEO Jerry Liu insists this shift is intentional and beneficial. For engineers and founders navigating the evolving landscape, understanding this transition is crucial.
What LlamaIndex Does
LlamaIndex connects private, custom, and domain-specific data to large language models (LLMs). Historically, developers relied on complex frameworks to build deterministic workflows. But according to Liu, the need for such frameworks is diminishing. Modern LLMs are increasingly adept at reasoning over massive amounts of unstructured data, offering capabilities like self-correction and multi-step planning. With tools like the Model Context Protocol (MCP) and Claude's Agent Skills, these models can autonomously discover and use tools, bypassing the need for hand-built, one-off integrations.
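To make the tool-discovery point concrete, here is a minimal sketch of the MCP exchange involved. MCP messages are JSON-RPC 2.0, and the `tools/list` method returns every tool a server exposes, which is what lets a model pick tools at runtime rather than relying on pre-wired integrations. The tool name and response below are canned for illustration, not from a real server.

```python
import json

def make_tools_list_request(request_id: int) -> str:
    """Build the JSON-RPC request a host sends to an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

def parse_tools_list_response(raw: str) -> list[dict]:
    """Extract the tool descriptors (name, description, input schema)."""
    return json.loads(raw).get("result", {}).get("tools", [])

# A canned response shaped like an MCP server's answer (illustrative only).
canned = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [
        {"name": "search_docs",
         "description": "Full-text search over indexed documents",
         "inputSchema": {"type": "object",
                         "properties": {"query": {"type": "string"}}}},
    ]},
})

tools = parse_tools_list_response(canned)
print([t["name"] for t in tools])
```

The key shift is that the host never hard-codes what `search_docs` does; it forwards the discovered descriptors to the model, which decides when and how to call them.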
Liu points out that coding agents are now proficient at writing code, reducing the reliance on extensive libraries. Roughly 95% of LlamaIndex's code is AI-generated, he notes, highlighting a shift in which natural language becomes the new programming language.
Competitive Context and Market Landscape
As the stack collapses, the differentiator becomes context. LlamaIndex is betting on its agentic document processing capabilities, including optical character recognition (OCR), to provide higher accuracy and cost-effective parsing. The ability to extract relevant information from various file formats is increasingly vital. Liu emphasizes that regardless of whether developers use OpenAI Codex or Claude Code, the critical element is context.
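The "extract relevant information from various file formats" step can be sketched as a small pipeline. In production, agentic document processing would combine OCR with an LLM; here a regex stands in for that extraction step so the pipeline shape is runnable. The field names and sample invoice are illustrative assumptions, not LlamaIndex's actual API.

```python
import re

def extract_invoice_fields(text: str) -> dict:
    """Pull structured fields out of messy document text.

    A regex stands in for the OCR + LLM extraction step that an
    agentic document pipeline would actually perform.
    """
    total = re.search(r"Total:\s*\$?([\d.]+)", text)
    date = re.search(r"Date:\s*(\d{4}-\d{2}-\d{2})", text)
    return {
        "total": float(total.group(1)) if total else None,
        "date": date.group(1) if date else None,
    }

sample = "Invoice\nDate: 2025-03-14\nItems: widgets x3\nTotal: $129.50"
fields = extract_invoice_fields(sample)
print(fields)  # {'total': 129.5, 'date': '2025-03-14'}
```

The value is in the output contract: once a document is reduced to typed fields, any downstream model or tool can consume it, regardless of which vendor's coding agent sits on top.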
However, there’s a growing concern about companies like Anthropic locking in session data. Liu advocates for modularity and agnosticism in tech stacks. Enterprises should avoid overbuilding and ensure their code bases remain adaptable, acknowledging that parts of the stack may need to be discarded as new models emerge.
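The modularity Liu advocates can be sketched as a narrow interface between application code and the model vendor. The class and method names below are illustrative, not a real library API; the point is that swapping providers becomes a one-line change at the call site rather than a rewrite.

```python
from typing import Protocol

class ChatModel(Protocol):
    """The only surface application code is allowed to depend on."""
    def complete(self, prompt: str) -> str: ...

class StubOpenAIModel:
    # Stand-in for a real vendor SDK client (illustrative stub).
    def complete(self, prompt: str) -> str:
        return f"[openai-stub] {prompt}"

class StubAnthropicModel:
    # A second vendor behind the same interface.
    def complete(self, prompt: str) -> str:
        return f"[anthropic-stub] {prompt}"

def summarize(doc: str, model: ChatModel) -> str:
    # Application logic depends only on the Protocol, never the vendor.
    return model.complete(f"Summarize: {doc}")

print(summarize("quarterly report", StubOpenAIModel()))
print(summarize("quarterly report", StubAnthropicModel()))
```

Keeping the vendor behind an interface like this is what lets an enterprise discard one layer of the stack when a better model ships, without touching the rest of the codebase.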
Implications for Founders and Engineers
For founders and engineers, the message is clear: adaptability is key. As agent patterns converge on a managed-agent paradigm, with consolidated harness layers and tool integrations, the focus should be on maintaining flexibility. Liu advises against betting on a single model or overcomplicating components, as the landscape shifts with each new model release.
Looking ahead, the emphasis on context will reshape how developers and companies approach AI integration. As the scaffolding layers dissolve, the opportunity lies in leveraging context to unlock data previously trapped in file formats. For those in the tech industry, staying informed about these shifts and preparing to pivot quickly will be essential.
For engineers, this means honing skills in natural language interactions and understanding the nuances of context-driven AI. For founders, it’s about ensuring their tech stack remains modular and ready to adapt to new advancements. Investors should watch companies that prioritize context and modularity, as these will likely lead the next wave of AI innovation.