“Context Engineering” Is a Better Term Than “Prompt Engineering”, Say Tech Leaders

AI is developing so rapidly that the terms used to describe it are changing just as fast.

Several tech leaders have said that they prefer the term “context engineering” over “prompt engineering”. Prompt engineering refers to crafting a detailed prompt for an AI system in order to get the desired output. But some now believe that “context engineering” better describes the task.

“I really like the term ‘context engineering’ over prompt engineering,” wrote Shopify CEO Tobi Lütke on X. “It describes the core skill better: the art of providing all the context for the task to be plausibly solvable by the LLM,” he added.

Former Tesla Director of AI Andrej Karpathy — who’d previously coined the term “vibe coding” — gave his stamp of approval to the new name. “+1 for ‘context engineering’ over ‘prompt engineering’,” he wrote.

“People associate prompts with short task descriptions you’d give an LLM in your day-to-day use. When in every industrial-strength LLM app, context engineering is the delicate art and science of filling the context window with just the right information for the next step. Science because doing this right involves task descriptions and explanations, few shot examples, RAG, related (possibly multimodal) data, tools, state and history, compacting… Too little or of the wrong form and the LLM doesn’t have the right context for optimal performance. Too much or too irrelevant and the LLM costs might go up and performance might come down. Doing this well is highly non-trivial. And art because of the guiding intuition around LLM psychology of people spirits. On top of context engineering itself, an LLM app has to:

– break up problems just right into control flows

– pack the context windows just right

– dispatch calls to LLMs of the right kind and capability

– handle generation-verification UIUX flows

– a lot more – guardrails, security, evals, parallelism, prefetching

So context engineering is just one small piece of an emerging thick layer of non-trivial software that coordinates individual LLM calls (and a lot more) into full LLM apps. The term ‘ChatGPT wrapper’ is tired and really, really wrong,” Karpathy added.
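The context-assembly step Karpathy describes — filling a fixed token budget with task descriptions, few-shot examples, retrieved documents, and conversation history, and compacting when it overflows — can be sketched in a few lines. This is a purely illustrative toy, assuming a crude character-based token estimate and a drop-oldest-history compaction policy; the function and parameter names are hypothetical, not from any real library.

```python
def rough_token_count(text: str) -> int:
    """Crude token estimate: roughly one token per four characters."""
    return max(1, len(text) // 4)


def build_context(task: str, examples: list[str], retrieved: list[str],
                  history: list[str], budget: int = 1000) -> str:
    """Assemble a context window under a token budget.

    The task description, few-shot examples, and retrieved (RAG) documents
    are treated as fixed; if the total exceeds the budget, the oldest
    history turns are dropped first (a simple form of compacting).
    """
    fixed = ([f"Task: {task}"]
             + [f"Example: {e}" for e in examples]
             + [f"Doc: {d}" for d in retrieved])
    history = list(history)  # copy so we can trim without side effects
    while True:
        parts = fixed + [f"History: {h}" for h in history]
        if sum(rough_token_count(p) for p in parts) <= budget or not history:
            return "\n".join(parts)
        history.pop(0)  # over budget: drop the oldest turn and retry
```

A real pipeline would also score retrieved documents for relevance and summarize dropped history rather than discarding it, but the core trade-off — too little context and the model lacks information, too much and cost rises while performance can fall — is already visible in the budget check above.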

These are good arguments. “Prompt engineering” had begun to be used somewhat derisively by some, implying that prompting an AI system correctly takes little effort or skill. But as models’ context windows have grown — Gemini accepts as much as 1 million tokens — prompts have become far more detailed, and the job now involves specifying exactly what needs to be done in a form the model can understand and act on. With “context engineering” winning Karpathy’s vote, this term may well become the standard description of working with AI systems in the coming years.

Posted in AI