The Percolator

Brewed for Work

From Hype to Obsolescence: The Prompt Engineer’s Short Reign

Explore how prompt engineering surged in 2023 and rapidly declined by 2025—and why context and interface engineering are now the true AI frontier.

Jul 15, 2025

In early 2023, “Prompt Engineer – $300K” glittered atop Glassdoor listings like a golden ticket. Startups advertised in-house “Prompt Labs” where wordsmiths tweaked ChatGPT queries for peak performance. Venture capitalists shelled out for boutique prompt agencies, and LinkedIn brimmed with humanities majors turned AI whisperers.

The logic was simple: if you couldn’t build an LLM, you could at least master its language.

Two years later, search “prompt engineer” on any job board and you’ll find nothing but echoes. The discipline that once commanded six-figure salaries has been relegated to footnotes. Andrej Karpathy famously quipped on X in January 2023 that “the hottest new programming language is English,” a rallying cry for prompt-craft. Now, Karpathy and Shopify’s Tobi Lütke have both endorsed “context engineering” as the true skill of 2025. “I really like the term ‘context engineering’ over prompt engineering,” Lütke tweeted: “the art of providing all the context for the task to be plausibly solvable by the LLM.”

This isn’t hype fatigue or a paradigm reversal—it’s evolution.

Early LLMs were brittle, punishing any imprecision. Prompt engineers patched those gaps with handcrafted token sequences. But modern models internalize reasoning chains, handle vast context windows, and integrate tool calls natively. The “keyboard” of prompting is fading; in its place emerges a robust engineering discipline that treats LLMs as components in larger, stateful systems.


In this issue of Brewed for Work, we will examine prompt engineering’s meteoric ascent as the must‑have skill of 2023, chart its rapid decline by 2025, and illuminate the shift toward context and interface engineering. We’ll revisit real‑world case studies, cite industry leaders like Andrej Karpathy and Tobi Lütke, and dissect the technological advances—larger context windows, retrieval frameworks, and tool orchestration—that rendered prompt tinkering obsolete. Prepare for a forward‑looking blueprint to master AI’s next frontier.

So grab your favorite mug, and let's get brewing!

Today’s Issue at a Glance:
  • The Golden Reign of Prompt Engineers

  • Cracks in the Facade

  • The Fall of the Mighty

  • Beyond Prompt: The Rise of Interface & Context Thinking

Welcome to Brewed for Work, a 🔒subscribers-only🔒 offering from The Percolator dedicated to professional growth and upskilling. Each week we share essays, insights, and resources to aid you in your work life.


Prompt engineering began as guerrilla UX for nascent large language models. Soon after GPT-3’s mid-2020 release, developers on Reddit and Twitter traded tricks for coaxing it, and by 2022 a simple nudge, “Let’s think step by step,” was shown to unlock far more coherent reasoning. Chain-of-thought prompting, which interleaves example reasoning steps with queries, soon became a staple technique, celebrated in research papers and The Guardian alike as a breakthrough for complex tasks. Practitioners packaged these hacks into marketplaces like PromptBase and taught them in $200 Udemy courses, marketing “Prompt Mastery” to anyone desperate for AI leverage.
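To make the technique concrete, here is a minimal sketch of how a chain-of-thought prompt was typically assembled: a worked example with explicit reasoning steps, followed by the real question and the step-by-step nudge. The example questions and helper name are illustrative, and the actual model call is omitted.

```python
# One hand-written worked example, with the reasoning spelled out.
COT_EXAMPLE = (
    "Q: A cafe sells 3 coffees at $4 each. What is the total?\n"
    "A: Each coffee costs $4. 3 * 4 = 12. The total is $12.\n"
)

def build_cot_prompt(question: str) -> str:
    """Interleave the worked example with the new query, then add the nudge."""
    return f"{COT_EXAMPLE}\nQ: {question}\nA: Let's think step by step."

prompt = build_cot_prompt("A train travels 60 mph for 2 hours. How far does it go?")
print(prompt)
```

The whole trick was in the string: the model, conditioned on the example’s visible reasoning, was far more likely to show its own work before answering.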

Enterprises rushed in. Boardrooms with no in-house AI talent saw prompt engineering as an expedient bridge to value: outsource the “linguistic interface” rather than build your own models. Humanities grads pivoted into gig work, crafting prompts for legal briefs, marketing copy, and even software scaffolding.

LLMs, for all their statistical power, needed precise coaxing—and prompt engineers delivered.

Yet every stopgap carries the seeds of obsolescence. As OpenAI, Anthropic, and Google rolled out GPT-4, Claude 3, and PaLM 2, models gained robust reasoning, integrated function calling, and vastly expanded context windows. No longer did you need elaborate prompt gymnastics; you could feed structured data, invoke APIs, and let the model’s internal architecture handle logic. Prompt engineering shifted from art to maintenance—tweaking dozens of brittle templates whenever performance dipped.
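A rough sketch of what “integrated function calling” replaced prompt gymnastics with: the application declares a tool schema, the model (not shown here) emits a structured call instead of free text, and the application dispatches it. The tool name, schema, and model output below are hypothetical stand-ins, not any vendor’s actual API.

```python
import json

# The application advertises a tool with a machine-readable schema.
TOOLS = [{
    "name": "get_order_status",
    "description": "Look up the status of an order by its ID.",
    "parameters": {
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
}]

def get_order_status(order_id: str) -> str:
    # Stand-in for a real database lookup.
    return f"Order {order_id}: shipped"

DISPATCH = {"get_order_status": get_order_status}

# A structured call, as a function-calling model might emit it.
model_output = '{"name": "get_order_status", "arguments": {"order_id": "A-17"}}'
call = json.loads(model_output)
result = DISPATCH[call["name"]](**call["arguments"])
print(result)  # Order A-17: shipped
```

The point is the division of labor: the schema and dispatch table live in ordinary code, so there is no brittle prompt template to babysit when the model changes.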

Wikipedia today notes that “the job of prompt engineer has become obsolete due to models that better intuit user intent.” Indeed, leading practitioners concede that the high-stakes game of prompt crafting was always meant as a transitional UX paradigm—useful only until models and tooling matured. We now stand at that pivot point.

Prompt engineering peaked as a temporary lingua franca for early LLMs and has been superseded by context engineering—a discipline that assembles system messages, memory stores, tool orchestration, and retrieval pipelines into coherent, end-to-end workflows.
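What that assembly looks like in practice can be sketched in a few lines: system instructions, conversation memory, and retrieved documents composed into a structured message list rather than one hand-tuned prompt string. The retrieval step is stubbed with naive keyword overlap, and all names here are illustrative assumptions.

```python
def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Stand-in for a vector-store lookup: rank docs by keyword overlap.
    words = query.lower().split()
    return sorted(corpus, key=lambda d: -sum(w in d.lower() for w in words))[:k]

def build_context(query: str, system: str, memory: list[dict], corpus: list[str]) -> list[dict]:
    """Assemble system message, prior turns, and retrieved docs into one payload."""
    docs = retrieve(query, corpus)
    messages = [{"role": "system", "content": system}]
    messages += memory  # prior conversation turns, persisted by the app
    messages.append({
        "role": "user",
        "content": "Context:\n" + "\n".join(docs) + f"\n\nQuestion: {query}",
    })
    return messages

corpus = [
    "Refund policy: refunds within 30 days.",
    "Shipping takes 5 days.",
    "We are closed Sundays.",
]
msgs = build_context("What is the refund policy?", "You are a support agent.", [], corpus)
print(msgs[-1]["content"])
```

The craft shifts from wording a prompt to deciding what belongs in the window at all: which documents, how much memory, which tools. That is an engineering question, not a linguistic one.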

© 2025 The Percolator