Cloudflare Says It Used AI To Rebuild Next.js In 1 Week For $1,100

AI isn’t just helping out with coding — it’s completing entire projects at a pace and price point that would’ve been unthinkable even a year ago.

Cloudflare this week published a blog post detailing how one of its engineers rebuilt Next.js — the most widely used framework for building React-based web applications — from scratch in under seven days, spending roughly $1,100 in AI model costs along the way. The result is an open-source project called vinext (pronounced “vee-next”), and the story behind its creation says something significant about where AI-assisted software development is headed.

Why Next.js Is Such a Big Deal — and Such a Hard Target

To understand the scale of what Cloudflare pulled off, it helps to understand what Next.js actually is. Built by Vercel and used by millions of developers worldwide, Next.js is the dominant framework for building modern web applications with React. It handles a staggering range of responsibilities: file-based routing (so that creating a file automatically creates a web page), server-side rendering, React Server Components (a newer architecture for running parts of your app on the server), streaming, caching, middleware, and a full development server with hot-reloading.
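To make "creating a file automatically creates a web page" concrete, here is a minimal sketch of the file-to-route convention Next.js popularized. The function name and regexes are illustrative only — this is not code from Next.js or vinext.

```typescript
// Sketch of file-based routing: map a file path under pages/ to the URL
// route it would serve. Illustrative only, not a real framework's code.
function filePathToRoute(filePath: string): string {
  let route = filePath
    .replace(/^pages/, "")             // strip the pages/ directory prefix
    .replace(/\.(tsx|jsx|ts|js)$/, "") // drop the file extension
    .replace(/\/index$/, "");          // pages/about/index.tsx -> /about
  // Dynamic segments: [slug].tsx becomes the route parameter :slug
  route = route.replace(/\[([^\]]+)\]/g, ":$1");
  return route === "" ? "/" : route;
}

console.log(filePathToRoute("pages/index.tsx"));       // "/"
console.log(filePathToRoute("pages/blog/[slug].tsx")); // "/blog/:slug"
```

A real implementation also has to handle nested layouts, catch-all segments, and route groups, which is part of why reimplementing the framework is so much work.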

This is not a small codebase. Rebuilding it would normally be described as a multi-year, multi-team undertaking. Several engineering teams have attempted portions of it before. The complexity is one reason why an entire separate project — called OpenNext — exists just to adapt Next.js’s output for deployment on platforms other than Vercel, Cloudflare’s main competitor in the hosting space. OpenNext itself requires substantial ongoing engineering effort and still struggles to keep pace with changes in Next.js between versions.

Cloudflare has been working on OpenNext too. But the company ultimately decided the adapter approach was fragile, and wondered: what if they just reimplemented the whole thing?

The $1,100 Rebuild

On February 13, 2026, a Cloudflare engineering manager sat down with an AI coding assistant and started building. By the end of that first evening, both of Next.js’s main routing systems (the older Pages Router and the newer App Router) had basic server-side rendering working, along with support for middleware and server actions. By the following afternoon, the new framework was rendering 10 out of 11 routes in Next.js’s own official demo application. By day three, a single command was shipping complete web applications to Cloudflare’s global infrastructure.

The project, vinext, is built on top of Vite — a widely adopted build tool that powers frameworks like Nuxt, SvelteKit, and Astro, but which Next.js has notably never used. Rather than wrapping or patching Next.js, vinext reimplements its API surface entirely: the same file structure, the same configuration format, the same component conventions, but built on a completely different technical foundation.

Over the course of the project, Cloudflare ran more than 800 AI coding sessions. Total cost in API tokens: approximately $1,100.

“A project like this would normally take a team of engineers months, if not years,” the post reads. “We tried once at Cloudflare. The scope is just enormous.”

What the Numbers Look Like

Early benchmarks show vinext building production applications up to 4.4x faster than Next.js 16 when using Vite 8’s Rolldown bundler (which is written in Rust, a systems programming language known for speed). Client-side JavaScript bundles — the code that gets sent to users’ browsers — come in 57% smaller on average in testing.

These numbers come from a single 33-route test application and Cloudflare is transparent that they’re directional, not definitive. But the structural reasons for the performance gap are real: Vite’s architecture, and especially Rolldown, has fundamental advantages in build speed over Turbopack, the custom build toolchain Vercel developed specifically for Next.js.

Cloudflare is also deploying a feature called Traffic-aware Pre-Rendering (TPR) that takes a genuinely clever approach to one of Next.js’s biggest pain points. Large sites built with Next.js can have 10,000, 50,000, or even hundreds of thousands of pages that all need to be pre-rendered at build time — a process that scales linearly and can push build times to 30 minutes or more. vinext’s TPR solves this by querying Cloudflare’s own traffic analytics at deploy time and only pre-rendering the pages that actually receive meaningful traffic. For most large sites, 90% of traffic goes to 50–200 pages. Those get pre-rendered in seconds. Everything else is generated on-demand and cached after the first visit.
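The core idea behind TPR can be sketched in a few lines: sort routes by traffic, pre-render the smallest set that covers most requests, and leave the long tail to on-demand rendering. The names, data shape, and the 90% threshold below are illustrative assumptions, not vinext's actual implementation.

```typescript
// Sketch of the traffic-aware selection at the heart of TPR: pre-render
// only enough routes to cover a target share of traffic. Illustrative only.
interface RouteTraffic {
  route: string;
  hits: number;
}

function selectRoutesToPrerender(
  traffic: RouteTraffic[],
  coverage = 0.9 // cover 90% of observed traffic
): string[] {
  const total = traffic.reduce((sum, r) => sum + r.hits, 0);
  const sorted = [...traffic].sort((a, b) => b.hits - a.hits);
  const selected: string[] = [];
  let covered = 0;
  for (const r of sorted) {
    if (covered >= coverage * total) break; // target reached
    selected.push(r.route);
    covered += r.hits;
  }
  return selected; // everything else renders on-demand and is cached
}

const stats: RouteTraffic[] = [
  { route: "/", hits: 8500 },
  { route: "/pricing", hits: 1000 },
  { route: "/blog/old-post", hits: 300 },
  { route: "/blog/ancient-post", hits: 200 },
];
console.log(selectRoutesToPrerender(stats)); // [ "/", "/pricing" ]
```

With a realistic traffic distribution, a 50,000-page site would pre-render only a few hundred routes this way, which is why build time stops scaling with page count.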

How the AI Actually Worked

The engineering manager behind the project, Steve Faulkner, is candid about the process. It wasn’t simply typing a prompt and receiving a finished framework. The workflow was more structured: define a specific task (for example, “implement the navigation module with these specific functions”), let the AI write the code and tests, run the test suite, and either merge or feed the error output back to the AI for another iteration.

Critically, the project was set up with the kinds of guardrails that force quality — over 1,700 automated unit tests and 380 end-to-end browser tests, many ported directly from Next.js’s own test suite. Continuous integration ran all of them on every code change. AI agents also handled code review, opening a loop where one agent wrote code, another reviewed it, and a third addressed the review comments.
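The task-then-iterate workflow described above — generate code, run the tests, merge on green or feed the failures back — can be sketched as a simple loop. The `generate` and `runTests` callbacks below are placeholders standing in for an AI coding call and a CI test run; none of this is Cloudflare's actual tooling.

```typescript
// Sketch of the iterate-until-green loop: write code against a task,
// run the test suite, and feed failures back for another attempt.
// generate() and runTests() are hypothetical stand-ins.
type TestResult = { passed: boolean; errors: string[] };

function iterateTask(
  task: string,
  generate: (prompt: string) => string,
  runTests: (code: string) => TestResult,
  maxIterations = 5
): string | null {
  let prompt = task;
  for (let i = 0; i < maxIterations; i++) {
    const code = generate(prompt);
    const result = runTests(code);
    if (result.passed) return code; // green: merge
    // Red: append the failing output and try again
    prompt = `${task}\nPrevious attempt failed:\n${result.errors.join("\n")}`;
  }
  return null; // give up and hand the task back to a human
}
```

The `maxIterations` cap reflects the human judgment the post describes: recognizing when the AI is heading down a dead end and pulling it out of the loop.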

“When you give AI good direction, good context, and good guardrails, it can be very productive,” Faulkner writes. “But the human still has to steer.”

He notes that the AI occasionally produced plausible-looking code that didn’t actually match real Next.js behavior — confidently wrong rather than obviously broken. Architecture decisions, prioritization, and recognizing when the AI was heading down a dead end all required human judgment. The AI was the hands; the human was the head.

What made this particular project especially well-suited for AI assistance was a confluence of factors: Next.js is extensively documented and widely discussed, meaning AI models have seen enormous amounts of material about how it works. It has a comprehensive test suite that could serve as a mechanical specification. Vite provided a strong, well-understood foundation so the AI didn’t have to invent core infrastructure. And current frontier models, according to Faulkner, can hold a large codebase’s architecture coherently in context in a way that earlier models couldn’t.

“We don’t think this would have been possible even a few months ago,” he writes.

What This Means Beyond One Project

This isn’t the first time AI has completed a coding project that would otherwise have taken months. Just last month, Cursor announced that it had built an entire browser in one week of uninterrupted use of GPT 5.2; rebuilding Next.js is a similar achievement. Faulkner reflects on this directly in the post: most of the abstraction layers in modern software exist because individual humans couldn’t hold entire complex systems in their heads at once. Frameworks got built on top of frameworks. Wrapper libraries proliferated. Enormous amounts of “glue code” connected one layer to the next. Much of this complexity, he argues, was a response to human cognitive limits rather than a reflection of what the problem actually required.

AI doesn’t have the same limits. Given a specification, a solid foundation, and good guardrails, it can write everything in between — without needing intermediate scaffolding designed to help humans stay organized.

“It’s not clear yet which abstractions are truly foundational and which ones were just crutches for human cognition,” Faulkner writes. “That line is going to shift a lot over the next few years.”

For software developers, that’s either an exciting statement or an unsettling one. Probably both.
