Maintaining a codebase with AI | The Standup
Based on The PrimeTime's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Briefing
Cloudflare’s “Vex” (V Next) is built to make Next.js easier to deploy on Cloudflare by matching Next’s API surface while swapping in a Cloudflare-friendly runtime. The project’s momentum is tied directly to AI-assisted maintenance: AI bots triage and review pull requests, run security checks, and even track changes in the upstream Next.js repo to open issues when relevant commits land. That combination—strict compatibility plus AI-driven upkeep—has helped the team keep an open-source codebase moving without burning out maintainers.
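The upstream-tracking bots described above presumably need some rule for deciding which Next.js commits warrant an issue. The sketch below is a hypothetical illustration of that filtering step only; the watched paths, the commit shape, and the function name are all assumptions, not the team's actual implementation.

```typescript
// Hypothetical sketch: decide which upstream Next.js commits are
// "relevant" enough for a bot to open a tracking issue.
// The watched paths below are assumptions for illustration.
interface UpstreamCommit {
  sha: string;
  message: string;
  changedFiles: string[];
}

// Directories whose changes could plausibly affect a compatibility
// layer (assumed, not the team's real list).
const WATCHED_PATHS = [
  "packages/next/src/server/",
  "packages/next/src/client/components/",
  "packages/next/src/build/",
];

function relevantCommits(commits: UpstreamCommit[]): UpstreamCommit[] {
  // Keep a commit if any changed file falls under a watched path.
  return commits.filter((c) =>
    c.changedFiles.some((f) => WATCHED_PATHS.some((p) => f.startsWith(p)))
  );
}
```

A real bot would feed this from the GitHub commits API on a schedule and open one issue per relevant commit; the filtering shown here is the only part sketched.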
The origin story traces back to a long-running customer request: make Next.js deployments work smoothly on Cloudflare’s architecture, which differs from traditional hosting. An initial intern effort focused on the “pages router” surface area and produced a working baseline, but the work was paused because maintaining the full compatibility layer would require sustained effort. The project later restarted when a new internal push added more automation and, crucially, leaned into AI to handle the ongoing review and security burden.
A key decision was to avoid turning the effort into a divergent “fork” of Next.js. The team positions Vex as matching Next’s API surface one-for-one, rejecting feature requests that would break that contract. Even when community demand appears (such as calls to restore deprecated Next features like “getInitialProps”), the default stance is to hold the line on current Next behavior; the discussion referenced keeping alignment with Next 16. The team frames forking as healthy in general, but argues the cost and complexity of maintaining a radically different surface area isn’t worth it for this product.
The conversation also highlights where compatibility gets tricky: community packages sometimes rely on undocumented Next.js internals, and those integrations can break when moved to Vex. The team doesn’t support importing from Next’s internal distribution paths (though the stance is framed as “never say never”), and the practical takeaway is that compatibility isn’t just about documented APIs: it’s also about how third-party tooling hooks into the framework.
On the “AI maintenance” side, the team credits a layered testing strategy: ported Next.js tests, unit and end-to-end tests, plus smoke tests that run against production deployments. Still, AI introduces its own failure modes. One example described “slop” from a large AI-generated template string that became unmaintainable, requiring a human-led cleanup sprint to modularize the logic. Another recurring theme: even with automation, maintainers can feel like “glorified AI babysitters,” juggling multiple workspaces and stepping in when agents drift down weird paths.
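The production smoke tests mentioned above amount to fetching deployed routes and checking the responses. A minimal sketch of that pass/fail check is below; the routes, status expectations, and content-type heuristic are assumptions, and only the validation step is shown (the actual fetch is omitted).

```typescript
// Hypothetical sketch of a production smoke-test check: given the
// observed status and content type for a route, return a list of
// failure messages (empty list means the route passed).
interface SmokeResult {
  route: string;
  status: number;
  contentType: string;
}

function smokeCheck(r: SmokeResult): string[] {
  const failures: string[] = [];
  // Assumed expectation: pages respond 200.
  if (r.status !== 200) {
    failures.push(`${r.route}: expected 200, got ${r.status}`);
  }
  // Assumed expectation: pages render as HTML.
  if (!r.contentType.includes("text/html")) {
    failures.push(`${r.route}: unexpected content-type ${r.contentType}`);
  }
  return failures;
}
```

In practice each `SmokeResult` would come from a `fetch` against the live deployment; returning messages instead of throwing lets one run report every broken route at once.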
Guardrails are therefore as much process as tooling. The team runs scheduled agent updates that review PRs and comments, uses linting and formatting to keep diffs sane, and relies on an internal “engineering codex”: an RFC-driven set of “ten commandments” (since expanded) that an AI reviewer checks against, including rules like “no long HTML strings.” The broader claim is pragmatic: AI can accelerate maintenance and reduce tech debt, but humans still need to enforce architectural judgment.
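A rule like “no long HTML strings” could be enforced mechanically as well as by an AI reviewer. The sketch below is a naive, hypothetical version of such a check; the length threshold, the regexes, and the function name are assumptions, and the template-literal matching deliberately ignores nesting and escapes.

```typescript
// Hypothetical lint-style check for a "no long HTML strings" rule:
// flag template literals that embed large chunks of HTML.
const MAX_HTML_STRING = 200; // character threshold (assumed limit)

function flagLongHtmlStrings(source: string): string[] {
  const findings: string[] = [];
  // Naive match for backtick template literals (ignores nesting/escapes).
  const literals = source.match(/`[^`]*`/g) ?? [];
  for (const lit of literals) {
    // Flag only literals that are both long and contain an HTML tag.
    if (lit.length > MAX_HTML_STRING && /<\w+[^>]*>/.test(lit)) {
      findings.push(`template literal of ${lit.length} chars contains HTML`);
    }
  }
  return findings;
}
```

A production version would work on an AST rather than regexes, but even a crude check like this turns a codex rule into something CI can enforce before a reviewer, human or AI, sees the diff.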
Finally, the reception is described as strongly positive, driven less by Twitter discourse and more by usage metrics: a major spike in new users and a steady stream of merged PRs and active contributors. The team argues that Vex’s appeal is also a reflection of Vite’s strength—most frameworks already standardize on Vite, while Next’s historical choice of its own runtime left a “pent-up” desire for a simpler path. The result is an experiment that’s moving toward production readiness, with the next big milestone framed as “proper pre-rendering for everything,” plus ongoing work on smaller behavioral mismatches like navigation semantics and rendering details.
Cornell Notes
Cloudflare’s Vex aims to run Next.js on Cloudflare by keeping Next’s API surface intact while changing the underlying runtime. The project’s sustainability hinges on AI-assisted maintenance: bots triage and review PRs, perform security checks, and monitor upstream Next.js changes to open issues. Compatibility is enforced by refusing requests that would diverge from Next’s behavior, though community packages that depend on undocumented Next internals can still break. Despite strong test coverage and guardrails, AI can still generate “slop” (like unmaintainable template strings), requiring human cleanup and architectural judgment. The team frames Vex as a practical, open-source experiment that’s gaining traction through real usage and contributor activity.
- What is Vex trying to achieve, and why does Cloudflare’s architecture make that non-trivial?
- Why did the project pause after an intern prototype, and what changed later?
- How does Vex avoid becoming a divergent fork of Next.js?
- Where do compatibility problems most often show up?
- What guardrails keep AI-driven development from degrading code quality?
- What does “moving from experiment to production” require next?
Review Questions
- What mechanisms does Vex use to keep compatibility with Next.js while still changing the runtime?
- Describe two ways AI assistance can improve maintenance and two ways it can create new risks that require human intervention.
- Why might a third-party package work on Vercel but fail on Vex, even if it targets Next.js?
Key Points
1. Vex is built to run Next.js on Cloudflare while keeping Next’s API surface aligned, using a different runtime underneath.
2. AI is central to long-term maintenance: bots handle PR triage/review, security checks, and upstream Next.js change tracking.
3. The project stays “hard on same surface area,” generally rejecting changes that would diverge from Next’s documented/current behavior.
4. Compatibility issues often arise from third-party packages relying on undocumented Next.js internals or internal import paths.
5. Test coverage is layered: ported Next.js tests, unit and end-to-end tests, plus production smoke tests.
6. AI can still generate unmaintainable code patterns (e.g., huge template strings), so humans must enforce architectural judgment and refactor when needed.
7. The next production-readiness milestone is proper pre-rendering for everything, along with resolving smaller behavioral mismatches (like navigation semantics).