
Maintaining a codebase with AI | The Standup

The PrimeTime · 6 min read

Based on The PrimeTime's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Vex is built to run Next.js on Cloudflare while keeping Next’s API surface aligned, using a different runtime underneath.

Briefing

Cloudflare’s “Vex” (V Next) is built to make Next.js easier to deploy on Cloudflare by matching Next’s API surface while swapping in a Cloudflare-friendly runtime. The project’s momentum is tied directly to AI-assisted maintenance: AI bots triage and review pull requests, run security checks, and even track changes in the upstream Next.js repo to open issues when relevant commits land. That combination—strict compatibility plus AI-driven upkeep—has helped the team keep an open-source codebase moving without burning out maintainers.

The origin story traces back to a long-running customer request: make Next.js deployments work smoothly on Cloudflare’s architecture, which differs from traditional hosting. An initial intern effort focused on the “pages router” surface area and produced a working baseline, but the work was paused because maintaining the full compatibility layer would require sustained effort. The project later restarted when a new internal push added more automation and, crucially, leaned into AI to handle the ongoing review and security burden.

A key decision was to avoid turning the effort into a divergent “fork” of Next.js. The team positions Vex as an API service that follows Next.js one-for-one, rejecting feature requests that would break that contract. Even when community demand appears—such as calls to restore deprecated Next features like “getInitialProps”—the default stance is to hold the line on current Next behavior (the discussion referenced keeping alignment with Next 16). The team frames “forking” as healthy in general, but argues the cost and complexity of maintaining a radically different surface area isn’t worth it for this product.

The conversation also highlights where compatibility gets tricky: community packages sometimes rely on undocumented Next.js internals, and those integrations can break when moved to Vex. The team doesn't currently support importing from Next's internal distribution paths (though it says "never say never"), but the practical takeaway is that compatibility isn't just about documented APIs; it's also about how third-party tooling hooks into the framework.

On the “AI maintenance” side, the team credits a layered testing strategy: ported Next.js tests, unit and end-to-end tests, plus smoke tests that run against production deployments. Still, AI introduces its own failure modes. One example described “slop” from a large AI-generated template string that became unmaintainable, requiring a human-led cleanup sprint to modularize the logic. Another recurring theme: even with automation, maintainers can feel like “glorified AI babysitters,” juggling multiple workspaces and stepping in when agents drift down weird paths.

Guardrails are therefore as much process as tooling. The team runs scheduled agent updates that review PRs and comments, uses linting/formatting to keep diffs sane, and relies on an internal "engineering codex": an RFC-driven set of "ten commandments" (since expanded) that an AI reviewer checks submissions against, including rules like "no long HTML strings." The broader claim is pragmatic: AI can accelerate maintenance and reduce tech debt, but humans still need to enforce architectural judgment.

Finally, the reception is described as strongly positive, driven less by Twitter discourse and more by usage metrics: a major spike in new users and a steady stream of merged PRs and active contributors. The team argues that Vex’s appeal is also a reflection of Vite’s strength—most frameworks already standardize on Vite, while Next’s historical choice of its own runtime left a “pent-up” desire for a simpler path. The result is an experiment that’s moving toward production readiness, with the next big milestone framed as “proper pre-rendering for everything,” plus ongoing work on smaller behavioral mismatches like navigation semantics and rendering details.

Cornell Notes

Cloudflare’s Vex aims to run Next.js on Cloudflare by keeping Next’s API surface intact while changing the underlying runtime. The project’s sustainability hinges on AI-assisted maintenance: bots triage and review PRs, perform security checks, and monitor upstream Next.js changes to open issues. Compatibility is enforced by refusing requests that would diverge from Next’s behavior, though community packages that depend on undocumented Next internals can still break. Despite strong test coverage and guardrails, AI can still generate “slop” (like unmaintainable template strings), requiring human cleanup and architectural judgment. The team frames Vex as a practical, open-source experiment that’s gaining traction through real usage and contributor activity.

What is Vex trying to achieve, and why does Cloudflare’s architecture make that non-trivial?

Vex is designed to make Next.js easier to deploy on Cloudflare by matching Next.js’s API surface while running on a different runtime suited to Cloudflare’s constraints. The team highlights that Cloudflare’s “region everywhere deploy” approach changes trade-offs compared with traditional environments, so Next’s assumptions don’t always map cleanly. That mismatch is the core problem Vex targets: preserve Next’s developer experience while adapting the runtime to work well on Cloudflare.

Why did the project pause after an intern prototype, and what changed later?

The intern work initially focused on the pages router and produced a baseline that “kind of worked,” but the team shelved it because full compatibility would require ongoing maintenance investment. Later, the restart leaned on AI to make that maintenance feasible: AI bots handle triage, PR review, security review, and upstream tracking (opening issues when relevant Next.js commits appear). AI is presented as the mechanism that makes long-term upkeep sustainable.
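The upstream-tracking bot described above can be sketched as a filtering step: fetch recent upstream commits, keep the ones that touch watched paths, and open a tracking issue for each. Everything here (the watched paths, the type shape, the issue-title format) is a hypothetical illustration, not the team's actual implementation:

```typescript
// Hypothetical sketch of an upstream-tracking bot's core filter. Watched
// paths, types, and the issue-title format are illustrative assumptions.
interface UpstreamCommit {
  sha: string;
  message: string;
  changedFiles: string[];
}

// Paths in the Next.js repo that plausibly affect a compatibility layer.
const WATCHED_PATHS = [
  "packages/next/src/server/",
  "packages/next/src/client/",
];

function isRelevant(commit: UpstreamCommit): boolean {
  return commit.changedFiles.some((file) =>
    WATCHED_PATHS.some((prefix) => file.startsWith(prefix)),
  );
}

// A scheduled job would fetch recent commits (e.g., via the GitHub API),
// filter them, and open one tracking issue per relevant commit.
function issuesToOpen(commits: UpstreamCommit[]): string[] {
  return commits
    .filter(isRelevant)
    .map((c) => `Track upstream change ${c.sha.slice(0, 7)}: ${c.message}`);
}
```

The interesting design point is that relevance filtering is a pure function, so it can be unit-tested independently of the GitHub API calls around it.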

How does Vex avoid becoming a divergent fork of Next.js?

Vex is positioned as an API service that follows Next.js "one for one," not a new feature set. The team holds a hard line on maintaining the same surface area and generally won't accept PRs or feature requests that fall outside the official Next API contract. Even when users ask for deprecated behavior (the discussion referenced the deprecated "getInitialProps"), the default is to stay aligned with current Next behavior (notably keeping alignment with Next 16).

Where do compatibility problems most often show up?

The biggest friction comes from community packages that hook into Next.js internals, sometimes knowingly, sometimes unknowingly. The transcript mentions trouble around importing from Next's internal distribution paths (e.g., "next/dist" imports, which tend to cause issues). Vex doesn't yet support those internal imports, and the team treats undocumented internal dependencies as a common source of breakage.
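As a toy illustration of why such imports are fragile, a compatibility layer could flag them statically. The entry-point list and heuristic below are assumptions made for the sketch, not Vex's actual tooling:

```typescript
// Hypothetical heuristic: flag import specifiers that reach into Next.js
// internals. "next/dist/**" is the compiled, undocumented layout of the
// published package; a reimplemented runtime may not provide those files.
const PUBLIC_NEXT_ENTRYPOINTS = new Set([
  "next",
  "next/link",
  "next/router",
  "next/navigation",
  "next/image",
  "next/head",
  "next/server",
  "next/dynamic",
]); // illustrative subset, not the full documented surface

function isInternalNextImport(specifier: string): boolean {
  if (specifier.startsWith("next/dist/")) return true; // definitely internal
  // Anything else under next/ that is not a known entry point is suspect.
  return specifier.startsWith("next/") && !PUBLIC_NEXT_ENTRYPOINTS.has(specifier);
}
```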

What guardrails keep AI-driven development from degrading code quality?

Guardrails combine tests and process. The team ported a large suite of Next.js tests (plus unit and end-to-end tests) and added smoke tests against production deployments. It also uses linting/formatting to keep diffs manageable and runs scheduled AI agent processes that review PRs and comments to prevent repeated mistakes. A concrete failure mode was described: AI generated a ~2,000-line template string with too much logic, which a human then refactored into modules during a cleanup sprint.
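A rule like "no long HTML strings" could be enforced mechanically. A real implementation would likely be an AST-based lint rule (e.g., ESLint); the regex scan below is a deliberately naive stand-in that ignores nested and escaped backticks, with the 50-line threshold chosen arbitrarily:

```typescript
// Toy lint check: report the starting line of any template literal whose
// body spans more than maxLines lines. Naive regex scan, illustration only.
function findLongTemplateLiterals(source: string, maxLines = 50): number[] {
  const offenders: number[] = [];
  const re = /`[^`]*`/g; // does not handle escaped or nested backticks
  let match: RegExpExecArray | null;
  while ((match = re.exec(source)) !== null) {
    const lineCount = match[0].split("\n").length;
    if (lineCount > maxLines) {
      // 1-based line number where the offending literal starts.
      offenders.push(source.slice(0, match.index).split("\n").length);
    }
  }
  return offenders;
}
```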

What does “moving from experiment to production” require next?

The team’s near-term milestone is “proper pre-rendering for everything,” including accommodating people who don’t want partial pre-rendering. It also involves resolving smaller behavioral mismatches where Next and Vex don’t map cleanly; examples included navigation semantics (hard navigation vs Next’s soft navigation) and other routing/hydration/rendering details. The overall message is that most core routing/hydration/SSR works, but pre-rendering and edge behaviors are the remaining gaps.

Review Questions

  1. What mechanisms does Vex use to keep compatibility with Next.js while still changing the runtime?
  2. Describe two ways AI assistance can improve maintenance and two ways it can create new risks that require human intervention.
  3. Why might a third-party package work on Vercel but fail on Vex, even if it targets Next.js?

Key Points

  1. Vex is built to run Next.js on Cloudflare while keeping Next’s API surface aligned, using a different runtime underneath.

  2. AI is central to long-term maintenance: bots handle PR triage/review, security checks, and upstream Next.js change tracking.

  3. The project stays “hard on the same surface area,” generally rejecting changes that would diverge from Next’s documented/current behavior.

  4. Compatibility issues often arise from third-party packages relying on undocumented Next.js internals or internal import paths.

  5. Test coverage is layered: ported Next.js tests, unit and end-to-end tests, plus production smoke tests.

  6. AI can still generate unmaintainable code patterns (e.g., huge template strings), so humans must enforce architectural judgment and refactor when needed.

  7. The next production-readiness milestone is proper pre-rendering for everything, along with resolving smaller behavioral mismatches (like navigation semantics).

Highlights

Vex’s sustainability claim is unusual: AI bots don’t just write code—they triage, review, and security-check PRs, plus monitor upstream Next.js changes to open issues.
The team’s compatibility stance is strict: Vex aims to match Next.js one-for-one and resists requests to reintroduce deprecated Next behaviors.
Undocumented Next.js internals are a recurring failure point—community packages that depend on them can break when moved to Vex.
Even with automation, maintainers still step in for “slop” cases, including a described cleanup of a ~2,000-line AI-generated template string.
The biggest remaining gap toward production readiness is framed as proper pre-rendering for everything, not just routing or hydration.