
First Block: Interview with Jesse Zhang, Co-Founder and CEO of Decagon

Notion · 5 min read

Based on Notion's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Decagon positions AOPs as natural-language, SOP-like instructions that make agent behavior transparent and easier to update than brittle conversation trees.

Briefing

Decagon’s core bet is that customer service automation can move from brittle, hand-built chatbot trees to reliable, business-ready AI by using “agent operating procedures” (AOPs)—natural-language instructions that teams can read, update, and test like standard operating procedures. The approach targets a recurring enterprise pain point: even when models can generate text, traditional automation systems often break under real-world edge cases and become expensive to maintain. By letting teams specify workflows in plain language, Decagon aims to make AI behavior transparent and easy to iterate on, so deployments can improve quickly without requiring constant technical re-engineering.

That design philosophy is tied to how Decagon avoids the fate many AI pilots face: stalling before production or failing to prove ROI. Jesse Zhang frames the company’s path as customer-first rather than narrative-driven. Early on, the prevailing industry assumption was that incumbents would eventually copy the idea and undercut new entrants; Decagon’s counter was to focus on customers from the ground up and treat the market as still unsolved—because strong customers willing to partner signal there’s real work left to do. In hindsight, the company’s production success is attributed to two practical advantages in customer support: impact can be measured with clear operational metrics like deflection rates and customer satisfaction, and AI can be “derisked” with a structural fallback to human escalation. That combination makes it easier to build a business case and reduces the pressure to get every interaction perfect before going live.

Decagon’s product performance is also linked to model capability and iteration speed. Zhang points to “Jedi models” as well-suited for conversational and transactional problem-solving, while AOPs enable faster iteration across complex workflows and edge cases—an advantage that shows up in reported outcomes such as 70–80% deflection and roughly 3x customer satisfaction improvements at customers including Duolingo, URA, ClassPass, and Notion.

The company’s growth story reflects a broader operating style shaped by earlier startup lessons. Zhang’s first company, Lowkey, went through YC and struggled for about two years to find a useful direction before landing on video capture for games, which led to acquisition by Niantic in 2021. The key takeaway was not just talking to customers, but getting better at it—turning ideation into a systematic, customer-validated process.

On enterprise trust and scaling, Zhang credits short-term execution early: rapidly deploying to large customers, building a team that could ship, and letting the product “mold” around real usage. As the customer base solidified, the company shifted toward longer-term infrastructure work. Hiring strategy followed a similar logic: early generalists from trusted networks, then more specialization as the company grew. Today the team is nearing 200 people, up from roughly a dozen about a year and a half earlier.

For founders, Zhang’s advice is to avoid overindexing on generic scaling playbooks and instead focus on personal strengths, timing, and product fit. Rebuilding Decagon from scratch would mainly mean hiring key roles—especially marketing—earlier, since waiting until demand is obvious can cost months of recruiting and ramp time. Overall, Decagon’s story ties together measurable ROI, maintainable agent design, and disciplined execution as the path from pilot to production.

Cornell Notes

Decagon builds AI customer service agents that can handle conversations, look up data, reason through steps, and guide users through workflows. Its central mechanism is “agent operating procedures” (AOPs): natural-language instructions that make agent behavior transparent, easier to update, and faster to iterate than brittle, tree-based automation. Zhang links production success to two enterprise realities—customer service metrics like deflection and satisfaction are easy to quantify, and human escalation provides a structural fallback that reduces risk. The company also credits speed in early execution: deploying quickly to large customers and letting real usage shape the product. Over time, Decagon has shifted from short-term shipping to longer-term infrastructure and more specialized hiring.

Why does Decagon treat AOPs as more than just a prompt-writing trick?

AOPs are positioned as a maintainable system for agent behavior. Traditional automation often becomes brittle because it relies on large, hand-built conversation trees and expensive technical maintenance when workflows change. AOPs replace that with natural-language “SOP-like” instructions that teams can read and modify. That transparency matters operationally: new team members can understand how the AI is configured, run experiments, and update behavior without redesigning an entire node graph.

What makes customer support a uniquely workable use case for AI agents compared with other domains?

Two pillars show up repeatedly. First, impact is measurable: teams can track deflection rates and customer satisfaction, which makes ROI easier to quantify and communicate. Second, there’s a built-in risk reducer—structural fallback to human escalation. That means the system can start handling parts of the workload and escalate the rest, rather than requiring near-perfect performance on every interaction before going live.

How does Zhang connect Decagon’s success to avoiding the “AI pilots fail” pattern?

Rather than designing around failure narratives, Decagon focuses on customer-first direction and measurable outcomes. Zhang points to the distraction caused by broad industry assumptions (e.g., incumbents will copy and crush new entrants). Decagon’s counter is to work with customers who still need help, then use quantifiable metrics and human escalation to move pilots into production with a credible business case.

What role do models and iteration speed play in the reported performance gains?

Zhang attributes results to both model capability and the iteration loop. “Jedi models” are described as strong at conversation and transactional problem-solving. Meanwhile, AOPs let teams iterate quickly across complex workflows and edge cases, which accelerates improvements in operational stats like deflection and satisfaction.

How did the Lowkey experience shape the approach to ideation and customer discovery at Decagon?

Lowkey’s two-year ideation grind ended with the realization that the initial direction wasn’t useful, followed by a lucky pivot to video capture for games and eventual acquisition by Niantic in 2021. The reflection emphasized being more intentional and systematic during ideation and improving customer conversations—talking to customers is necessary, but doing it well is what changes outcomes.

What hiring philosophy supports Decagon’s scaling from early stage to enterprise deployment?

Early hiring leans toward generalists, often from trusted networks, because competition for talent is intense and early teams are vulnerable to adverse selection. As the company grows, roles become more specialized, and the team shifts toward staff engineers and newer grads for specific needs. Zhang also highlights the importance of speed: early hires and team structure were optimized for deploying to large customers quickly, then evolving infrastructure and functions later.

Review Questions

  1. How do AOPs change the maintenance and update process compared with tree-based chatbot systems?
  2. Which two factors make ROI easier to prove in customer service, and how do they reduce deployment risk?
  3. What does Zhang say founders should do differently when choosing ideas and building go-to-market plans?

Key Points

  1. Decagon positions AOPs as natural-language, SOP-like instructions that make agent behavior transparent and easier to update than brittle conversation trees.
  2. Customer support deployments are derisked through measurable metrics (deflection rate and customer satisfaction) and a structural fallback to human escalation.
  3. Decagon’s path to production success is linked to customer-first direction rather than industry narratives about incumbents copying the idea.
  4. Reported outcomes at customers are attributed to both model suitability for conversational/transactional tasks and faster iteration enabled by AOPs.
  5. Zhang’s earlier startup experience (Lowkey) reinforced the need for a more systematic, intentional ideation phase and better customer conversations.
  6. Enterprise trust and scaling early on were driven by short-term execution: rapid deployment to large customers and building the product around real usage.
  7. Hiring strategy evolves from early generalists hired through trusted networks to later specialization as the company grows.

Highlights

AOPs aim to replace brittle, expensive-to-maintain chatbot trees with natural-language procedures teams can read, update, and test.
Customer service is framed as uniquely measurable: deflection and satisfaction metrics make ROI easier to build and defend.
Human escalation is treated as a structural fallback that lets AI handle parts of the workload without requiring perfection upfront.
Decagon’s performance gains are credited to both “Jedi models” and the ability to iterate quickly across complex workflows using AOPs.
Zhang’s founder advice centers on resisting generic scaling playbooks and optimizing for personal strengths, timing, and product fit.

Topics

  • AI Customer Service Agents
  • Agent Operating Procedures (AOPs)
  • Pilot to Production ROI
  • Enterprise Deployment
  • Hiring and Team Scaling
