
AI AGENTS From Zero to Production in 35 Minutes - FULL TUTORIAL

All About AI · 5 min read

Based on All About AI's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Use Supabase as shared memory so agents can run on separate schedules without direct dependencies.

Briefing

A complete “autonomous finance brief” system is built from scratch: one scheduled agent pulls Bitcoin prices, another fetches macro and crypto-related news, and a final agent compiles both into a short, correlation-focused email (with a generated price chart) sent automatically every morning. The practical takeaway is that AI agents don’t need to be tightly coupled—each can run on its own schedule, write to a shared database, and let a downstream agent assemble the context.

The workflow starts with a simple data-ingestion script that calls the CoinGecko API to fetch the current Bitcoin price. After setting up a Python virtual environment and installing dependencies, the script writes each fetched price into a Supabase table (with fields for ID, price, and created_at). The tutorial then verifies the pipeline by rerunning the script and confirming new rows appear in the database.
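The exact script isn't reproduced here, but the ingestion step can be sketched with CoinGecko's public `simple/price` endpoint. The table and column names (`btc_prices`, `price`) are assumptions based on the fields described above, and the Supabase write is shown only as a comment since client setup is omitted:

```python
import json
import urllib.request

# CoinGecko's public endpoint for a spot price (no API key required)
COINGECKO_URL = (
    "https://api.coingecko.com/api/v3/simple/price"
    "?ids=bitcoin&vs_currencies=usd"
)

def parse_btc_price(payload: dict) -> float:
    # The simple/price response has the shape {"bitcoin": {"usd": 62000.5}}
    return float(payload["bitcoin"]["usd"])

def fetch_btc_price() -> float:
    with urllib.request.urlopen(COINGECKO_URL, timeout=10) as resp:
        return parse_btc_price(json.load(resp))

def store_price(price: float) -> None:
    """Persist one price row (sketch; Supabase client setup omitted)."""
    # Hypothetical write; table/column names follow the summary above:
    # supabase.table("btc_prices").insert({"price": price}).execute()
    pass
```

Supabase fills in `id` and `created_at` automatically when those columns use defaults, so the script only needs to send the price.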

Next comes the “information agent,” which uses OpenAI plus function calling to gain tool-like access to Brave Search. The agent is configured to search for finance-relevant news, then store results in a second Supabase table (Eco news) with a timestamp and a text field for the finance info. Early runs hit a failure that’s traced to a typo in the table name, after which the agent successfully stores multiple news items. The prompt is then adjusted to run two searches—one focused on Bitcoin/crypto and another on broader finance/macro—so the database accumulates a richer context set for later analysis.
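The summary doesn't show the tool definition itself, but a schema in the shape OpenAI's function calling expects might look like the following. The tool name `brave_search` and the query wording are illustrative guesses, not taken from the video:

```python
# Tool schema in the shape OpenAI's chat-completions "tools" parameter
# expects; the name "brave_search" and parameter layout are assumptions.
brave_search_tool = {
    "type": "function",
    "function": {
        "name": "brave_search",
        "description": "Search the web for recent finance and crypto news.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "The search query to send to Brave Search.",
                }
            },
            "required": ["query"],
        },
    },
}

def build_search_queries() -> list[str]:
    # Two searches, as described: one crypto-focused, one macro-focused
    return [
        "Bitcoin cryptocurrency news today",
        "macroeconomic finance news today",
    ]
```

When the model responds with a tool call, the agent executes the actual Brave Search HTTP request, feeds the results back as a tool message, and then writes the extracted items into the Supabase table.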

With both datasets accumulating independently, the final “email agent” pulls recent context from Supabase: the latest BTC prices and the most recent news entries. It sends that context to OpenAI to generate a highly concise professional email addressed to “Chris,” aiming to identify correlations between market-moving events and Bitcoin price movement. The email is delivered via the Mailgun API, and the tutorial confirms successful delivery by opening the received message.
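A rough sketch of how that context might be folded into a prompt, reusing the phrases quoted above; the Mailgun call is left as a comment because the domain and recipient address are placeholders:

```python
def build_email_prompt(prices: list[float], news_items: list[str]) -> str:
    # Fold the latest BTC prices and news rows into one analysis prompt
    price_lines = "\n".join(f"- ${p:,.2f}" for p in prices)
    news_lines = "\n".join(f"- {item}" for item in news_items)
    return (
        "Write a very short and concise email to Chris, in the voice of a "
        "highly educated finance professional, identifying correlations "
        "between the events below and Bitcoin price movement.\n\n"
        f"Recent BTC prices:\n{price_lines}\n\n"
        f"Recent news:\n{news_lines}"
    )

# Delivery sketch via Mailgun's HTTP messages API (domain and addresses
# are placeholders):
# requests.post(
#     "https://api.mailgun.net/v3/YOUR_DOMAIN/messages",
#     auth=("api", MAILGUN_API_KEY),
#     data={"from": "agent@YOUR_DOMAIN", "to": "chris@example.com",
#           "subject": "Daily BTC brief", "text": email_body},
# )
```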

To make the briefing more actionable, the system is extended to attach a chart. Using Matplotlib, the email agent generates a PNG line graph from recent stored Bitcoin prices and includes it as an attachment in the Mailgun email. The tutorial notes that the chart may be “not to scale” during testing, but the attachment mechanism works end-to-end.
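A minimal version of the chart step, assuming the prices have already been read out of Supabase; the Agg backend keeps Matplotlib headless so it runs on a scheduler job with no display:

```python
import io

import matplotlib
matplotlib.use("Agg")  # headless backend; no display on a scheduled dyno
import matplotlib.pyplot as plt

def render_price_chart(prices: list[float]) -> bytes:
    # Draw recent BTC prices as a line chart and return the PNG bytes,
    # ready to hand to Mailgun as a file attachment.
    fig, ax = plt.subplots(figsize=(6, 3))
    ax.plot(range(len(prices)), prices, marker="o")
    ax.set_title("Recent BTC price (USD)")
    ax.set_xlabel("Sample")
    ax.set_ylabel("Price (USD)")
    fig.tight_layout()
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    plt.close(fig)
    return buf.getvalue()
```

Returning bytes rather than writing a file avoids leaving artifacts on an ephemeral Heroku filesystem, though saving to a temp file and attaching that also works.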

Finally, the project is moved into production using Heroku. Dependencies are added to a requirements file, a runtime configuration is set, and secrets (API keys for Supabase, OpenAI, Brave, and Mailgun) are stored in Heroku config variables. The tutorial disables background workers and instead relies on Heroku Scheduler jobs: the Bitcoin price agent runs every 10 minutes, the news agent runs on its own cadence, and the email agent runs shortly after so it can compile the latest database context. After deployment, logs confirm searches and database writes, and the scheduled email arrives with both the generated text and the chart attachment—demonstrating a fully autonomous, database-driven agent pipeline suitable for real-world automation.


Cornell Notes

The system builds a scheduled, database-driven AI workflow for finance updates. One agent fetches Bitcoin prices from CoinGecko and stores them in Supabase; a second agent uses OpenAI function calling with Brave Search to collect macro/crypto news and stores it in a separate Supabase table. A third agent reads the latest rows from both tables, prompts OpenAI to produce a short correlation-focused email, and sends it via Mailgun. The email agent also generates a Matplotlib chart from recent BTC prices and attaches it as a PNG. Running everything on Heroku Scheduler keeps agents independent while still producing a coherent daily briefing.

How does the pipeline ensure each agent can run independently without breaking the final email?

Each agent writes to shared state in Supabase rather than calling other agents directly. The BTC agent inserts new rows into a BTC price table (ID, price, created_at). The info agent inserts news items into an Eco news table (ID, timestamp, finance info text). The email agent later queries both tables for the latest N entries (e.g., last five BTC prices and last ten news items) and uses that combined context to generate the email. This decoupling lets Heroku Scheduler trigger jobs on separate schedules.
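In Supabase this "latest N" read would be an order-by `created_at` descending with a limit; the same selection, sketched over rows that have already been fetched:

```python
def latest_rows(rows: list[dict], n: int) -> list[dict]:
    # Each row carries an ISO-8601 created_at timestamp (as in the tables
    # described above); ISO strings sort lexicographically, so a plain
    # reverse sort yields the n most recent entries first.
    return sorted(rows, key=lambda r: r["created_at"], reverse=True)[:n]
```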

What tool-using mechanism lets the news agent search the web reliably?

OpenAI function calling is used to connect the model to a Brave Search tool. The tutorial pulls Brave Search API documentation and defines a function that performs web searches, then passes results back into the model. The agent’s system prompt instructs it to run searches for finance/macro and Bitcoin/crypto news, then store the retrieved items into the Supabase Eco news table.

Why did the first attempt at storing news fail, and what fixed it?

The initial run stored nothing because the database table name was wrong—an “info” vs “in” style typo in the table reference. After correcting the table name, rerunning the info agent successfully stored multiple news items in Supabase, confirming the tool-to-database write path worked.

How does the email agent turn raw database rows into a short, correlation-focused message?

The email agent fetches recent BTC price rows and recent finance/news rows from Supabase, then embeds that context into an OpenAI prompt. The prompt instructs the model to produce a “highly educated professional” analysis that is “very short and concise,” specifically aiming to find correlations between the latest events and Bitcoin price movement. The output is then sent via the Mailgun API to the owner’s email address.

How is the chart attachment generated and delivered?

Matplotlib is used to create a PNG line chart from recent BTC prices stored in Supabase. The email agent then attaches the generated PNG to the Mailgun email. The tutorial verifies the attachment by opening the received email and confirming the chart appears as an attached image.

What production setup choices make the system run autonomously on Heroku?

Heroku Scheduler is used to run each agent on a timed cadence (BTC agent every 10 minutes; info agent and email agent on staggered schedules). Background workers are scaled down to zero so the system relies on scheduled jobs. Heroku config variables store secrets like API keys, and the app is deployed with a requirements file that includes dependencies such as requests, supabase, openai, and matplotlib.

Review Questions

  1. If the email agent runs before the news agent finishes its scheduled job, what data gaps could appear in the email—and how would you adjust scheduling or prompts to mitigate them?
  2. What changes would be needed to store not just news text but also structured fields (e.g., source, timestamp, sentiment) and then use those fields in the correlation step?
  3. How would you redesign the BTC ingestion to fetch historical prices (e.g., hourly candles) while keeping the email chart generation consistent?

Key Points

  1. Use Supabase as shared memory so agents can run on separate schedules without direct dependencies.

  2. Fetch Bitcoin prices via CoinGecko and persist them with created_at timestamps for time-series charting.

  3. Use OpenAI function calling to connect the news agent to Brave Search, then store results in a dedicated Supabase table.

  4. Generate the email by combining the latest BTC rows and the latest news rows into a single OpenAI prompt.

  5. Send emails through Mailgun and include a Matplotlib-generated PNG chart as an attachment.

  6. Deploy to Heroku and rely on Heroku Scheduler for autonomous execution; store all API keys in Heroku config variables.

Highlights

A three-agent architecture (price → news → email) works cleanly when each step writes to Supabase and the final step reads from both tables.
Function calling bridges the gap between an LLM and real-time web search by turning Brave Search into a tool the model can invoke.
Mailgun attachments let the system deliver not only text analysis but also a generated BTC price chart.