AI AGENTS From Zero to Production in 35 Minutes - FULL TUTORIAL
Based on All About AI's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Briefing
A complete “autonomous finance brief” system is built from scratch: one scheduled agent pulls Bitcoin prices, another fetches macro and crypto-related news, and a final agent compiles both into a short, correlation-focused email (with a generated price chart) sent automatically every morning. The practical takeaway is that AI agents don’t need to be tightly coupled—each can run on its own schedule, write to a shared database, and let a downstream agent assemble the context.
The workflow starts with a simple data-ingestion script that calls the CoinGecko API to fetch the current Bitcoin price. After setting up a Python virtual environment and installing dependencies, the script writes each fetched price into a Supabase table (with fields for ID, price, and created_at). The tutorial then verifies the pipeline by rerunning the script and confirming new rows appear in the database.
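The ingestion step described above can be sketched as follows. This is a minimal sketch, not the video's exact code: the table name "btc_prices" is an assumption standing in for the tutorial's schema, and the write goes through Supabase's PostgREST interface (the tutorial uses the supabase-py client, but the raw REST call below is equivalent).

```python
# Sketch of the Bitcoin price-ingestion agent.
# Assumptions: a Supabase table "btc_prices" with a "price" column,
# and credentials in SUPABASE_URL / SUPABASE_KEY environment variables.
import os

import requests

COINGECKO_URL = "https://api.coingecko.com/api/v3/simple/price"


def parse_price(payload: dict) -> float:
    """Extract the USD price from CoinGecko's simple/price response."""
    return float(payload["bitcoin"]["usd"])


def fetch_btc_price() -> float:
    """Fetch the current BTC/USD price from CoinGecko's public API."""
    resp = requests.get(
        COINGECKO_URL,
        params={"ids": "bitcoin", "vs_currencies": "usd"},
        timeout=10,
    )
    resp.raise_for_status()
    return parse_price(resp.json())


def store_price(price: float) -> None:
    """Insert one row; Supabase fills the id and created_at defaults."""
    url = f"{os.environ['SUPABASE_URL']}/rest/v1/btc_prices"
    headers = {
        "apikey": os.environ["SUPABASE_KEY"],
        "Authorization": f"Bearer {os.environ['SUPABASE_KEY']}",
    }
    resp = requests.post(url, json={"price": price}, headers=headers, timeout=10)
    resp.raise_for_status()


if __name__ == "__main__":
    store_price(fetch_btc_price())
```

Because `created_at` defaults to the insert time on the database side, rerunning the script is all it takes to verify the pipeline: each run appends one timestamped row.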
Next comes the “information agent,” which uses OpenAI plus function calling to gain tool-like access to Brave Search. The agent is configured to search for finance-relevant news, then store results in a second Supabase table (Eco news) with a timestamp and a text field for the finance info. Early runs hit a failure that’s traced to a typo in the table name, after which the agent successfully stores multiple news items. The prompt is then adjusted to run two searches—one focused on Bitcoin/crypto and another on broader finance/macro—so the database accumulates a richer context set for later analysis.
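The function-calling wiring can be sketched like this. The model name, prompt, and helper names are illustrative assumptions; only the overall shape (a tool schema advertised to OpenAI, a local Brave Search call executed when the model requests it, and a Supabase write) mirrors the tutorial.

```python
# Sketch of the news agent: Brave Search exposed to OpenAI via function calling.
# Assumptions: BRAVE_API_KEY is set, and results land in the news table
# (the video names it "Eco news") via the same pattern as the price agent.
import json
import os

import requests

BRAVE_URL = "https://api.search.brave.com/res/v1/web/search"

# Tool schema the model sees; when it "calls" brave_search, we run it locally.
SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "brave_search",
        "description": "Search the web for recent finance and crypto news.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}


def format_results(results: list[dict]) -> str:
    """Flatten Brave web results into one text blob for the news table."""
    return "\n".join(f"{r['title']}: {r.get('description', '')}" for r in results)


def brave_search(query: str) -> str:
    """Run one Brave web search and return titles plus snippets as text."""
    resp = requests.get(
        BRAVE_URL,
        params={"q": query, "count": 5},
        headers={"X-Subscription-Token": os.environ["BRAVE_API_KEY"]},
        timeout=10,
    )
    resp.raise_for_status()
    return format_results(resp.json().get("web", {}).get("results", []))


def run_news_agent() -> None:
    from openai import OpenAI  # deferred so the helpers above work standalone

    client = OpenAI()
    messages = [{
        "role": "user",
        "content": "Run two searches: one on Bitcoin/crypto news, "
                   "one on broader finance/macro news.",
    }]
    reply = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=[SEARCH_TOOL]
    ).choices[0].message
    for call in reply.tool_calls or []:
        news_text = brave_search(json.loads(call.function.arguments)["query"])
        # Next step (omitted here): insert {"finance_info": news_text} with a
        # timestamp into the Supabase news table, as in the price agent.
```

The two-search prompt is what makes the database accumulate both crypto-specific and macro context for the later correlation step.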
With both datasets accumulating independently, the final “email agent” pulls recent context from Supabase: the latest BTC prices and the most recent news entries. It sends that context to OpenAI to generate a highly concise professional email addressed to “Chris,” aiming to identify correlations between market-moving events and Bitcoin price movement. The email is delivered via the Mailgun API, and the tutorial confirms successful delivery by opening the received message.
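A sketch of that compile-and-send step, assuming the same environment-variable conventions as above; the model, table field names, and the placeholder recipient address are illustrative, not the video's exact values.

```python
# Sketch of the email agent: merge recent rows into one prompt, ask OpenAI
# for a concise correlation-focused brief, and deliver it via Mailgun.
# Assumptions: MAILGUN_DOMAIN / MAILGUN_API_KEY are set; recipient is a placeholder.
import os

import requests


def build_context(prices: list[dict], news: list[dict]) -> str:
    """Merge the latest BTC rows and news rows into one prompt context."""
    price_lines = "\n".join(f"{p['created_at']}: ${p['price']}" for p in prices)
    news_lines = "\n".join(n["finance_info"] for n in news)
    return f"Recent BTC prices:\n{price_lines}\n\nRecent news:\n{news_lines}"


def write_brief(context: str) -> str:
    """Ask OpenAI for the short correlation-focused email body."""
    from openai import OpenAI  # deferred so build_context stays standalone

    reply = OpenAI().chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Write a highly concise professional email to Chris "
                       "identifying correlations between these market-moving "
                       f"events and Bitcoin price movement:\n\n{context}",
        }],
    )
    return reply.choices[0].message.content


def send_email(body: str) -> None:
    """Deliver the brief through Mailgun's messages endpoint."""
    domain = os.environ["MAILGUN_DOMAIN"]
    resp = requests.post(
        f"https://api.mailgun.net/v3/{domain}/messages",
        auth=("api", os.environ["MAILGUN_API_KEY"]),
        data={
            "from": f"Finance Brief <brief@{domain}>",
            "to": "chris@example.com",  # placeholder recipient
            "subject": "Morning Bitcoin Brief",
            "text": body,
        },
        timeout=10,
    )
    resp.raise_for_status()
```

Keeping `build_context` separate from the API calls makes the prompt assembly easy to inspect when the email content looks off.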
To make the briefing more actionable, the system is extended to attach a chart. Using Matplotlib, the email agent generates a PNG line graph from recent stored Bitcoin prices and includes it as an attachment in the Mailgun email. The tutorial notes that the chart may be “not to scale” during testing, but the attachment mechanism works end-to-end.
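The chart step can be sketched as a small Matplotlib helper; the function name, file path, and axis labels are assumptions, and the Mailgun attachment pattern is noted in the trailing comment.

```python
# Sketch of the chart step: render recent stored BTC prices to a PNG that the
# Mailgun request can attach. Names and labels here are illustrative.
import matplotlib

matplotlib.use("Agg")  # headless backend for scheduled, serverless jobs
import matplotlib.pyplot as plt


def make_chart(prices: list[float], path: str = "btc_chart.png") -> str:
    """Plot prices in fetch order and save as a PNG; return the file path."""
    fig, ax = plt.subplots(figsize=(8, 4))
    ax.plot(range(len(prices)), prices, marker="o")
    ax.set_title("Bitcoin Price (recent fetches)")
    ax.set_xlabel("Fetch #")
    ax.set_ylabel("USD")
    fig.savefig(path, dpi=150, bbox_inches="tight")
    plt.close(fig)
    return path


# To attach in the Mailgun POST, pass the file alongside the form data:
#   files=[("attachment", ("btc_chart.png", open(path, "rb").read()))]
```

Plotting against fetch index rather than real timestamps is the likely reason a test chart can look "not to scale"; switching the x-axis to `created_at` values fixes that.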
Finally, the project is moved into production using Heroku. Dependencies are added to a requirements file, a runtime configuration is set, and secrets (API keys for Supabase, OpenAI, Brave, and Mailgun) are stored in Heroku config variables. The tutorial disables background workers and instead relies on Heroku Scheduler jobs: the Bitcoin price agent runs every 10 minutes, the news agent runs on its own cadence, and the email agent runs shortly after so it can compile the latest database context. After deployment, logs confirm searches and database writes, and the scheduled email arrives with both the generated text and the chart attachment—demonstrating a fully autonomous, database-driven agent pipeline suitable for real-world automation.
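The deployment steps above amount to roughly the following; the app name, variable values, and script filenames are placeholders, and the cadences follow the schedule described in the tutorial.

```shell
# Sketch of the Heroku setup; app name and values are illustrative.
# requirements.txt should pin the agents' dependencies
# (e.g. requests, supabase, openai, matplotlib).

heroku create finance-brief-agents

# Store all secrets as config vars instead of committing them:
heroku config:set SUPABASE_URL=... SUPABASE_KEY=... \
  OPENAI_API_KEY=... BRAVE_API_KEY=... \
  MAILGUN_DOMAIN=... MAILGUN_API_KEY=...

git push heroku main

# No always-on worker dyno; Heroku Scheduler triggers each agent instead:
heroku ps:scale worker=0
heroku addons:create scheduler:standard
# Then, in the Scheduler dashboard, add one job per agent, e.g.:
#   every 10 min           : python btc_price_agent.py
#   its own cadence        : python news_agent.py
#   daily, shortly after   : python email_agent.py
```

Running the email agent last in the schedule is what guarantees it compiles the freshest database context.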
Cornell Notes
The system builds a scheduled, database-driven AI workflow for finance updates. One agent fetches Bitcoin prices from CoinGecko and stores them in Supabase; a second agent uses OpenAI function calling with Brave Search to collect macro/crypto news and stores it in a separate Supabase table. A third agent reads the latest rows from both tables, prompts OpenAI to produce a short correlation-focused email, and sends it via Mailgun. The email agent also generates a Matplotlib chart from recent BTC prices and attaches it as a PNG. Running everything on Heroku Scheduler keeps agents independent while still producing a coherent daily briefing.
- How does the pipeline ensure each agent can run independently without breaking the final email?
- What tool-using mechanism lets the news agent search the web reliably?
- Why did the first attempt at storing news fail, and what fixed it?
- How does the email agent turn raw database rows into a short, correlation-focused message?
- How is the chart attachment generated and delivered?
- What production setup choices make the system run autonomously on Heroku?
Review Questions
- If the email agent runs before the news agent finishes its scheduled job, what data gaps could appear in the email—and how would you adjust scheduling or prompts to mitigate them?
- What changes would be needed to store not just news text but also structured fields (e.g., source, timestamp, sentiment) and then use those fields in the correlation step?
- How would you redesign the BTC ingestion to fetch historical prices (e.g., hourly candles) while keeping the email chart generation consistent?
Key Points
1. Use Supabase as shared memory so agents can run on separate schedules without direct dependencies.
2. Fetch Bitcoin prices via CoinGecko and persist them with created_at timestamps for time-series charting.
3. Use OpenAI function calling to connect the news agent to Brave Search, then store results in a dedicated Supabase table.
4. Generate the email by combining the latest BTC rows and the latest news rows into a single OpenAI prompt.
5. Send emails through Mailgun and include a Matplotlib-generated PNG chart as an attachment.
6. Deploy to Heroku and rely on Heroku Scheduler for autonomous execution; store all API keys in Heroku config variables.