How to Build & Deploy Remote MCP Servers | MCP Trilogy | CampusX
Based on CampusX's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Remote MCP servers centralize tools on an internet-accessible machine, enabling multiple clients to share one server while accepting higher latency than local MCP.
Briefing
Remote MCP servers let teams run MCP tools from a different machine—often a more powerful server on the internet—so multiple clients can share the same capabilities. The trade-off is speed: local MCP stays fast because communication happens on one machine, while remote MCP typically runs over the network and can feel slower. Still, the practical upside is clear for real deployments: enterprise setups are expected to be remote, and remote servers make it possible to centralize compute and share one toolset across many users.
The walkthrough builds a remote MCP server end-to-end using the MCP library (from the earlier local setup) and then deploys it so others can use it. First comes a minimal “test remote MCP server” with basic tools: adding two numbers, generating a random number within a range, plus a simple resource that returns server information. The key technical change from local to remote is in the MCP run configuration: instead of binding to a local-only transport, the server uses Streamable HTTP and binds to 0.0.0.0 on a chosen port, making it reachable from outside the host machine. After starting the server, the setup is verified using the MCP Inspector by connecting over Streamable HTTP, listing resources, and running the random-number tool to confirm correct behavior.
Deployment happens through FastMCP Cloud, a free service at the time of the tutorial. The process is: create a GitHub repository, push the code, then use FastMCP Cloud’s “Deploy from your own code” flow to build and publish the server. Once deployed, the server’s URL can be copied and shared. On the client side, users open Claude Desktop, go to the connectors settings, and, depending on their plan, add a custom remote MCP connector using the provided URL. The tutorial demonstrates that with the deployed server, a remote client can call the tools (e.g., generate a random number between two bounds) and receive results.
The main goal then shifts from a toy server to a remote expense tracker. Rather than rebuilding everything, the expense-tracker MCP code from the prior local version is inserted into the same project, along with the required categories.json file. The updated code is tested in the MCP Inspector, pushed to GitHub, and redeployed so the new tools appear in Claude Desktop. A deployment issue surfaces: the SQLite database ends up read-only on the server, preventing new expense inserts. The fix is to have the code create a directory in a writable location and store the database there, so it can be updated after deployment.
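The read-only fix can be sketched as follows. The directory location, file names, and table schema here are assumptions, not details from the tutorial; the underlying idea is simply to create a directory the deployed process is allowed to write to and point SQLite at a file inside it:

```python
import os
import sqlite3
import tempfile

# Assumed: on the deployment host the checked-out code directory is read-only,
# but the system temp directory is writable, so the database file lives there.
DB_DIR = os.path.join(tempfile.gettempdir(), "expense_tracker")
os.makedirs(DB_DIR, exist_ok=True)  # create the writable directory up front
DB_PATH = os.path.join(DB_DIR, "expenses.db")


def init_db() -> None:
    """Create the expenses table if it does not exist (schema is illustrative)."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS expenses ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "date TEXT, amount REAL, category TEXT, note TEXT)"
        )


def add_expense(date: str, amount: float, category: str, note: str = "") -> int:
    """Insert one expense row; fails with 'readonly database' if DB_PATH is not writable."""
    with sqlite3.connect(DB_PATH) as conn:
        cur = conn.execute(
            "INSERT INTO expenses (date, amount, category, note) VALUES (?, ?, ?, ?)",
            (date, amount, category, note),
        )
        return cur.lastrowid
```

A note on the design: keeping the database in a temp-style directory means data may not survive redeploys, which is acceptable for a tutorial but another reason a production version would use a real persistence layer.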
Finally, two limitations are addressed. First, the expense tracker initially runs synchronously, which blocks concurrent users, so the code is updated to use async/await patterns and the aiosqlite driver instead of the blocking sqlite3 module, enabling parallel handling of tool calls and database operations. Second, free-plan users can’t add custom connectors directly, so a workaround uses a local proxy MCP server that connects Claude Desktop to the remote server indirectly. The tutorial also flags a deeper logical flaw: without authentication, any user can see every other user’s expenses, because the database lacks user scoping and there is no reliable way to verify who is calling. The next steps are framed around adding authentication and building a custom MCP client rather than relying solely on Claude Desktop connectors.
Cornell Notes
Remote MCP servers run on a different machine (often internet-accessible), enabling multiple clients to share one centralized toolset—at the cost of higher latency versus local MCP. The tutorial first creates a minimal remote MCP server (add two numbers, generate random numbers) by switching the transport to Streamable HTTP and binding to 0.0.0.0, then verifies it with MCP Inspector. It deploys the server via FastMCP Cloud by pushing code to GitHub and publishing from that repository, producing a shareable URL. The expense tracker is then converted into a remote MCP server by swapping in the expense-tracker code and categories.json, fixing a read-only SQLite deployment issue by writing to a writable directory. The final improvements include async support using aiosqlite to avoid blocking concurrent users and a proxy workaround for free-plan connector limitations, while authentication remains the major unresolved requirement for multi-user privacy.
- What concrete change turns a local MCP server into a remote-accessible one in this setup?
- How is the remote server validated before deployment?
- Why did adding expenses fail after deploying the expense tracker, and what was the fix?
- What performance problem appears with the initial remote MCP server, and how is it addressed?
- How do free-plan users connect to the remote MCP server if custom connectors aren’t available?
- What major security/privacy flaw remains, and why?
Review Questions
- What network and transport settings are required for the MCP server to accept remote requests, and why do they matter?
- How does switching from synchronous code to async/await plus AIOSQLite change the server’s ability to handle concurrent users?
- What two separate issues prevent a production-ready multi-user remote expense tracker in this tutorial, and how are they planned to be solved?
Key Points
1. Remote MCP servers centralize tools on an internet-accessible machine, enabling multiple clients to share one server while accepting higher latency than local MCP.
2. Switching to Streamable HTTP and binding to 0.0.0.0 is the core configuration step for making an MCP server reachable remotely.
3. FastMCP Cloud deployment is driven by pushing code to GitHub and using “Deploy from your own code,” which produces a shareable server URL.
4. A deployed SQLite database may become read-only; creating and using a writable directory in code is necessary for write operations like adding expenses.
5. Synchronous MCP tool/database handling blocks concurrent users; converting to async/await with aiosqlite improves concurrency.
6. Free-plan connector limits can be bypassed with a local proxy MCP server that forwards requests to the remote server.
7. Without authentication and user scoping in the database, any user can potentially view all expenses, making auth a required next step.