
How to build MCP Client using LangGraph | Agentic AI using LangGraph | CampusX

CampusX · 6 min read

Based on CampusX's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Custom tool wrappers tied to external APIs break when upstream services change endpoints or response fields, creating repeated maintenance work across tools and chatbots.

Briefing

Agentic AI tool integrations get brittle fast when every chatbot hard-codes custom “tool” wrappers for each external service. MCP (Model Context Protocol) is presented as a cleaner, more maintainable way to connect LLM apps to tools by separating the heavy, service-specific logic on a server from the lightweight client configuration inside the LangGraph app—so API changes on the tool side don’t force repeated client rewrites.

The walkthrough starts with a practical pain point. A LangGraph chatbot already uses three tools: an internet search tool, a calculator tool, and a “get stock price” tool. When a manager asks for GitHub-backed question answering—like listing pull requests from GitHub repositories—the usual approach is to create a custom user-defined tool. That tool needs inputs such as repository owner, repository URL, pull request state (open/closed), and how many pull requests to return. It also requires GitHub authentication via a token and additional headers, then calls GitHub’s REST API, parses JSON, and formats results.
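Such a wrapper can be sketched as follows. This is a minimal illustration using Python's standard library rather than any specific HTTP client; the function names and the simplified return shape are invented for the sketch, while the endpoint path, auth header, and field names follow GitHub's current REST API:

```python
import json
import urllib.request

GITHUB_API = "https://api.github.com"

def build_pr_url(owner: str, repo: str, state: str = "open", per_page: int = 10) -> str:
    """Build the GitHub REST endpoint for listing pull requests."""
    return f"{GITHUB_API}/repos/{owner}/{repo}/pulls?state={state}&per_page={per_page}"

def list_pull_requests(owner: str, repo: str, token: str,
                       state: str = "open", per_page: int = 10) -> list:
    """Call GitHub's REST API and return simplified PR records.

    In the chatbot this function would be registered as a LangGraph tool.
    Every detail below (URL path, auth header, response field names) is
    exactly the service-specific logic that breaks when the API changes.
    """
    req = urllib.request.Request(
        build_pr_url(owner, repo, state, per_page),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        prs = json.load(resp)
    # If a new API version renamed "title" or "state", this parsing
    # code would break and need a client-side fix.
    return [{"number": pr["number"], "title": pr["title"], "state": pr["state"]}
            for pr in prs]
```

Note how the chatbot, not the service, owns every brittle detail here; that ownership is what the next paragraphs show going wrong.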

The fragility shows up when GitHub changes its API. A major version shift (from API 1.0 to API 2.0) can alter URL paths and response fields (for example, renaming attributes). With the “tool code inside the chatbot” approach, the chatbot breaks immediately and developers must update the tool wrapper. The maintenance burden multiplies: one API change can require edits across many tool implementations, and even more so across multiple chatbots and additional integrations like Gmail, Slack, or other services.

MCP is introduced as the fix for that maintenance problem. Instead of embedding service-specific code directly in each chatbot, MCP runs tool logic on an MCP server and exposes standardized tool definitions to clients. The LangGraph side holds only configuration needed to connect to the MCP server. When the server updates for upstream API changes, the client configuration remains stable, because the client doesn’t depend on GitHub’s evolving API details.

After the conceptual case, the coding section demonstrates building an MCP client inside LangGraph. The existing LangGraph chatbot is first converted from synchronous to asynchronous execution because the MCP client library used for the integration works in async mode. The original calculator tool node is then replaced with an MCP client (from the langchain-mcp-adapters library) that connects to a locally running MCP math server. The client fetches the available tools from the server (addition, subtraction, multiplication, division, power, and modulus) and binds those tools to the LangGraph LLM.

A second demo adds a remote MCP server for expense tracking (deployed over HTTP with a “streamable HTTP” transport). The same LangGraph chatbot can now list, add, and summarize expenses by simply extending the MCP client configuration with the remote server URL and transport settings—no new tool code inside the chatbot.

Finally, the lesson expands into a mixed architecture: the chatbot can use both traditional LangGraph tools and MCP-based tools together. A larger Streamlit + LangGraph + SQLite setup is adapted to support MCP clients, with additional async-compatible components (like an async SQLite layer) and an async streaming function. The result is a chatbot that can answer questions using both web/search tools and MCP-served capabilities, with a strong emphasis on future-proofing and reducing integration churn.
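The async streaming function mentioned above follows a standard asyncio pattern. The sketch below shows only the shape of that pattern with a stubbed chunk source; the video's actual Streamlit/SQLite code is not reproduced, and the names here are invented for illustration:

```python
import asyncio
from typing import AsyncIterator

async def fake_model_stream(prompt: str) -> AsyncIterator[str]:
    """Stand-in for an async graph stream; yields response chunks.

    A real LangGraph app would iterate over the graph's async stream
    instead of this hard-coded list.
    """
    for chunk in ["MCP ", "keeps ", "clients ", "stable."]:
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield chunk

async def stream_answer(prompt: str) -> str:
    """Consume chunks as they arrive; a UI would render each one."""
    parts = []
    async for chunk in fake_model_stream(prompt):
        parts.append(chunk)
    return "".join(parts)

answer = asyncio.run(stream_answer("What is MCP?"))
print(answer)
```

The key point is that once the whole invocation path is async, streaming, the MCP client, and an async database layer can all share one event loop.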

Cornell Notes

MCP (Model Context Protocol) is positioned as a standardized way to connect LLM apps to external tools without hard-coding brittle service-specific wrappers inside each chatbot. The transcript contrasts a custom “tool per integration” approach—where GitHub API changes can break the chatbot—with MCP’s separation of concerns: tool logic runs on an MCP server, while the LangGraph app keeps only lightweight client configuration. In the implementation, the LangGraph chatbot is converted to async because the MCP client library requires async execution. The calculator tool is replaced by an MCP client that fetches tool definitions from a local MCP math server, then binds those tools to the LLM. The same pattern extends to remote MCP servers (e.g., an expense tracker over HTTP), enabling new capabilities by configuration rather than rewriting chatbot tool code.

Why do custom “tool wrappers” become a maintenance problem as integrations grow?

The transcript uses GitHub as the example. A chatbot adds a user-defined GitHub tool that calls GitHub’s API, parses JSON, and formats pull request data. When GitHub changes from API 1.0 to API 2.0—altering URL structure and response attribute names—the chatbot’s tool code breaks. Fixing it requires updating the wrapper code. If the chatbot has many tools, or a company runs multiple chatbots across departments, each upstream API change forces repeated edits across many places (and potentially across multiple chatbots).

What is the core MCP idea that prevents client-side breakage?

MCP separates tool execution from tool consumption. The MCP server owns the service-specific logic (e.g., how to call GitHub and interpret its responses). The LangGraph app (MCP client) only stores configuration to connect to the server and receives standardized tool definitions. When upstream APIs change, developers update the MCP server; the client configuration doesn’t need to change because it no longer depends on the upstream API’s internal details.

Why does the LangGraph code need to be converted to async before adding an MCP client?

The transcript notes that the MCP client library used for integration works only in async mode. The existing LangGraph chatbot code is synchronous (no async/await used). To integrate MCP, the chatbot’s invocation path is converted to async: the graph nodes that call the LLM are updated so the LLM invocation becomes async, and the main entry point uses async invocation (e.g., switching from a sync invoke to an async invoke with await).
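The sync-to-async change can be illustrated with a stub model that mirrors LangChain's invoke/ainvoke naming convention (the stub class and node functions below are invented for illustration, not taken from the video):

```python
import asyncio

class StubLLM:
    """Stand-in for a chat model; real LangChain models expose both
    a sync .invoke() and an async .ainvoke()."""
    def invoke(self, prompt: str) -> str:
        return f"answer to: {prompt}"

    async def ainvoke(self, prompt: str) -> str:
        return f"answer to: {prompt}"

llm = StubLLM()

# Before: a synchronous LangGraph node
def chat_node_sync(state: dict) -> dict:
    return {"messages": llm.invoke(state["question"])}

# After: the same node converted to async, so it can share an event
# loop with the MCP client library
async def chat_node_async(state: dict) -> dict:
    return {"messages": await llm.ainvoke(state["question"])}

# The entry point changes the same way: graph.invoke(...) becomes
# await graph.ainvoke(...), driven here by asyncio.run(...)
result = asyncio.run(chat_node_async({"question": "2 + 2?"}))
print(result)
```

The conversion is mechanical (add async to node definitions, await the model calls, and await the graph invocation) but it must be applied along the whole call path.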

How does the MCP client in LangGraph get the tools it can call?

After creating an MCP client instance (e.g., the MultiServerMCPClient class from the langchain-mcp-adapters package), the client connects to the MCP server and retrieves the list of available tools and their definitions. In the demo, the local MCP math server exposes tools for addition, subtraction, multiplication, division, power, and modulus. The client fetches these tool definitions (via a call like get_tools()) and then binds them to the LangGraph LLM so the model can call them during conversation.
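A minimal sketch of that flow, assuming the langchain-mcp-adapters API as documented (MultiServerMCPClient plus an awaitable get_tools()); the server name, script path, and the bind_tools comment are placeholders:

```python
# Server entries follow MultiServerMCPClient's configuration format;
# "math_server.py" is a hypothetical local MCP math server script.
MCP_SERVERS = {
    "math": {
        "command": "python",
        "args": ["math_server.py"],
        "transport": "stdio",   # local servers talk over stdin/stdout
    },
}

async def fetch_mcp_tools():
    # Imported inside the function so the sketch runs even where the
    # package is not installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient

    client = MultiServerMCPClient(MCP_SERVERS)
    tools = await client.get_tools()   # add, subtract, multiply, ...
    return tools

# The discovered tools are then bound to the model, e.g.:
#   llm_with_tools = llm.bind_tools(tools)
```

From the chatbot's perspective the math tools now look identical to hand-written LangGraph tools, but their implementation lives entirely on the server.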

What changes when adding a second MCP server (local vs remote)?

The client configuration changes, not the chatbot's tool logic. For a remote expense-tracking MCP server, the configuration includes the remote server URL and a transport setting appropriate for remote calls (the transcript mentions "streamable HTTP"). Once added to the MCP client, the chatbot can use new tools such as add_expense and list_expenses, plus an expense-summary tool, without writing new LangGraph tool wrappers.
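Concretely, adding the remote server is one new entry in the client configuration. The URL below is a placeholder, and "streamable_http" is the transport key langchain-mcp-adapters uses for HTTP-deployed MCP servers:

```python
# One config dict covers both servers; no chatbot tool code changes.
MCP_SERVERS = {
    "math": {
        "command": "python",
        "args": ["math_server.py"],       # hypothetical local server
        "transport": "stdio",
    },
    "expenses": {
        "url": "http://localhost:8000/mcp",  # placeholder remote endpoint
        "transport": "streamable_http",
    },
}
```

The same MultiServerMCPClient instance then discovers tools from both servers in one get_tools() call.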

Can a chatbot mix traditional LangGraph tools with MCP tools?

Yes. The transcript explicitly supports a hybrid approach: some capabilities can remain as standard LangGraph tools (like stock price retrieval), while other capabilities come from MCP servers (like math operations or expense tracking). The final demo shows a chatbot that can route requests to both kinds of tools, then stream responses through an async-compatible frontend/backend setup.
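The hybrid wiring can be sketched like this. The stock-price stub, server config, and build_agent helper are invented for illustration; the real APIs assumed are LangChain's tool wrapper, MultiServerMCPClient from langchain-mcp-adapters, and LangGraph's create_react_agent:

```python
def get_stock_price(symbol: str) -> str:
    """Stubbed price lookup; a real tool would call a market-data API."""
    return f"{symbol}: 123.45"  # placeholder value

async def build_agent(llm):
    # Imported inside the sketch so it runs even without the packages.
    from langchain_core.tools import tool
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient({
        "math": {"command": "python", "args": ["math_server.py"],
                 "transport": "stdio"},
    })
    mcp_tools = await client.get_tools()

    # Traditional tools and MCP-discovered tools are both ordinary
    # LangChain tools, so the agent receives them as a single list.
    return create_react_agent(llm, tools=[tool(get_stock_price)] + mcp_tools)
```

Because both kinds of tools share one interface, migrating an integration to MCP later means deleting its wrapper from the list, not restructuring the graph.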

Review Questions

  1. What specific failure mode occurs when an external service’s API changes under the “custom tool wrapper inside the chatbot” approach?
  2. How does MCP’s separation of concerns reduce the number of code locations that must be updated after upstream changes?
  3. What async requirement affects LangGraph when integrating an MCP client, and where does that requirement show up in the code structure?

Key Points

  1. Custom tool wrappers tied to external APIs break when upstream services change endpoints or response fields, creating repeated maintenance work across tools and chatbots.
  2. MCP reduces brittleness by moving service-specific tool logic to an MCP server and keeping the LangGraph app as a lightweight MCP client with stable configuration.
  3. The LangGraph app must be async-compatible because the MCP client library used for integration operates only in async mode.
  4. In LangGraph, the MCP client can fetch tool names and tool definitions from an MCP server, then bind those tools to the LLM for tool calling.
  5. Adding additional MCP servers (including remote ones) typically requires only updating the MCP client configuration (URL and transport), not rewriting chatbot tool code.
  6. A single chatbot can combine traditional LangGraph tools with MCP-provided tools, enabling incremental adoption of MCP for more future-proof integrations.

Highlights

  • MCP’s biggest selling point is maintenance: API changes on the tool side require updates on the MCP server, not repeated edits across every chatbot client.
  • The LangGraph integration replaces a hard-coded calculator tool with an MCP client that discovers tools from a math server and binds them to the LLM.
  • Remote MCP servers (like an expense tracker over HTTP) can be added through configuration, immediately expanding chatbot capabilities without new tool wrapper code.
  • The transcript demonstrates a hybrid setup where MCP tools and standard LangGraph tools work together in the same chat flow.

Topics

  • MCP Client
  • LangGraph Integration
  • Tool Maintenance
  • Async Conversion
  • Remote MCP Servers
