Model Context Protocol (MCP) is an open protocol designed to enable AI agents to interact with external tools and data in a standardized way. MCP is composed of three components: server, client, and host.
MCP host
The MCP host is the interface between the user and the agent (for example, Claude Desktop or an IDE) and connects to external tools and data through MCP clients and servers. Anthropic's Claude Desktop was previously introduced as a host, but it requires a separate desktop app, a license, and API-key management, creating a dependency on the Claude ecosystem. mcp-use is an open-source Python/Node package that connects LangChain-compatible LLMs (e.g., GPT-4, Claude, Groq) to MCP servers in as few as six lines of code, removing that dependency and supporting multi-server and multi-model setups.
MCP Client
The MCP client manages the MCP protocol within the host and is responsible for connecting to the MCP servers that provide the functions the host's services need. The client reads a JSON file listing server names and connection methods, establishes the connections, and supplies tools or data as the MCP host requests them.
MCP server
The MCP server handles the actual connection to, and management of, the resources the MCP host needs (such as databases, web data, and local files). It exposes the available tools and manages session-level context. In this example, we use Astral UV to build the Python runtime and package-management environment. UV is written in Rust, offers fast installation and dependency resolution, and allows immediate script execution with uv run, making it ideal for container and CI environments.
Overall Architecture
[Figure: Agent architecture with MCP]
Development Environment Setup
As shown in the system architecture, this document explains how to set up an environment using UV to serve the MCP server and build the MCP host and client using the open-source solution mcp-use instead of Claude Desktop.
# uv install
curl -LsSf https://astral.sh/uv/install.sh | sh
# initialize uv project
uv init "Your MCP Server Project Directory"
# install required python packages for the uv project
uv add "mcp[cli]" mcp-use langchain-openai langchain-community python-dotenv
server.py – A Simple MCP Server for Wikipedia or Arxiv Search
This code shows how to implement an MCP server called "doc-server" that uses Wikipedia and Arxiv API wrappers provided by LangChain to search for and return content related to a user’s query.
# file name is 'server.py'
from mcp.server.fastmcp import FastMCP
from langchain_community.utilities import WikipediaAPIWrapper, ArxivAPIWrapper
from langchain_community.tools import WikipediaQueryRun, ArxivQueryRun

mcp = FastMCP("doc-server")

wiki_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=500)
wiki_tool = WikipediaQueryRun(api_wrapper=wiki_wrapper)
arxiv_wrapper = ArxivAPIWrapper(top_k_results=1, doc_content_chars_max=500)
arxiv_tool = ArxivQueryRun(api_wrapper=arxiv_wrapper)

@mcp.tool()
async def get_info(searchterm: str) -> str:
    """Search Wikipedia for the given term and return a short summary."""
    try:
        return wiki_tool.run(searchterm)
    except Exception as e:
        return f"Error fetching Wikipedia information: {str(e)}"

@mcp.tool()
async def get_research_paper(searchterm: str) -> str:
    """Search Arxiv for the given term and return paper metadata."""
    try:
        return arxiv_tool.run(searchterm)
    except Exception as e:
        return f"Error fetching research paper information: {str(e)}"

if __name__ == "__main__":
    mcp.run(transport="sse")      # for SSE serving
    # mcp.run(transport="stdio")  # for local (stdio) serving
Running doc-server with uv
You can run the above server.py using the uv run command. This exposes the functions defined in the file so they can be used by an MCP client. In this example, uv serves the doc-server on port 8000 of localhost.
(mcp_env)$ uv run server.py
INFO: Started server process [79863]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
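The SSE transport behind that endpoint streams plain-text event frames (event:/data: lines) over HTTP; the mcp SDK and mcp-use handle this framing internally. As a rough illustration of the wire format only (not part of any MCP API), a single frame can be parsed like this:

```python
# Illustrative only: split one Server-Sent Events (SSE) frame into its
# event name and data payload. Real MCP clients do this for you.
def parse_sse_frame(frame: str) -> dict:
    fields = {"event": "message", "data": ""}
    for line in frame.strip().splitlines():
        key, _, value = line.partition(":")
        if key in fields:
            fields[key] = value.lstrip()
    return fields

# A frame like the one the server sends when a client first connects
# (the session path here is a made-up example).
frame = "event: endpoint\ndata: /messages/?session_id=abc123"
parsed = parse_sse_frame(frame)
print(parsed["event"])  # endpoint
```

This sketch handles only single-line data fields; the full SSE format allows multi-line data and other field types.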
Writing the doc_server.json File
Here is an example of a JSON file required to create an MCP client that uses functions from the 'doc-server'. You could also define this as a dictionary directly within your client code for the same result.
{
  "mcpServers": {
    "doc_server": {
      "url": "http://127.0.0.1:8000/sse",
      "transport": "sse"
    }
  }
}
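As noted above, the same configuration can be defined as a dictionary directly in client code instead of a JSON file. A minimal sketch (the from_dict factory name should be checked against your installed mcp-use version):

```python
# Same structure as doc_server.json, defined directly in code.
config = {
    "mcpServers": {
        "doc_server": {
            "url": "http://127.0.0.1:8000/sse",
        }
    }
}

# Then, instead of MCPClient.from_config_file("doc_server.json"):
#   from mcp_use import MCPClient
#   client = MCPClient.from_dict(config)
# (verify the exact factory method name in your mcp-use version)
print(config["mcpServers"]["doc_server"]["url"])
```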
Creating MCP Host and Client Code Using mcp-use
This example demonstrates how to build an MCP host and client using mcp-use and Azure OpenAI as the GenAI backend. As shown below, mcp-use allows you to quickly set up an MCP host and client that can invoke custom or external MCP server functions.
import asyncio
import os

from dotenv import load_dotenv
from langchain_openai import AzureChatOpenAI
from mcp_use import MCPAgent, MCPClient

# Load environment variables
load_dotenv(".env")

async def main():
    # Create MCPClient from config file
    client = MCPClient.from_config_file("doc_server.json")

    # Create LLM
    llm = AzureChatOpenAI(
        azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT"),
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_version=os.getenv("AZURE_OPENAI_API_VERSION"),
        verbose=False,
        temperature=0.0,
    )

    # Create agent with the client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Run the query
    result = await agent.run(
        "Let me know ReciproCAM paper.",
        max_steps=30,
    )
    print(f"\nResult: {result}")

if __name__ == "__main__":
    asyncio.run(main())
Output Example
Conclusion
mcp-use is an open-source host solution oriented toward multiple LLMs and multiple servers, removing the dependency on Claude Desktop. UV offers an ultra-fast package-management and execution environment, making it well suited to reproducible MCP server deployments.