Integrating external data sources with AI models often requires complex, time-consuming custom coding. The Model Context Protocol (MCP) simplifies this by offering a standardized framework for seamless interaction.
In this guide, we will walk through building an MCP server and MCP client with Server-Sent Events (SSE), providing step-by-step instructions to set up and run both.
MCP serves as a universal interface that allows AI tools to interact seamlessly with content repositories, business platforms, and development environments. By providing a standardized framework, MCP enhances the relevance and context-awareness of AI applications.
It enables developers to build modular, secure, and flexible integrations without the need for separate connectors for each data source.
With MCP, developers can:

- Connect AI applications to content repositories, business platforms, and development environments through a single protocol
- Build modular, secure, and flexible integrations without a separate connector for each data source
- Give models more relevant, context-aware access to external data
Before running the MCP server and client, we need to install the required dependencies and set up environment variables.
Create a virtual environment and run the following commands to install the required dependencies:
python -m venv venv
source venv/bin/activate
pip install "mcp[cli]" anthropic python-dotenv requests
Create a .env file in the project directory and add your API keys:
SERPER_API_KEY=your_serper_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
This ensures sensitive credentials remain secure.
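For illustration, load_dotenv() essentially parses these key=value lines and merges them into os.environ, where os.getenv can read them. Here is a rough sketch of that behaviour (a simplified stand-in, not the library's actual code):

```python
import os

# Simplified sketch of what python-dotenv's load_dotenv() does:
# parse key=value lines, skip comments/blanks, and merge the pairs
# into the process environment without overwriting existing values.
def load_env_text(text: str) -> None:
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

load_env_text("SERPER_API_KEY=demo-key\n# a comment\n")
print(os.getenv("SERPER_API_KEY"))
```

In the real project you simply call load_dotenv() and the keys from .env become available to os.getenv.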
Let's begin by creating an MCP server that provides two functionalities: a web search tool backed by the Serper API, and a simple tool that adds two numbers.
from mcp.server.fastmcp import FastMCP
import requests
import os
from dotenv import load_dotenv
load_dotenv()
mcp = FastMCP()
Configuring Tools in MCP
In MCP, each function wrapped with the @mcp.tool() decorator is exposed as a tool. This makes it easy to modularize functionality. The tool's description and input schema help the LLM decide which tool to use for a given user query.
For example:
API_KEY = os.getenv("SERPER_API_KEY")
API_URL = "https://google.serper.dev/search"
@mcp.tool()
def serper_search(query: str) -> dict:
"""Search the web using Serper API for user queries"""
headers = {"X-API-KEY": API_KEY, "Content-Type": "application/json"}
data = {"q": query}
try:
response = requests.post(API_URL, json=data, headers=headers, timeout=10)
response.raise_for_status()
result = response.json()
print(f"Search result for '{query}': {result}")
return result
except requests.exceptions.RequestException as e:
print(f"Error: {e}")
return {"error": str(e)}
@mcp.tool()
def add(a: int, b: int) -> int:
"""Add two numbers"""
print(f"Adding {a} and {b}")
return a + b
if __name__ == "__main__":
print("MCP server is running on port 8000")
mcp.run(transport="sse")
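Conceptually, the decorator builds a registry that maps tool names to callables and their metadata. The sketch below is a simplified, hypothetical illustration of that pattern; FastMCP's actual implementation is more involved (for example, it also derives a JSON input schema from the type hints):

```python
# Hypothetical illustration of the registry pattern behind a tool
# decorator; NOT FastMCP's real code. Each decorated function is
# stored under its name along with its docstring as the description.
TOOLS = {}

def tool():
    def register(fn):
        TOOLS[fn.__name__] = {"fn": fn, "description": fn.__doc__}
        return fn
    return register

@tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

print(TOOLS["add"]["description"])  # → Add two numbers
print(TOOLS["add"]["fn"](2, 3))     # → 5
```

This is why the docstring matters: it is the description the LLM later reads when choosing a tool.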
SSE transport enables server-to-client streaming with HTTP POST requests for client-to-server communication.
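To make the transport concrete, here is a small parser for the SSE wire format itself. This is the generic event-stream format from the SSE specification, not MCP-specific code: each event is a block of "field: value" lines terminated by a blank line.

```python
# Generic SSE event-stream parser (illustration of the wire format,
# not part of the MCP SDK). Each event is a group of "field: value"
# lines; a blank line marks the end of an event.
def parse_sse(raw: str) -> list[dict]:
    events, current = [], {}
    for line in raw.splitlines():
        if line == "":                      # blank line ends an event
            if current:
                events.append(current)
                current = {}
        elif ":" in line:
            field, _, value = line.partition(":")
            current[field] = value.lstrip(" ")
    if current:
        events.append(current)
    return events

sample = 'event: message\ndata: {"jsonrpc": "2.0"}\n\n'
print(parse_sse(sample))
# → [{'event': 'message', 'data': '{"jsonrpc": "2.0"}'}]
```

The MCP client library handles this parsing for you; the point is simply that "streaming" here means a long-lived HTTP response carrying events in this format.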
The client will:

- Connect to the MCP server over SSE
- List the tools the server exposes
- Send user queries to Claude along with the tool definitions
- Execute any tool calls Claude requests and return the final response
Create a file named client.py and save the following code.
import asyncio
from typing import Optional
from contextlib import AsyncExitStack
from mcp import ClientSession
from mcp.client.sse import sse_client
from anthropic import Anthropic
from dotenv import load_dotenv
load_dotenv()
MCP_SERVER_URL = "http://localhost:8000/sse"
class MCPClient:
def __init__(self):
self.session: Optional[ClientSession] = None
self.exit_stack = AsyncExitStack()
self.anthropic = Anthropic()
async def connect_to_server(self, url: str):
"""Connect to an MCP SSE server"""
streams = await self.exit_stack.enter_async_context(sse_client(url=url))
self.session = await self.exit_stack.enter_async_context(ClientSession(*streams))
await self.session.initialize()
response = await self.session.list_tools()
tools = response.tools
print("\nConnected to server with tools:", [tool.name for tool in tools])
async def process_query(self, query: str) -> str:
messages = [{"role": "user", "content": query}]
response = await self.session.list_tools()
available_tools = [
{"name": tool.name, "description": tool.description, "input_schema": tool.inputSchema}
for tool in response.tools
]
response = self.anthropic.messages.create(
model="claude-3-5-sonnet-20241022",
max_tokens=1000,
messages=messages,
tools=available_tools
)
tool_results = []
final_text = []
for content in response.content:
if content.type == "text":
final_text.append(content.text)
elif content.type == "tool_use":
tool_name = content.name
tool_args = content.input
result = await self.session.call_tool(tool_name, tool_args)
tool_results.append({"call": tool_name, "result": result})
final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")
messages.append({"role": "user", "content": result.content})
response = self.anthropic.messages.create(
model="claude-3-5-sonnet-20241022",
max_tokens=1000,
messages=messages,
)
final_text.append(response.content[0].text)
return "\n".join(final_text)
    async def chat_loop(self):
        print("\nMCP SSE Client Started!")
        print("Type your queries or 'quit' to exit.")
        while True:
            query = input("\nQuery: ").strip()
            if query.lower() == "quit":
                break
            response = await self.process_query(query)
            print("\n" + response)

    async def cleanup(self):
        await self.exit_stack.aclose()

async def main():
    client = MCPClient()
    try:
        await client.connect_to_server(MCP_SERVER_URL)
        await client.chat_loop()
    finally:
        await client.cleanup()

if __name__ == "__main__":
    asyncio.run(main())
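Stripped of the Anthropic API calls, the dispatch logic inside process_query boils down to the stub below. The plain dicts stand in for Anthropic's content objects; it is a simplified sketch, not the real client: text blocks are collected for the final answer, and tool_use blocks trigger a tool call whose result would then be fed back to the model.

```python
# Simplified sketch of the content-block dispatch in process_query.
# In the real client, `tools` is replaced by session.call_tool and the
# tool result is sent back to Claude for a follow-up response.
def dispatch(blocks, tools):
    final_text = []
    for block in blocks:
        if block["type"] == "text":
            final_text.append(block["text"])
        elif block["type"] == "tool_use":
            result = tools[block["name"]](**block["input"])
            final_text.append(f"[{block['name']} -> {result}]")
    return "\n".join(final_text)

print(dispatch(
    [
        {"type": "text", "text": "Let me add those."},
        {"type": "tool_use", "name": "add", "input": {"a": 9, "b": 11}},
    ],
    {"add": lambda a, b: a + b},
))
```

Seeing the loop in isolation makes it clear why the server's tool descriptions matter: they are the only signal Claude has for emitting the right tool_use block.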
Once the server is running, start the client:
python client.py
Type queries like:
Query: Add 9 and 11
To exit, type:
Query: quit
Congratulations! You've just unlocked a powerful way to build AI systems that communicate via server-sent events. By implementing MCP with SSE transport, you've gained the ability to create real-time, streaming connections between your AI models and external tools. This tutorial demonstrated:

- Building an MCP server and exposing functions as tools with the @mcp.tool() decorator
- Running the server over SSE transport
- Writing a client that connects over SSE, lists the available tools, and lets Claude decide when to call them
MCP is a powerful way to standardize, modularize, and secure AI interactions.