Integrating MCP (Model Context Protocol)
In earlier chapters, your agent learned to follow instructions, ground itself in your own data using File Search (RAG), and call custom tools.
In this final chapter, we'll connect your agent to a live MCP server — giving it access to external capabilities like live menus, toppings, and order management through a standard, secure protocol.
What Is MCP and Why Use It?
MCP (Model Context Protocol) is an open standard for connecting AI agents to external tools, data sources, and services through interoperable MCP servers.
Instead of integrating with individual APIs, you connect once to an MCP server and automatically gain access to all the tools that server exposes.
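Concretely, an MCP server advertises its tools through a `tools/list` response: each tool carries a name, a description, and a JSON Schema describing its inputs. The descriptor below is a hypothetical sketch (the tool name and schema are made up for illustration), showing the shape an agent discovers when it connects:

```python
# Sketch of a tool descriptor an MCP server might return from tools/list.
# The tool name and schema here are hypothetical, for illustration only.
get_pizzas_tool = {
    "name": "get_pizzas",
    "description": "List available pizzas with prices.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "category": {"type": "string", "description": "Optional menu category filter"},
        },
        "required": [],
    },
}

def tool_names(tools: list[dict]) -> list[str]:
    """Extract the tool names an agent sees after connecting to a server."""
    return [t["name"] for t in tools]

print(tool_names([get_pizzas_tool]))  # ['get_pizzas']
```

Because every server describes its tools this way, the agent needs no per-API integration code: new tools added to the server simply appear in the next `tools/list` result.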
Benefits of MCP
- 🧩 Interoperability: a universal way to expose tools from any service to any MCP-aware agent.
- 🔐 Security & governance: centrally manage access and tool permissions.
- ⚙️ Scalability: add or update server tools without changing your agent code.
- 🧠 Simplicity: keep integrations and business logic in the server; keep your agent focused on reasoning.
Update Your Imports
Update your imports in `agent.py` to include `MCPTool`:

```python
import json
import os
import glob
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import PromptAgentDefinition, FileSearchTool, FunctionTool, MCPTool, Tool
from openai.types.responses.response_input_param import FunctionCallOutput, ResponseInputParam
```

The Contoso Pizza MCP Server
For Contoso Pizza, the MCP server exposes APIs for:
- 🧀 Pizzas: available menu items and prices
- 🍅 Toppings: categories, availability, and details
- 📦 Orders: create, view, and cancel customer orders
You'll connect your agent to this server and grant it access to use the tools for these operations.
Add the MCP Tool
Add this code after your Function Calling Tool section and before creating the toolset:
```python
## -- MCP -- ##
mcpTool = MCPTool(
    server_label="contoso-pizza-mcp",
    server_url="https://ca-pizza-mcp-sc6u2typoxngc.graypond-9d6dd29c.eastus2.azurecontainerapps.io/sse",
    require_approval="never",
)
## -- MCP -- ##
```

Parameters Explained
| Parameter | Description |
|---|---|
| server_label | A human-readable name for logs and debugging. |
| server_url | The MCP server endpoint. |
| require_approval | Defines whether calls require manual approval ("never" disables prompts). |
TIP
💡 In production, use more restrictive approval modes for sensitive operations.
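One way to express a more restrictive policy is a selective approval payload: in the Responses API's MCP tool configuration, `require_approval` can also take an object form, `{"never": {"tool_names": [...]}}`, that exempts only named tools from approval. Whether your SDK version accepts this form is an assumption to verify; the tool names below are hypothetical. A sketch of building such a payload:

```python
# Sketch: build a selective require_approval payload, assuming the object
# form {"never": {"tool_names": [...]}} is supported by your MCP tool config.
# Tool names below are hypothetical.
def selective_approval(read_only_tools: list[str]) -> dict:
    """Read-only tools skip approval; everything else still requires it."""
    return {"never": {"tool_names": read_only_tools}}

payload = selective_approval(["get_pizzas", "get_toppings"])
# Order-changing tools (e.g. a place_order tool) are not listed,
# so calls to them would still pause for manual approval.
```

This keeps harmless lookups frictionless while gating anything that mutates state.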
Update the Toolset
Add the MCP tool to your toolset:
```python
## Define the toolset for the agent
toolset: list[Tool] = []
toolset.append(FileSearchTool(vector_store_ids=[vector_store.id]))
toolset.append(func_tool)
toolset.append(mcpTool)
```

Add a User ID
To place orders, the agent must identify the customer.
Get your User ID
Visit this URL to register a customer:
https://nice-dune-07e53ec0f.2.azurestaticapps.net/

Update your `instructions.txt` with your user details, or pass the GUID in chat:

```text
## User details:
Name: <YOUR NAME>
UserId: <YOUR USER GUID>
```

- (Optional) View your order dashboard:
https://ambitious-stone-0f6b9760f.2.azurestaticapps.net/
Trying It Out
Now it's time to test your connected agent!
Run the agent and try out these prompts:
- Show me the available pizzas.
- What is the price for a pizza Hawaii?
- Place an order for 2 large pepperoni pizzas.

The agent will automatically call the appropriate MCP tools, retrieve data from the live Contoso Pizza API, and respond conversationally — following your `instructions.txt` rules (e.g., tone, local currency, and time zone conversions).
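If you want to observe which MCP tools the agent actually invoked, you can inspect the response's output items. This sketch assumes MCP calls surface as items with type `"mcp_call"` carrying `server_label`, `name`, and `arguments` (as in the OpenAI Responses API); the exact attribute names may differ in your SDK version, so treat them as assumptions:

```python
from types import SimpleNamespace

def summarize_mcp_calls(output_items: list) -> list[str]:
    """Collect one log line per MCP tool invocation in a response."""
    lines = []
    for item in output_items:
        if getattr(item, "type", None) == "mcp_call":
            lines.append(f"[MCP:{item.server_label}] {item.name}({item.arguments})")
    return lines

# Mock item standing in for a real response output entry:
mock = SimpleNamespace(type="mcp_call", server_label="contoso-pizza-mcp",
                       name="get_pizzas", arguments="{}")
print(summarize_mcp_calls([mock]))  # ['[MCP:contoso-pizza-mcp] get_pizzas({})']
```

Printing these lines after each `responses.create` call is a lightweight way to trace what the live server did for a given prompt.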
Best Practices for MCP Integration
- 🔒 Principle of least privilege: only allow tools the agent truly needs.
- 📜 Observability: log all tool calls for traceability and debugging.
- 🔁 Resilience: handle connection errors gracefully and retry failed tool calls.
- 🧩 Versioning: pin MCP server versions to prevent breaking changes.
- 👩‍💼 Human-in-the-loop: use approval modes for sensitive actions (like order placement).
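The resilience bullet can be sketched as a small retry wrapper around a flaky tool call; the backoff values and the hypothetical `flaky_menu_fetch` helper are illustrative choices, not part of any SDK:

```python
import time

def call_with_retries(fn, *args, attempts=3, base_delay=0.5, **kwargs):
    """Retry a flaky call with exponential backoff.

    Retries only on ConnectionError/TimeoutError; other exceptions
    propagate immediately so real bugs aren't masked.
    """
    for attempt in range(attempts):
        try:
            return fn(*args, **kwargs)
        except (ConnectionError, TimeoutError):
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Example: a hypothetical fetch that succeeds on the second attempt.
state = {"tries": 0}
def flaky_menu_fetch():
    state["tries"] += 1
    if state["tries"] < 2:
        raise ConnectionError("transient network error")
    return ["Margherita", "Pepperoni"]

print(call_with_retries(flaky_menu_fetch, base_delay=0))  # ['Margherita', 'Pepperoni']
```

Catching only transient error types is the key design choice: retrying on every exception would silently repeat calls that can never succeed.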
Recap
In this chapter, you:
- Learned what MCP is and why it matters for scalable agent design.
- Added the MCPTool to connect to the Contoso Pizza MCP Server.
- Tested real-time integration with menu, toppings, and order tools.
🎉 Congratulations — you've completed the workshop!
Your agent can now:
✅ Follow system instructions
✅ Access and reason over private data (RAG)
✅ Call custom tools
✅ Interact with live services via MCP
Your Contoso PizzaBot is now a fully operational, intelligent, and extensible AI assistant.
Final code sample
```python
import json
import os
import glob
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import PromptAgentDefinition, FileSearchTool, FunctionTool, MCPTool, Tool
from openai.types.responses.response_input_param import FunctionCallOutput, ResponseInputParam

load_dotenv()

vector_store_id = ""  # Set to your vector store ID if you already have one

## Configure Project Client
project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)
openai_client = project_client.get_openai_client()

## -- FILE SEARCH -- ##
if vector_store_id:
    vector_store = openai_client.vector_stores.retrieve(vector_store_id)
    print(f"Using existing vector store (id: {vector_store.id})")
else:
    # Create vector store for file search
    vector_store = openai_client.vector_stores.create(name="ContosoPizzaStores")
    print(f"Vector store created (id: {vector_store.id})")

    # Upload files to the vector store
    for file_path in glob.glob("documents/*.md"):
        file = openai_client.vector_stores.files.upload_and_poll(
            vector_store_id=vector_store.id, file=open(file_path, "rb")
        )
        print(f"File uploaded to vector store (id: {file.id})")
## -- FILE SEARCH -- ##

## -- Function Calling Tool -- ##
func_tool = FunctionTool(
    name="get_pizza_quantity",
    parameters={
        "type": "object",
        "properties": {
            "people": {
                "type": "integer",
                "description": "The number of people to order pizza for",
            },
        },
        "required": ["people"],
        "additionalProperties": False,
    },
    description="Get the quantity of pizza to order based on the number of people.",
    strict=True,
)

def get_pizza_quantity(people: int) -> str:
    """Calculate the number of pizzas to order based on the number of people.

    Assumes each pizza can feed 2 people.

    Args:
        people (int): The number of people to order pizza for.

    Returns:
        str: A message indicating the number of pizzas to order.
    """
    print(f"[FUNCTION CALL:get_pizza_quantity] Calculating pizza quantity for {people} people.")
    return f"For {people} people you need to order {people // 2 + people % 2} pizzas."
## -- Function Calling Tool -- ##

## -- MCP -- ##
mcpTool = MCPTool(
    server_label="contoso-pizza-mcp",
    server_url="https://pizza-mcp-server.prouddune-f79ccb2b.westeurope.azurecontainerapps.io/mcp",
    require_approval="never",
)
## -- MCP -- ##

## Define the toolset for the agent
toolset: list[Tool] = []
toolset.append(FileSearchTool(vector_store_ids=[vector_store.id]))
toolset.append(func_tool)
toolset.append(mcpTool)

## Create a Foundry Agent
agent = project_client.agents.create_version(
    agent_name="hello-world-agent",
    definition=PromptAgentDefinition(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        instructions=open("instructions.txt").read(),
        tools=toolset,
    ),
)
print(f"Agent created (id: {agent.id}, name: {agent.name}, version: {agent.version})")

## Create a conversation for the agent interaction
conversation = openai_client.conversations.create()
print(f"Created conversation (id: {conversation.id})")

## Chat with the agent
while True:
    # Get the user input
    user_input = input("You: ")
    if user_input.lower() in ["exit", "quit"]:
        print("Exiting the chat.")
        break

    # Get the agent response
    response = openai_client.responses.create(
        conversation=conversation.id,
        input=user_input,
        extra_body={"agent": {"name": agent.name, "type": "agent_reference"}},
    )

    # Handle function calls in the response
    input_list: ResponseInputParam = []
    for item in response.output:
        if item.type == "function_call":
            if item.name == "get_pizza_quantity":
                # Execute the function logic for get_pizza_quantity
                pizza_quantity = get_pizza_quantity(**json.loads(item.arguments))
                # Provide function call results to the model
                input_list.append(
                    FunctionCallOutput(
                        type="function_call_output",
                        call_id=item.call_id,
                        output=json.dumps({"pizza_quantity": pizza_quantity}),
                    )
                )
    if input_list:
        response = openai_client.responses.create(
            previous_response_id=response.id,
            input=input_list,
            extra_body={"agent": {"name": agent.name, "type": "agent_reference"}},
        )

    # Print the agent response
    print(f"Assistant: {response.output_text}")
```