---
title: gradio-test
app_file: app.py
sdk: gradio
sdk_version: 5.42.0
---
Gradio MCP Integration
Gradio allows developers to create UIs for their models with just a few lines of Python code. It's particularly useful for:
- Creating demos and prototypes
- Sharing models with non-technical users
- Testing and debugging model behavior
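As a quick illustration, a complete Gradio UI can be this short (the reverse_text function is a made-up placeholder, not part of this app):

import gradio as gr

def reverse_text(text: str) -> str:
    """Return the input text reversed."""
    return text[::-1]

# Two more lines turn the function into a full web UI.
demo = gr.Interface(fn=reverse_text, inputs="text", outputs="text")
demo.launch()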
With the addition of MCP support, Gradio now offers a straightforward way to expose AI model capabilities through the standardized MCP protocol.
Prerequisites
Install Gradio with the MCP extra:
pip install "gradio[mcp]"
You'll also need an LLM application that supports tool calling via the MCP protocol, such as Cursor (these applications are known as "MCP Hosts").
Creating an MCP Server with Gradio
- Set up the virtual environment:
python -m venv .venv
- Activate the virtual environment:
source .venv/bin/activate
- Install Gradio with the mcp extra:
pip install "gradio[mcp]"
- Run the Gradio app with the MCP server:
python app.py
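For reference, a minimal app.py for the letter counter example used in the rest of this guide could look roughly like this sketch (the function body is illustrative; the essential part is 'mcp_server=True' in 'launch()'):

import gradio as gr

def letter_counter(word: str, letter: str) -> int:
    """Count how many times a letter appears in a word.

    Args:
        word (str): The word or phrase to search in.
        letter (str): The single letter to count.
    """
    return word.lower().count(letter.lower())

demo = gr.Interface(
    fn=letter_counter,
    inputs=["text", "text"],
    outputs="number",
    title="Letter Counter",
    description="Count occurrences of a letter in a word.",
)

if __name__ == "__main__":
    # This single flag also exposes letter_counter as an MCP tool.
    demo.launch(mcp_server=True)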
Once the app starts, the server will print something like this:
* Running on local URL: http://127.0.0.1:7860
* To create a public link, set `share=True` in `launch()`.
🔨 Launching MCP server:
* Streamable HTTP URL: http://127.0.0.1:7860/gradio_api/mcp/
* [Deprecated] SSE URL: http://127.0.0.1:7860/gradio_api/mcp/sse
With this setup, your letter counter function is now accessible through:
- A traditional Gradio web interface for direct human interaction
- An MCP Server that can be connected to compatible clients
The MCP server will be accessible at:
http://your-server:port/gradio_api/mcp/sse
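As a sketch of what a compatible client does under the hood, here is one way to connect from Python using the MCP Python SDK (the 'mcp' package); the tool name 'letter_counter' is an assumption based on the function above, the URL is the local example from earlier, and the exact result shape may vary:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Open an SSE connection to the Gradio MCP endpoint.
    async with sse_client("http://127.0.0.1:7860/gradio_api/mcp/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools Gradio generated from the app's functions.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call the letter counter tool (tool name and arguments assumed).
            result = await session.call_tool(
                "letter_counter", arguments={"word": "strawberry", "letter": "r"}
            )
            print(result.content)

asyncio.run(main())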
How It Works Behind the Scenes
When you set 'mcp_server=True' in 'launch()', several things happen:
- Gradio functions are automatically converted to MCP Tools
- Input components map to tool argument schemas
- Output components determine the response format
- The Gradio server now also listens for MCP protocol messages
- JSON-RPC over HTTP+SSE is set up for client-server communication
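As a rough illustration of that last point, the client and server exchange standard MCP JSON-RPC messages such as 'tools/list' and 'tools/call'; shown here as Python dicts with illustrative ids and arguments:

# Approximate shape of the JSON-RPC requests an MCP client sends.
# The method names come from the MCP specification; everything else is illustrative.
list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "letter_counter",  # tool name assumed from the example app
        "arguments": {"word": "strawberry", "letter": "r"},
    },
}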
Key Features of the Gradio <> MCP Integration
- Tool Conversion: Each API endpoint in your Gradio app is automatically converted into an MCP tool with a corresponding name, description, and input schema. To view the tools and schemas, visit 'http://your-server:port/gradio_api/mcp/schema' or go to the "View API" link in the footer of your Gradio app, and then click on "MCP".
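For a programmatic check, you could also fetch that schema endpoint directly; a minimal sketch (the local URL is an assumption, and the exact response layout may vary across Gradio versions):

import json

import requests

# Fetch the MCP tool schema exposed by the running Gradio app.
schema = requests.get("http://127.0.0.1:7860/gradio_api/mcp/schema", timeout=30).json()
print(json.dumps(schema, indent=2))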
On the MCP page, you will see an SSE URL like this: 'https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse'
To add this MCP server to clients that support SSE (e.g. Cursor, Windsurf, Cline), simply add the following configuration to your MCP config:
{
"mcpServers": {
"gradio": {
"url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"
}
}
}
- You can also run the server locally:
python app.py
And point your MCP client at it with the following configuration:
{
"mcpServers": {
"Gradio MCP Server": {
"type": "http",
"url": "http://127.0.0.1:7860/gradio_api/mcp/"
}
}
}
The URL with 'sse' also works: 'http://127.0.0.1:7860/gradio_api/mcp/sse'
Experimental stdio support: For clients that only support stdio, first install Node.js. Then, you can use the following configuration:
{
"mcpServers": {
"gradio": {
"command": "npx",
"args": ["mcp-remote", "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse"]
}
}
}
NOTE: This configuration did not work in Cursor; the MCP server does not load its tools.
- Environment Variable Support: There are two ways to enable the MCP server functionality:
- Using the 'mcp_server' parameter in launch():
demo.launch(mcp_server=True)
- Using environment variables:
export GRADIO_MCP_SERVER=True
- File Handling: The server automatically handles file data conversions, including:
- Converting base64-encoded strings to file data
- Processing image files and returning them in the correct format
- Managing temporary file storage
It is strongly recommended that input images and files be passed as full URLs ("http://..." or "https://...") as MCP Clients do not always handle local files correctly.
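For example, a tool that receives its image as a full URL string might look like this sketch (the function name and behavior are illustrative, not part of this app):

import io

import gradio as gr
import requests
from PIL import Image

def image_size(image_url: str) -> str:
    """Report the dimensions of an image fetched from a URL.

    Args:
        image_url (str): Full http(s) URL of the image to inspect.
    """
    response = requests.get(image_url, timeout=30)
    response.raise_for_status()
    image = Image.open(io.BytesIO(response.content))
    return f"{image.width}x{image.height}"

demo = gr.Interface(fn=image_size, inputs="text", outputs="text")
demo.launch(mcp_server=True)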
Hosted MCP Servers on Hugging Face Spaces: You can publish your Gradio application on Hugging Face Spaces, which gives you a free hosted MCP server. Here's an example of such a Space: https://huggingface.co/spaces/abidlabs/mcp-tools
Deploy the application and server to Hugging Face Spaces:
gradio deploy
IMPORTANT: You must already be logged in with the Hugging Face CLI. Then simply confirm the information requested at each step, and the Space URL will be shown, like in this example: 'https://huggingface.co/spaces/layers2024/gradio-test'
Troubleshooting Tips
Type Hints and Docstrings: Ensure you provide type hints and valid docstrings for your functions. The docstring should include an "Args:" block with the parameter names listed and indented.
String Input: When in doubt, accept input arguments as str and convert them to the desired type inside the function.
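A small sketch combining both tips (a hypothetical tool, not part of this app): the 'times' parameter arrives as a str and is converted inside the function, and the docstring carries an indented "Args:" block:

def repeat_word(word: str, times: str) -> str:
    """Repeat a word a given number of times.

    Args:
        word (str): The word to repeat.
        times (str): How many repetitions; kept as str and converted to int inside the function.
    """
    return " ".join([word] * int(times))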
SSE Support: Some MCP Hosts don't support SSE-based MCP Servers. In those cases, you can use mcp-remote:
{
"mcpServers": {
"gradio": {
"command": "npx",
"args": ["mcp-remote", "http://your-server:port/gradio_api/mcp/sse"]
}
}
}
- Restart: If you encounter connection issues, try restarting both your MCP Client and MCP Server.