---
title: Weights & Biases MCP Server
emoji: 🪄📊
colorFrom: yellow
colorTo: gray
sdk: docker
app_file: app.py
pinned: false
---
<p align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-dark.svg">
<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-light.svg">
<img src="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-light.svg" width="600" alt="Weights & Biases">
</picture>
</p>
# W&B MCP Server
Query and analyze your Weights & Biases data using natural language through the Model Context Protocol.
<div align="center">
<a href="https://cursor.com/en/install-mcp?name=wandb&config=eyJ0cmFuc3BvcnQiOiJodHRwIiwidXJsIjoiaHR0cHM6Ly9tY3Aud2l0aHdhbmRiLmNvbS9tY3AiLCJoZWFkZXJzIjp7IkF1dGhvcml6YXRpb24iOiJCZWFyZXIge3tXQU5EQl9BUElfS0VZfX0iLCJBY2NlcHQiOiJhcHBsaWNhdGlvbi9qc29uLCB0ZXh0L2V2ZW50LXN0cmVhbSJ9fQ%3D%3D"><img src="https://cursor.com/deeplink/mcp-install-dark.svg" alt="Cursor" height="28"/></a>
<a href="#claude-desktop"><img src="https://img.shields.io/badge/Claude-6B5CE6?logo=anthropic&logoColor=white" alt="Claude" height="28"/></a>
<a href="#openai"><img src="https://img.shields.io/badge/OpenAI-412991?logo=openai&logoColor=white" alt="OpenAI" height="28"/></a>
<a href="#gemini-cli"><img src="https://img.shields.io/badge/Gemini-4285F4?logo=google&logoColor=white" alt="Gemini" height="28"/></a>
<a href="#mistral-lechat"><img src="https://img.shields.io/badge/LeChat-FF6B6B?logo=mistralai&logoColor=white" alt="LeChat" height="28"/></a>
<a href="#vscode"><img src="https://img.shields.io/badge/VSCode-007ACC?logo=visualstudiocode&logoColor=white" alt="VSCode" height="28"/></a>
</div>
---
## What Can This Server Do?
<details open>
<summary><strong>Example Use Cases</strong> (click command to copy)</summary>
<table>
<tr>
<td width="25%">
**Analyze Experiments**
```text
Show me the top 5 runs
by eval/accuracy in
wandb-smle/hiring-agent-demo-public?
```
</td>
<td width="25%">
**Debug Traces**
```text
How did the latency of
my hiring agent's predict traces
evolve over the last few months?
```
</td>
<td width="25%">
**Create Reports**
```text
Generate a wandb report
comparing the decisions made
by the hiring agent last month
```
</td>
<td width="25%">
**Get Help**
```text
How do I create a leaderboard
in Weave? Ask SupportBot.
```
</td>
</tr>
</table>
</details>
<details>
<summary><strong>Available Tools</strong> (6 powerful tools)</summary>
| Tool | Description | Example Query |
|------|-------------|---------------|
| **query_wandb_tool** | Query W&B runs, metrics, and experiments | *"Show me runs with loss < 0.1"* |
| **query_weave_traces_tool** | Analyze LLM traces and evaluations | *"What's the average latency?"* |
| **count_weave_traces_tool** | Count traces and get storage metrics | *"How many traces failed?"* |
| **create_wandb_report_tool** | Create W&B reports programmatically | *"Create a performance report"* |
| **query_wandb_entity_projects** | List projects for an entity | *"What projects exist?"* |
| **query_wandb_support_bot** | Get help from W&B documentation | *"How do I use sweeps?"* |
</details>
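These are standard MCP tools, so any MCP client can discover them. As a quick sanity check, here is a minimal sketch using the official MCP Python SDK (`pip install mcp`) to list the tools on the hosted server; the Bearer-token header matches the configuration used throughout this README:

```python
import asyncio
import os

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Authenticate with your W&B API key, as in the client configs below
    headers = {"Authorization": f"Bearer {os.environ['WANDB_API_KEY']}"}
    async with streamablehttp_client(
        "https://mcp.withwandb.com/mcp", headers=headers
    ) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name)  # e.g. query_wandb_tool, query_weave_traces_tool, ...

asyncio.run(main())
```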
<details>
<summary><strong>Usage Tips</strong> (best practices)</summary>
**✅ Provide your W&B project and entity name**

LLMs are not mind readers: always specify the W&B entity and project in your prompt.

**✅ Avoid asking overly broad questions**

Questions such as "what is my best evaluation?" are usually too broad; you'll get an answer faster by asking something more specific, such as "which eval had the highest F1 score?"

**✅ Ensure all data was retrieved**

When asking broad questions such as "what are my best performing runs/evaluations?", ask the LLM to confirm that it retrieved all the available runs. The MCP tools are designed to fetch the right amount of data, but LLMs sometimes retrieve only the most recent runs or the last N runs.
</details>
---
## Quick Start
We recommend using our **hosted server** at `https://mcp.withwandb.com` - no installation required!
> 🔑 Get your API key from [wandb.ai/authorize](https://wandb.ai/authorize)
### Cursor
<details>
<summary>One-click installation</summary>
1. Open Cursor Settings (`⌘,` or `Ctrl+,`)
2. Navigate to **Features** → **Model Context Protocol**
3. Click **"Install from Registry"** or **"Add MCP Server"**
4. Search for "wandb" or enter:
- **Name**: `wandb`
- **URL**: `https://mcp.withwandb.com/mcp`
- **API Key**: Your W&B API key
For local installation, see [Option 2](#option-2-local-development-stdio) below.
</details>
### Claude Desktop
<details>
<summary>Configuration setup</summary>
Add to your Claude config file:
```bash
# macOS
open ~/Library/Application\ Support/Claude/claude_desktop_config.json
# Windows
notepad %APPDATA%\Claude\claude_desktop_config.json
```
```json
{
  "mcpServers": {
    "wandb": {
      "url": "https://mcp.withwandb.com/mcp",
      "apiKey": "YOUR_WANDB_API_KEY"
    }
  }
}
```
Restart Claude Desktop to activate.
For local installation, see [Option 2](#option-2-local-development-stdio) below.
</details>
### OpenAI Response API
<details>
<summary>Python client setup</summary>
```python
from openai import OpenAI
import os

client = OpenAI()

resp = client.responses.create(
    model="gpt-4o",
    tools=[{
        "type": "mcp",
        "server_label": "wandb",
        "server_url": "https://mcp.withwandb.com/mcp",
        "authorization": os.getenv("WANDB_API_KEY"),
    }],
    input="How many traces are in my project?",
)
print(resp.output_text)
```
> **Note**: OpenAI's MCP is server-side, so localhost URLs won't work. For local servers, see [Option 2](#option-2-local-development-stdio) with ngrok.
</details>
### Gemini CLI
<details>
<summary>One-command installation</summary>
```bash
# Set your API key
export WANDB_API_KEY="your-api-key-here"
# Install the extension
gemini extensions install https://github.com/wandb/wandb-mcp-server
```
The extension will use the configuration from `gemini-extension.json` pointing to the hosted server.
For local installation, see [Option 2](#option-2-local-development-stdio) below.
</details>
### Mistral LeChat
<details>
<summary>Configuration setup</summary>
In LeChat settings, add an MCP server:
- **URL**: `https://mcp.withwandb.com/mcp`
- **API Key**: Your W&B API key
For local installation, see [Option 2](#option-2-local-development-stdio) below.
</details>
### VSCode
<details>
<summary>Settings configuration</summary>
```bash
# Open settings
code ~/.config/Code/User/settings.json
```
```json
{
  "mcp.servers": {
    "wandb": {
      "url": "https://mcp.withwandb.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_WANDB_API_KEY"
      }
    }
  }
}
```
For local installation, see [Option 2](#option-2-local-development-stdio) below.
</details>
---
## General Installation Guide
<details>
<summary><strong>Option 1: Hosted Server (Recommended)</strong></summary>
The hosted server provides a zero-configuration experience with enterprise-grade reliability. It is maintained by the W&B team, automatically updated with new features, and scaled for production workloads. Perfect for teams that want to focus on their ML work rather than infrastructure.
### Using the Public Server
The easiest way is using our hosted server at `https://mcp.withwandb.com`.
**Benefits:**
- ✅ Zero installation
- ✅ Always up-to-date
- ✅ Automatic scaling
- ✅ No maintenance
Simply use the configurations shown in [Quick Start](#quick-start).
</details>
<details>
<summary><strong>Option 2: Local Development (STDIO)</strong></summary>
Run the MCP server locally for development, testing, or when you need full control over your data. The local server runs directly on your machine with STDIO transport for desktop clients or HTTP transport for web-based clients. Ideal for developers who want to customize the server or work in air-gapped environments.
### Manual Configuration
Add to your MCP client config:
```json
{
  "mcpServers": {
    "wandb": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/wandb/wandb-mcp-server",
        "wandb_mcp_server"
      ],
      "env": {
        "WANDB_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```
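To verify the stdio server outside of an IDE, here is a minimal sketch using the official MCP Python SDK (`pip install mcp`); it spawns the same `uvx` command as the config above and lists the available tools:

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="uvx",
    args=["--from", "git+https://github.com/wandb/wandb-mcp-server", "wandb_mcp_server"],
    # pass the full environment through so uvx can resolve PATH and your
    # exported WANDB_API_KEY reaches the server process
    env=dict(os.environ),
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())
```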
### Prerequisites
- Python 3.10+
- [uv](https://docs.astral.sh/uv/) (recommended) or pip
```bash
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
```
### Installation
```bash
# Using uv (recommended)
uv pip install wandb-mcp-server
# Or from GitHub
pip install git+https://github.com/wandb/wandb-mcp-server
```
### Client-Specific Installation Commands
#### Cursor (Project-only)
Enable the server for a specific project:
```bash
uvx --from git+https://github.com/wandb/wandb-mcp-server add_to_client --config_path .cursor/mcp.json && uvx wandb login
```
#### Cursor (Global)
Enable the server for all Cursor projects:
```bash
uvx --from git+https://github.com/wandb/wandb-mcp-server add_to_client --config_path ~/.cursor/mcp.json && uvx wandb login
```
#### Windsurf
```bash
uvx --from git+https://github.com/wandb/wandb-mcp-server add_to_client --config_path ~/.codeium/windsurf/mcp_config.json && uvx wandb login
```
#### Claude Code
```bash
claude mcp add wandb -- uvx --from git+https://github.com/wandb/wandb-mcp-server wandb_mcp_server && uvx wandb login
```
With API key:
```bash
claude mcp add wandb -e WANDB_API_KEY=your-api-key -- uvx --from git+https://github.com/wandb/wandb-mcp-server wandb_mcp_server
```
#### Claude Desktop
```bash
uvx --from git+https://github.com/wandb/wandb-mcp-server add_to_client --config_path "~/Library/Application Support/Claude/claude_desktop_config.json" && uvx wandb login
```
### Testing with ngrok (for server-side clients)
For clients like OpenAI and LeChat that require public URLs:
```bash
# 1. Start HTTP server
uvx wandb-mcp-server --transport http --port 8080
# 2. Expose with ngrok
ngrok http 8080
# 3. Use the ngrok URL in your client configuration
```
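Once the tunnel is up, you can point a server-side client such as OpenAI at it. A hedged sketch, mirroring the hosted-server example above; the `/mcp` path and the `wandb_local` label are assumptions to adapt to your setup:

```python
from openai import OpenAI
import os

client = OpenAI()
resp = client.responses.create(
    model="gpt-4o",
    tools=[{
        "type": "mcp",
        "server_label": "wandb_local",  # hypothetical label for your tunnel
        "server_url": "https://YOUR-SUBDOMAIN.ngrok-free.app/mcp",  # your ngrok URL
        "authorization": os.getenv("WANDB_API_KEY"),
    }],
    input="How many traces are in my project?",
)
print(resp.output_text)
```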
> **Note**: These utilities are inspired by the OpenMCP Server Registry [add-to-client pattern](https://www.open-mcp.org/servers).
</details>
<details>
<summary><strong>Option 3: Self-Hosted HTTP Server</strong></summary>
Deploy your own W&B MCP server for team-wide access or custom infrastructure requirements. This option gives you complete control over deployment, security, and scaling while maintaining compatibility with all MCP clients. Perfect for organizations that need on-premises deployment or want to integrate with existing infrastructure.
### Using Docker
```bash
docker run -p 7860:7860 \
  -e WANDB_API_KEY=your-server-key \
  ghcr.io/wandb/wandb-mcp-server
```
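As a smoke test against the running container, here is a minimal sketch with the MCP Python SDK (`pip install mcp`), assuming the container serves MCP at `/mcp` as the HuggingFace Spaces deployment below does:

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # connect to the container started above and print the server identity
    async with streamablehttp_client("http://localhost:7860/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            result = await session.initialize()
            print(result.serverInfo.name, result.serverInfo.version)

asyncio.run(main())
```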
### From Source
```bash
# Clone repository
git clone https://github.com/wandb/wandb-mcp-server
cd wandb-mcp-server
# Install and run
uv pip install -r requirements.txt
uv run app.py
```
### Deploy to HuggingFace Spaces
1. Fork [wandb-mcp-server](https://github.com/wandb/wandb-mcp-server)
2. Create new Space on [Hugging Face](https://huggingface.co/spaces)
3. Choose "Docker" SDK
4. Connect your fork
5. Add `WANDB_API_KEY` as a secret (optional)
Server URL: `https://YOUR-SPACE.hf.space/mcp`
</details>
---
## More Information
### Architecture & Performance
The W&B MCP Server uses a **pure stateless architecture** for excellent performance:
| Metric | Performance |
|--------|------------|
| **Concurrent Connections** | 500+ (hosted) / 1000+ (local) |
| **Throughput** | ~35 req/s (hosted) / ~50 req/s (local) |
| **Success Rate** | 100% up to capacity |
| **Scaling** | Horizontal (add workers) |
> 📖 See [Architecture Guide](ARCHITECTURE.md) for technical details
### Key Resources
- **W&B Docs**: [docs.wandb.ai](https://docs.wandb.ai)
- **Weave Docs**: [weave-docs.wandb.ai](https://weave-docs.wandb.ai)
- **MCP Spec**: [modelcontextprotocol.io](https://modelcontextprotocol.io)
- **GitHub**: [github.com/wandb/wandb-mcp-server](https://github.com/wandb/wandb-mcp-server)
### Example Code
<details>
<summary>Complete OpenAI Example</summary>
```python
from openai import OpenAI
from dotenv import load_dotenv
import os

load_dotenv()
client = OpenAI()

resp = client.responses.create(
    model="gpt-4o",  # Use gpt-4o for larger context window
    tools=[
        {
            "type": "mcp",
            "server_label": "wandb",
            "server_description": "Query W&B data",
            "server_url": "https://mcp.withwandb.com/mcp",
            "authorization": os.getenv("WANDB_API_KEY"),
            "require_approval": "never",
        },
    ],
    input="How many traces are in wandb-smle/hiring-agent-demo-public?",
)
print(resp.output_text)
```
</details>
### Support
- [GitHub Issues](https://github.com/wandb/wandb-mcp-server/issues)
- [W&B Community](https://community.wandb.ai)
- [W&B Support](https://wandb.ai/support)