🌾 Alert Summary Prototype Backend
Multi-stage MCP Pipeline for Agricultural Intelligence
🎯 Overview
Farmer.Chat uses a 4-stage pipeline to process agricultural queries:
Query → Router → Executor (Parallel) → Compiler → Translator → Advice
Architecture
- Stage 1: Query Router - Analyzes farmer's question and selects relevant MCP servers
- Stage 2: MCP Executor - Calls multiple APIs in parallel (weather, soil, water, elevation, pests)
- Stage 3: Response Compiler - Merges data from all sources
- Stage 4: Farmer Translator - Converts technical data to actionable farmer advice
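The flow of these four stages can be sketched in Python. This is a minimal illustration with hypothetical function names and stubbed data, not the actual implementation:

```python
# Illustrative sketch of the 4-stage pipeline; all names and the routing
# heuristics are hypothetical, not the real Farmer.Chat code.

def route(query: str) -> list[str]:
    """Stage 1: pick relevant MCP servers for the question."""
    servers = []
    if "rain" in query or "plant" in query:
        servers += ["weather", "soil_properties"]
    return servers or ["weather"]

def execute(servers: list[str], location: dict) -> dict:
    """Stage 2: call each selected server (in parallel in the real pipeline)."""
    return {s: {"source": s, "location": location["name"]} for s in servers}

def compile_data(results: dict) -> dict:
    """Stage 3: merge raw data from all sources."""
    return {"merged": results, "sources": sorted(results)}

def translate(compiled: dict) -> str:
    """Stage 4: turn technical data into farmer-facing advice."""
    return f"Advice based on {', '.join(compiled['sources'])} data."

def run_pipeline(query: str, location: dict) -> str:
    return translate(compile_data(execute(route(query), location)))
```

For example, `run_pipeline("Should I plant rice today?", {"name": "Bangalore"})` routes to the weather and soil servers and returns advice built from both.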
📡 API Endpoints
POST /api/query
Process a farmer's question
Request:
    {
      "query": "Should I plant rice today?",
      "location": {
        "name": "Bangalore",
        "lat": 12.8716,
        "lon": 77.4946
      }
    }
Response:
    {
      "success": true,
      "query": "Should I plant rice today?",
      "advice": "...",
      "routing": {...},
      "data": {...},
      "execution_time_seconds": 3.5
    }
POST /api/export-pdf
Export query result as PDF
Request: Same as /api/query
Response: PDF file download
GET /api/health
Health check
GET /api/servers
List available MCP servers
🛠️ MCP Servers
| Server | Data Source | Information |
|---|---|---|
| weather | Open-Meteo | Current weather, 7-day forecasts |
| soil_properties | SoilGrids | Clay, sand, pH, nutrients |
| water | GRACE Satellite | Groundwater levels, drought status |
| elevation | OpenElevation | Field elevation, terrain data |
| pests | iNaturalist | Recent pest observations |
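As an illustration, the weather server's request to Open-Meteo's public forecast endpoint could be built like this. The parameter set is an assumption based on Open-Meteo's documented API, not necessarily what this backend sends:

```python
from urllib.parse import urlencode

# Sketch of how the weather server might build its Open-Meteo request.
# Parameter names follow Open-Meteo's public forecast API; the exact set
# used by this backend is an assumption.
OPEN_METEO = "https://api.open-meteo.com/v1/forecast"

def weather_url(lat: float, lon: float, forecast_days: int = 7) -> str:
    params = {
        "latitude": lat,
        "longitude": lon,
        "current_weather": "true",
        "forecast_days": forecast_days,
    }
    return f"{OPEN_METEO}?{urlencode(params)}"
```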
🚀 Deployment Instructions
1. Create Hugging Face Space
- Go to https://huggingface.co/new-space
- Space name: `farmer-chat-backend`
- Owner: `aakashdg`
- SDK: Gradio (we'll use FastAPI inside)
- Set to Public
2. Upload Files
Upload all files maintaining this structure:
    farmer-chat-backend/
    ├── app.py
    ├── requirements.txt
    ├── README.md
    └── src/
        ├── __init__.py
        ├── pipeline.py
        ├── router.py
        ├── executor.py
        ├── compiler.py
        ├── translator.py
        ├── pdf_generator.py
        └── servers/
            ├── __init__.py
            └── (all server classes in one file or separate)
3. Set Environment Variables
In Space Settings → Variables and secrets:
- Add secret: `OPENAI_API_KEY` = your OpenAI API key
4. Deploy!
Space will auto-deploy. Access at:
https://huggingface.co/spaces/aakashdg/farmer-chat-backend
🧪 Testing
Test with cURL:
    curl -X POST https://huggingface.co/spaces/aakashdg/farmer-chat-backend/api/query \
      -H "Content-Type: application/json" \
      -d '{
        "query": "What is the soil composition?",
        "location": {"name": "Bangalore", "lat": 12.8716, "lon": 77.4946}
      }'
Test with Python:
    import requests

    response = requests.post(
        "https://huggingface.co/spaces/aakashdg/farmer-chat-backend/api/query",
        json={
            "query": "Will it rain this week?",
            "location": {"name": "Bangalore", "lat": 12.8716, "lon": 77.4946}
        }
    )
    print(response.json())
📊 Performance
- Parallel execution: All MCP servers called simultaneously
- Typical response time: 3-5 seconds
- Success rate: ~95% (graceful degradation if servers fail)
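The parallel fan-out with graceful degradation can be sketched with `asyncio.gather`. Server names come from the table above; the coroutine bodies are placeholders that simulate one failing server:

```python
import asyncio

# Sketch of parallel MCP calls with graceful degradation: one failing
# server does not sink the others. Coroutine bodies are placeholders.

async def call_server(name: str) -> dict:
    if name == "pests":
        raise TimeoutError("iNaturalist timed out")  # simulated failure
    await asyncio.sleep(0)  # real code would await an HTTP request here
    return {"server": name, "ok": True}

async def execute_all(servers: list[str]) -> tuple[dict, list[str]]:
    # return_exceptions=True keeps successful results even when some fail
    results = await asyncio.gather(
        *(call_server(s) for s in servers), return_exceptions=True
    )
    data, failed = {}, []
    for name, res in zip(servers, results):
        if isinstance(res, Exception):
            failed.append(name)  # surfaced as failed_servers in the response
        else:
            data[name] = res
    return data, failed

data, failed = asyncio.run(execute_all(["weather", "soil_properties", "pests"]))
```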
🔒 Security
- OpenAI API key stored as HF Space secret
- CORS enabled for frontend integration
- Rate limiting: 100 queries/hour per IP (configurable)
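Assuming the backend runs FastAPI inside the Gradio Space (as the deployment steps note), the CORS setup could look like the following config fragment; the origin list is a placeholder:

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# "*" is a placeholder; a real deployment should list the actual
# frontend origin(s) instead of allowing everything.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["GET", "POST"],
    allow_headers=["Content-Type"],
)
```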
📈 Scaling
To add more MCP servers:
- Create a new server class in `src/servers/`
- Add it to `MCP_SERVER_REGISTRY` in `executor.py`
- The router will automatically include it in routing decisions
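A sketch of that registration step. The `MCP_SERVER_REGISTRY` name comes from the steps above, but the class interface and the example server are assumptions:

```python
# Hypothetical new MCP server; the fetch() signature and the name/
# description fields are illustrative, not the backend's actual interface.

class CropCalendarServer:
    """Example server: sowing and harvest windows for common crops."""
    name = "crop_calendar"
    description = "Sowing and harvest windows for common crops"

    def fetch(self, lat: float, lon: float) -> dict:
        return {"crop": "rice", "sowing_window": "Jun-Jul"}

MCP_SERVER_REGISTRY = {
    # ...existing entries (weather, soil_properties, water, ...) elided
}
MCP_SERVER_REGISTRY["crop_calendar"] = CropCalendarServer()
```

The router can then surface the new server by iterating the registry's `name`/`description` fields when deciding which sources a query needs.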
🔧 Troubleshooting
"OPENAI_API_KEY not set"
- Check HF Space Settings → Variables and secrets
- Ensure the secret name is exactly `OPENAI_API_KEY`
Slow responses
- Normal for first query (cold start)
- Subsequent queries faster due to caching
Server failures
- System uses graceful degradation
- If one server fails, others still provide data
- Check `failed_servers` in the response
📞 Support
- GitHub Issues: [Link to repo]
- Creator: @aakashdg
- Built for: Farmer.chat product demo
📄 License
MIT License
Built with ❤️ for farmers