Commit: Update

Files changed:
- .space/config.json +6 -0
- Dockerfile +10 -5
- HF_SPACE.md +36 -0
- README.md +25 -324
- app.py +28 -0
- requirements.txt +2 -11
- run_ui.py +21 -7
- src/clients/mcp_client.py +51 -18
- src/ui/app.py +106 -111
- src/ui/formatters.py +72 -82
.space/config.json
ADDED
@@ -0,0 +1,6 @@
+{
+  "sdk": "gradio",
+  "env": [
+    "OPENAI_API_KEY="
+  ]
+}
Dockerfile
CHANGED
@@ -1,4 +1,4 @@
-# EcoMCP - Docker Image
+# EcoMCP - Docker Image for Hugging Face Spaces
 # Fast, beautiful, production-grade MCP server
 
 FROM python:3.11-slim
@@ -8,28 +8,33 @@ WORKDIR /app
 # Install system dependencies
 RUN apt-get update && apt-get install -y --no-install-recommends \
     curl \
+    git \
+    gcc \
     && rm -rf /var/lib/apt/lists/*
 
 # Copy requirements
 COPY requirements.txt .
 
 # Install Python dependencies
-RUN pip install --no-cache-dir
+RUN pip install --no-cache-dir --upgrade pip && \
+    pip install --no-cache-dir -r requirements.txt
 
 # Copy application
-COPY
-COPY run_ui.py .
+COPY . .
 
 # Create non-root user
 RUN useradd -m -u 1000 ecomcp && chown -R ecomcp:ecomcp /app
 USER ecomcp
 
+# Set environment variable for Hugging Face Spaces
+ENV HUGGINGFACE_SPACES=1
+
 # Expose port
 EXPOSE 7860
 
 # Health check
 HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
-    CMD curl -f http://localhost:7860/
+    CMD curl -f http://localhost:7860/ || exit 1
 
 # Run UI by default
 CMD ["python", "run_ui.py"]
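Note on the new health check: it only probes the Gradio root over HTTP, so it can be reproduced outside Docker. A minimal sketch, assuming the UI is already running on port 7860 and using httpx (which requirements.txt already pins); this check is illustrative and not part of the commit:

    # sketch: same probe as the Dockerfile HEALTHCHECK (curl -f http://localhost:7860/)
    import httpx

    resp = httpx.get("http://localhost:7860/", timeout=10)
    resp.raise_for_status()  # a non-2xx status is what would mark the container unhealthy
    print("UI reachable:", resp.status_code)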
HF_SPACE.md
ADDED
@@ -0,0 +1,36 @@
+# EcoMCP - E-commerce Intelligence Platform
+
+EcoMCP is a powerful e-commerce intelligence platform built on the Model Context Protocol (MCP). It provides AI-driven insights and recommendations to help you make better business decisions.
+
+## Features
+
+- 📦 Product Analysis: Discover market opportunities and competitive positioning
+- Review Intelligence: Extract actionable insights from customer feedback
+- ✍️ Listing Generation: Create high-converting product descriptions
+- 💰 Smart Pricing: Optimize profit margins with data-driven pricing strategies
+- 🎯 Competitive Analysis: Analyze market positioning and differentiation strategies
+
+## Technical Details
+
+- **Platform:** Built with Gradio 6.0+
+- **Protocol:** JSON-RPC 2.0 compliant
+- **AI Model:** OpenAI GPT-4-Turbo
+- **Knowledge Base:** LlamaIndex with semantic search
+
+## Environment Variables
+
+To use the full MCP functionality, you'll need to set your OpenAI API key:
+
+- `OPENAI_API_KEY`: Your OpenAI API key for AI-powered analysis
+
+## How to Use
+
+1. Enter product information or customer reviews in the relevant sections
+2. Click the appropriate analyze button
+3. Review the AI-generated insights and recommendations
+
+## About MCP
+
+This application demonstrates the Model Context Protocol (MCP) for integrating AI models with application-specific tools and knowledge bases. MCP enables rich, contextual AI interactions for specialized domains.
+
+Built for the MCP 1st Birthday Hackathon - Track 1: Building MCP
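The new docs describe OPENAI_API_KEY but do not show how the app picks it up. A minimal sketch of the usual pattern, assuming a local .env file and python-dotenv (already listed in requirements.txt); on Spaces the secret is injected straight into the environment, so load_dotenv() is effectively a no-op there:

    # sketch: how OPENAI_API_KEY is typically resolved locally vs. on Spaces
    import os
    from dotenv import load_dotenv

    load_dotenv()  # reads .env if present; harmless when the variable is already set
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("Set OPENAI_API_KEY (Space secret or .env) to enable the AI tools")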
README.md
CHANGED
@@ -1,341 +1,42 @@
 ---
-title: EcoMCP - E-
-emoji:
+title: EcoMCP - E-commerce Intelligence Platform
+emoji: π
 colorFrom: blue
-colorTo:
+colorTo: gray
 sdk: gradio
-sdk_version:
-app_file:
+sdk_version: 4.44.1
+app_file: app.py
 pinned: false
-
-- building-mcp-track-enterprise
-- ecommerce
-- llm
-- openai
-- mcp
+license: mit
 ---
 
-# EcoMCP - E-
-
-
-## 🎯 Overview
-
-EcoMCP is an MCP server designed for e-commerce businesses with AI-powered tools for:
-- **Product Analysis**: Market insights and competitive positioning
-- **Review Analysis**: Sentiment analysis and customer insights
-- **Listing Generation**: Conversion-optimized product copy
-- **Price Recommendation**: Strategic pricing strategies
-- **Competitor Analysis**: Market positioning opportunities
-
----
-
-## ✨ Key Features
-
-- **7 AI-Powered Tools** - Comprehensive e-commerce analysis with knowledge base search
-- **OpenAI GPT-4 Turbo Integration** - Intelligent insights powered by latest models
-- **Beautiful Gradio Interface** - Modern, responsive UI with polished design
-- **Production-Ready** - Input validation, error handling, and logging
-- **Fully Async** - Non-blocking I/O with sync/async bridges for Gradio
-- **Centralized Configuration** - Environment-based settings management
-- **Knowledge Base Integration** - LlamaIndex semantic search across products and docs
-
----
-
-## Quick Start
-
-### Prerequisites
-- Python 3.8+
-- OpenAI API key (from https://platform.openai.com/api-keys)
-
-### Local Development
-
-```bash
-# Clone repository
-git clone <repo-url>
-cd ecomcp
-
-
-pip install -r requirements.txt
-
-
-python3 run_ui.py
-```
-
-Visit: **http://localhost:7860**
-
-### On HuggingFace Spaces
-
-1. Space builds automatically (2-3 minutes)
-2. Add `OPENAI_API_KEY` to Repository secrets
-3. Click "Open in iframe"
-4. Start using the tools!
-
----
-
-## Tools Included
-
-### 1. **Analyze Product**
-Comprehensive market analysis including:
-- Market positioning and competitive advantages
-- Target audience insights
-- Pricing analysis
-
-### 2. **Analyze Reviews**
-Customer sentiment analysis:
-- Sentiment distribution (positive/neutral/negative)
-- Key strengths and weaknesses
-- Recommendations for improvement
-
-### 3. **Generate Listing**
-AI-powered product copy:
-- Multiple tone styles (professional, casual, technical)
-- Conversion-optimized content
-- SEO-friendly descriptions
-
-### 4. **Price Recommendation**
-Strategic pricing guidance:
-- Market-based pricing suggestions
-- Discount strategies
-- Bundle recommendations
-
-### 5. **Competitor Analysis**
-Market intelligence:
-- Competitive positioning
-- Differentiation opportunities
-- Market trends analysis
-
----
-
-
-Create a `.env` file or add to Space Repository secrets:
-
-```env
-OPENAI_API_KEY=sk-your-api-key-here
-```
-
-### HuggingFace Spaces Setup
-
-In your Space Settings:
-1. **Repository secrets** → Add `OPENAI_API_KEY`
-2. Space automatically reloads with the secret
-
----
-
-## 📦 Requirements
-
-- **Python**: 3.8+
-- **Gradio**: 5.0.0+
-- **OpenAI API**: Active account with credits
-- **Memory**: ~300MB RAM
-- **Storage**: ~500MB
-
-### Core Dependencies
-
-```
-gradio>=5.0.0,<6.0.0
-huggingface-hub>=0.16.0,<0.20.0
-openai>=1.0.0
-python-dotenv>=1.0.0
-pydantic>=2.5.0
-```
-
----
-
-## Documentation
-
-- **[IMMEDIATE_ACTION_ITEMS.txt](IMMEDIATE_ACTION_ITEMS.txt)** - Quick checklist (2 min)
-- **[docs/deployment/DEPLOYMENT_FINAL_SUMMARY.txt](docs/deployment/DEPLOYMENT_FINAL_SUMMARY.txt)** - Complete summary (5 min)
-- **[docs/deployment/START_HF_SPACES.md](docs/deployment/START_HF_SPACES.md)** - 5-minute quick start
-- **[docs/deployment/DEPLOYMENT_COMPLETE.md](docs/deployment/DEPLOYMENT_COMPLETE.md)** - Post-deployment guide
-- **[docs/deployment/HUGGINGFACE_DEPLOYMENT.md](docs/deployment/HUGGINGFACE_DEPLOYMENT.md)** - Detailed deployment guide
-- **[docs/deployment/PRE_DEPLOYMENT_CHECKLIST.md](docs/deployment/PRE_DEPLOYMENT_CHECKLIST.md)** - Verification checklist
-
-### API & Technical Documentation
-
-- **[FIX_GRADIO_IMPORT_ERROR.md](FIX_GRADIO_IMPORT_ERROR.md)** - Troubleshooting guide
-- **[HF_SPACES_FIX.txt](HF_SPACES_FIX.txt)** - Quick reference
-
----
-
-## Architecture
-
-### Technology Stack
-
-- **Framework**: Gradio 5.x (Web UI)
-- **Protocol**: MCP (Model Context Protocol)
-- **AI**: OpenAI GPT-4/5.1 API
-- **Language**: Python 3.8+
-- **Async**: asyncio with async/await
-
-### Project Structure
-
-```
-ecomcp/
-├── run_ui.py              # Entry point
-├── requirements.txt       # Dependencies
-├── src/
-│   ├── ui/
-│   │   ├── app.py         # Gradio interface
-│   │   └── components.py  # UI components
-│   ├── server/
-│   │   ├── mcp_server.py  # MCP protocol
-│   │   └── tools.py       # Tool definitions
-│   ├── clients/
-│   │   └── mcp_client.py  # JSON-RPC client
-│   └── core/
-│       └── ...            # Core functionality
-├── docs/                  # Documentation
-└── tests/                 # Test suite
-```
-
----
-
-##
-
-- Check Python version (3.8+ required)
-- Verify all dependencies: `pip install -r requirements.txt`
-- Check for error messages in console
-
-### Tools don't work
-- Verify `OPENAI_API_KEY` is set
-- Check OpenAI account has active credits
-- Review OpenAI API usage limits
-
-### Gradio interface issues
-- Try refreshing the page
-- Clear browser cache
-- Check browser console for errors
-
-### HuggingFace Spaces specific
-- Check Space logs: Settings → Logs
-- Verify `OPENAI_API_KEY` is in Repository secrets
-- Ensure Space has enough memory
-
-**For detailed troubleshooting**: See [FIX_GRADIO_IMPORT_ERROR.md](FIX_GRADIO_IMPORT_ERROR.md)
-
----
-
-## Performance
-
-- **Build time**: 2-3 minutes (first time)
-- **Startup time**: 30-60 seconds
-- **Interface load**: <5 seconds
-- **Tool response**: 2-5 seconds (includes OpenAI latency)
-- **Concurrent users**: 10-50+ (depends on hardware)
-
----
-
-## Links & Resources
-
-- **[MCP Specification](https://modelcontextprotocol.io/)** - Model Context Protocol docs
-- **[Gradio Documentation](https://www.gradio.app/)** - Gradio framework docs
-- **[OpenAI API](https://platform.openai.com/docs)** - API reference
-- **[HuggingFace Spaces](https://huggingface.co/docs/hub/spaces)** - Spaces documentation
-- **[Hackathon](https://huggingface.co/MCP-1st-Birthday)** - MCP 1st Birthday Hackathon
-
----
-
-## 🎯 Hackathon Submission
-
-**Track**: Building MCP (Track 1)
-**Category**: Enterprise Applications
-**Status**: ✅ Production Ready
-
-### Requirements Met
-
-- ✅ Fully functional MCP server
-- ✅ Multiple integrated tools (5 tools)
-- ✅ Beautiful Gradio interface
-- ✅ Complete documentation
-- ✅ Production-ready code
-- ✅ OpenAI API integration
-
----
-
-## What's Next?
-
-### Immediate
-1. Add `OPENAI_API_KEY` to Repository secrets
-2. Test all 5 tools
-3. Verify everything works
-
-### Short-term
-1. Record demo video (1-5 minutes)
-2. Post on social media
-3. Gather user feedback
-
-### Before Deadline (Nov 30)
-1. Prepare submission materials
-2. Document achievements
-3. Submit to hackathon
-
----
-
-## 💬 Support
-
-### Questions or Issues?
-
-1. **Quick help**: Read [IMMEDIATE_ACTION_ITEMS.txt](IMMEDIATE_ACTION_ITEMS.txt)
-2. **Deployment issues**: Read [docs/deployment/DEPLOYMENT_COMPLETE.md](docs/deployment/DEPLOYMENT_COMPLETE.md)
-3. **Technical problems**: Read [FIX_GRADIO_IMPORT_ERROR.md](FIX_GRADIO_IMPORT_ERROR.md)
-4. **Space logs**: Settings → Logs (for error details)
-
----
-
-## ✅ Status
-
-- **Build Status**: ✅ Passing
-- **Deployment**: ✅ Live on HF Spaces
-- **Documentation**: ✅ Complete
-- **Testing**: ✅ Verified
-- **Status**: ✅ Production Ready
-
----
-
-## License
-
-MIT License - See LICENSE file for details
-
----
-
-## Acknowledgments
-
-- **Anthropic** - Model Context Protocol
-- **HuggingFace** - Spaces hosting platform
-- **OpenAI** - API integration
-- **Gradio** - Web interface framework
-
----
-
-## Quick Links
-
-| Item | Link |
-|------|------|
-| **Space URL** | https://huggingface.co/spaces/MCP-1st-Birthday/ecomcp |
-| **OpenAI Keys** | https://platform.openai.com/api-keys |
-| **Hackathon** | https://huggingface.co/MCP-1st-Birthday |
-| **MCP Docs** | https://modelcontextprotocol.io/ |
-
----
-
-**Last Updated**: November 27, 2025
-**Version**: 1.0.0
-**Status**: Production Ready
-
----
-
+# EcoMCP - E-commerce Intelligence Platform
+
+EcoMCP is a powerful e-commerce intelligence platform built on the Model Context Protocol (MCP). It provides AI-driven insights and recommendations to help you make better business decisions.
+
+## Features
+
+- 📦 Product Analysis: Discover market opportunities and competitive positioning
+- Review Intelligence: Extract actionable insights from customer feedback
+- ✍️ Listing Generation: Create high-converting product descriptions
+- 💰 Smart Pricing: Optimize profit margins with data-driven pricing strategies
+- 🎯 Competitive Analysis: Analyze market positioning and differentiation strategies
+
+## Technical Details
+
+- **Platform:** Built with Gradio
+- **Protocol:** JSON-RPC 2.0 compliant
+- **AI Model:** OpenAI GPT-4-Turbo
+- **Knowledge Base:** LlamaIndex with semantic search
+
+## Environment Variables
+
+To use the full MCP functionality, you'll need to set your OpenAI API key:
+
+- `OPENAI_API_KEY`: Your OpenAI API key for AI-powered analysis
+
+## About MCP
+
+This application demonstrates the Model Context Protocol (MCP) for integrating AI models with application-specific tools and knowledge bases. MCP enables rich, contextual AI interactions for specialized domains.
+
+Built for the MCP 1st Birthday Hackathon - Track 1: Building MCP
app.py
ADDED
@@ -0,0 +1,28 @@
+"""
+Hugging Face Space Entry Point
+This file is the main entry point for Hugging Face Spaces deployment
+"""
+
+import os
+import sys
+
+# Add the project root to the Python path
+sys.path.insert(0, os.path.dirname(__file__))
+
+from src.ui.app import create_app
+
+# Create the Gradio app
+app = create_app()
+
+# For Hugging Face Spaces, we need to make the app available at the module level
+# The Space will look for a variable named "app"
+demo = app
+
+if __name__ == "__main__":
+    # This is only used if running directly, not in Hugging Face Spaces
+    app.launch(
+        server_name="0.0.0.0",
+        server_port=7860,
+        show_error=True,
+        share=False
+    )
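For a quick local check of the new entry point, the module can be imported and the exposed Blocks object launched directly; the commit makes it available under both names, app and demo. This snippet is illustrative only and not part of the commit:

    # sketch: local smoke test for the Spaces entry point
    import gradio as gr
    import app as space_entry

    assert isinstance(space_entry.demo, gr.Blocks)  # create_app() is annotated to return gr.Blocks
    space_entry.demo.launch(server_name="0.0.0.0", server_port=7860)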
requirements.txt
CHANGED
@@ -1,19 +1,10 @@
-gradio>=
+gradio>=4.0.0,<4.45.0
 gradio-client>=0.16.0
-huggingface-hub>=0.33.5,<1.0.0
 httpx>=0.25.0
 aiohttp>=3.9.0
 openai>=1.0.0
 tiktoken>=0.5.0
-llama-index>=0.9.0
-llama-index-embeddings-openai>=0.1.0
-llama-index-vector-stores-pinecone>=0.1.0
 python-dotenv>=1.0.0
 pydantic>=2.5.0
 pydantic-settings>=2.0.0
-
-pytest>=7.4.0
-pytest-asyncio>=0.21.0
-pytest-cov>=4.1.0
-black>=23.0.0
-ruff>=0.1.0
+huggingface-hub>=0.20.0
run_ui.py
CHANGED
@@ -38,13 +38,27 @@ if __name__ == "__main__":
         logger.info("Creating Gradio app...")
         app = create_app()
 
-
-
-
-
-
-
-
+        # Check if running in Hugging Face Space environment
+        import os
+        is_hf_space = os.environ.get("HUGGINGFACE_SPACES") == "1"
+
+        if is_hf_space:
+            logger.info("Launching EcoMCP in Hugging Face Space environment")
+            app.launch(
+                server_name="0.0.0.0",
+                server_port=7860,
+                show_error=True,
+                share=False,
+                show_tips=True
+            )
+        else:
+            logger.info("Launching EcoMCP at http://localhost:7860")
+            app.launch(
+                server_name="0.0.0.0",
+                server_port=7860,
+                show_error=True,
+                share=False
+            )
     except ImportError as e:
         logger.error(f"Failed to import required module: {e}")
         logger.error("Make sure all dependencies are installed: pip install -r requirements.txt")
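The two launch branches differ only in the log message and the extra show_tips=True flag, and the branch is selected by the HUGGINGFACE_SPACES variable that the Dockerfile now exports. A small illustrative sketch (not part of the commit) of exercising the Spaces code path outside Docker:

    # sketch: force the Hugging Face Spaces code path locally
    import os
    import runpy

    os.environ["HUGGINGFACE_SPACES"] = "1"   # same value the Dockerfile sets via ENV
    runpy.run_path("run_ui.py", run_name="__main__")  # executes the __main__ block above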
src/clients/mcp_client.py
CHANGED
@@ -3,6 +3,7 @@ EcoMCP Client - Handles communication with MCP server via JSON-RPC
 """
 
 import json
+import os
 import sys
 import subprocess
 import asyncio
@@ -11,24 +12,41 @@ from typing import Dict, Any, Optional
 
 class MCPClient:
     """JSON-RPC client for MCP server"""
 
     def __init__(self, server_script: str = "ecomcp_server.py"):
         self.server_script = server_script
         self.process: Optional[subprocess.Popen] = None
         self.request_id = 0
-
+        # Check if we're in Hugging Face Space environment
+        self.is_hf_space = os.environ.get("HUGGINGFACE_SPACES") == "1"
+
     async def start_server(self) -> None:
         """Start MCP server process"""
+        # In Hugging Face Spaces, we may need to handle this differently
         if self.process is None:
             try:
-
-
-
-
-
-
-
-
+                # For Hugging Face Spaces, use a more robust approach
+                cmd = [sys.executable, self.server_script]
+                if self.is_hf_space:
+                    # In Hugging Face Spaces, make sure to handle environment properly
+                    self.process = subprocess.Popen(
+                        cmd,
+                        stdin=subprocess.PIPE,
+                        stdout=subprocess.PIPE,
+                        stderr=subprocess.PIPE,
+                        text=True,
+                        bufsize=1,
+                        universal_newlines=True
+                    )
+                else:
+                    self.process = subprocess.Popen(
+                        cmd,
+                        stdin=subprocess.PIPE,
+                        stdout=subprocess.PIPE,
+                        stderr=subprocess.PIPE,
+                        text=True,
+                        bufsize=1
+                    )
             except Exception as e:
                 print(f"Server error: {e}")
                 raise
@@ -40,16 +58,16 @@
     ) -> Dict[str, Any]:
         """
         Send JSON-RPC request to server
 
         Args:
             method: RPC method name
             params: Method parameters
 
         Returns:
             Server response
         """
         await self.start_server()
 
         self.request_id += 1
         message = {
             "jsonrpc": "2.0",
@@ -57,20 +75,35 @@
             "params": params or {},
             "id": self.request_id
         }
 
         try:
             if not self.process or not self.process.stdin:
                 raise RuntimeError("Server process not initialized")
 
+            # Write the message to the server
             self.process.stdin.write(json.dumps(message) + "\n")
             self.process.stdin.flush()
 
+            # Try to read the response
             response_line = self.process.stdout.readline()
             if response_line:
-                return json.loads(response_line)
+                return json.loads(response_line.strip())
         except Exception as e:
             print(f"Request error: {e}")
+            # If in Hugging Face Space and we get an error, try to restart the server
+            if self.is_hf_space:
+                try:
+                    self.stop()
+                    await self.start_server()
+                    # Retry the request once
+                    self.process.stdin.write(json.dumps(message) + "\n")
+                    self.process.stdin.flush()
+                    response_line = self.process.stdout.readline()
+                    if response_line:
+                        return json.loads(response_line.strip())
+                except Exception as retry_error:
+                    print(f"Retry also failed: {retry_error}")
+
         return {"error": "Failed to communicate with server"}
 
     def stop(self) -> None:
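The client above just frames JSON-RPC 2.0 messages over the server process's stdin/stdout. A minimal usage sketch, assuming the server path that src/ui/app.py now passes in; the "tools/call" method name and the "analyze_product" arguments are illustrative assumptions, not confirmed by this diff:

    # sketch: driving MCPClient directly
    import asyncio
    from src.clients.mcp_client import MCPClient

    async def main():
        client = MCPClient(server_script="src/server/mcp_server.py")
        try:
            response = await client.send_request(
                "tools/call",  # assumed method name
                {"name": "analyze_product", "arguments": {"description": "Wireless earbuds"}},
            )
            # JSON-RPC 2.0 reply: {"jsonrpc": "2.0", "id": 1, "result": ...}, or the client's
            # fallback {"error": "Failed to communicate with server"}
            print(response)
        finally:
            client.stop()

    asyncio.run(main())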
src/ui/app.py
CHANGED
@@ -19,8 +19,14 @@ try:
 except ImportError:
     LLAMAINDEX_AVAILABLE = False
 
+import os
+
+# Check if running in Hugging Face Space environment
+is_hf_space = os.environ.get("HUGGINGFACE_SPACES") == "1"
+
 # Initialize client and handler
-
+server_script_path = "src/server/mcp_server.py"
+client = MCPClient(server_script=server_script_path)
 handler = ToolCallHandler(client)
 
 # Initialize knowledge base if available
@@ -30,16 +36,20 @@ if LLAMAINDEX_AVAILABLE:
         kb = EcoMCPKnowledgeBase()
         if os.path.exists("./docs"):
             kb.initialize("./docs")
+            print("Knowledge base initialized successfully")
     except Exception as e:
         print(f"Warning: Could not initialize knowledge base: {e}")
         kb = None
 
 
 def create_theme() -> gr.themes.Base:
-    """Create
-    return gr.themes.
-        primary_hue=
-
+    """Create a minimal, clean Gradio theme without gradients or colorful elements"""
+    return gr.themes.Default(
+        primary_hue=gr.themes.colors.neutral,
+        neutral_hue=gr.themes.colors.neutral,
+        font=["ui-sans-serif", "system-ui", "sans-serif"],
+        spacing_size=gr.themes.sizes.spacing.md,
+        radius_size=gr.themes.sizes.radius.sm,
     )
 
 
@@ -56,83 +66,81 @@ def create_app() -> gr.Blocks:
         * {
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
        }
 
-        /* Header section */
+        /* Header section - minimal styling */
         .header-container {
             text-align: center;
-            padding:
-            background:
-            color:
-            border-radius:
+            padding: 2rem 1rem;
+            background: #ffffff;
+            color: #333333;
+            border-radius: 8px;
             margin-bottom: 2rem;
+            border: 1px solid #e5e7eb;
         }
 
         .header-container h1 {
             margin: 0 0 0.5rem 0;
-            font-size:
-            font-weight:
+            font-size: 2rem;
+            font-weight: 600;
+            color: #1f2937;
         }
 
         .header-container p {
             margin: 0.25rem 0;
-            font-size:
-            font-weight:
+            font-size: 1rem;
+            color: #6b7280;
+            font-weight: 400;
         }
 
         .header-subtitle {
-            font-size: 0.
+            font-size: 0.9rem;
+            color: #9ca3af;
             margin-top: 0.5rem;
         }
 
         /* Tab styling */
         .tool-tab {
             padding: 0;
         }
 
         /* Tool section */
         .tool-section {
-            padding:
-            background:
-            border-radius:
+            padding: 1.5rem;
+            background: #ffffff;
+            border-radius: 8px;
             border: 1px solid #e5e7eb;
-            transition: all 0.3s ease;
+            margin-bottom: 1rem;
         }
 
         .tool-section:hover {
-            box-shadow: 0 4px 12px rgba(0, 0, 0, 0.08);
             border-color: #d1d5db;
         }
 
         .tool-section h3 {
             margin: 0 0 0.5rem 0;
-            font-size: 1.
+            font-size: 1.25rem;
             font-weight: 600;
             color: #1f2937;
         }
 
         .tool-description {
             color: #6b7280;
-            font-size: 0.
-            margin-bottom:
+            font-size: 0.9rem;
+            margin-bottom: 1rem;
             line-height: 1.5;
         }
 
         /* Output box */
         .output-box {
             background: #f9fafb;
-            padding:
-            border-radius:
-            border-left:
+            padding: 1rem;
+            border-radius: 4px;
+            border-left: 2px solid #9ca3af;
             border: 1px solid #e5e7eb;
             margin-top: 1rem;
             line-height: 1.6;
         }
 
         .output-box h1,
         .output-box h2,
         .output-box h3 {
@@ -140,156 +148,143 @@ def create_app() -> gr.Blocks:
             margin-top: 1rem;
             margin-bottom: 0.5rem;
         }
 
         .output-box h1 {
-            border-bottom:
+            border-bottom: 1px solid #e5e7eb;
             padding-bottom: 0.5rem;
         }
 
         /* Success and error states */
         .success {
             color: #059669;
         }
 
         .error {
             color: #dc2626;
         }
 
         .warning {
             color: #d97706;
         }
 
         /* Input labels */
         .input-label {
-            font-weight:
-            color: #
-            margin-bottom: 0.
-            font-size: 0.
+            font-weight: 500;
+            color: #374151;
+            margin-bottom: 0.25rem;
+            font-size: 0.9rem;
         }
 
         /* Button styling */
         .gr-button {
-            font-weight:
-            border-radius:
-            transition: all 0.2s ease;
+            font-weight: 500;
+            border-radius: 4px;
             text-transform: none;
             letter-spacing: 0;
         }
 
-        .gr-button:hover {
-            transform: translateY(-1px);
-            box-shadow: 0 4px 12px rgba(0, 0, 0, 0.15);
-        }
-
-        .gr-button:active {
-            transform: translateY(0);
-        }
-
         /* Examples section */
         .examples-section {
-            background: #
-            border-radius:
-            padding:
-            margin-top:
-            border: 1px solid #
+            background: #f9fafb;
+            border-radius: 4px;
+            padding: 0.75rem;
+            margin-top: 1rem;
+            border: 1px solid #e5e7eb;
         }
 
         /* About section */
         .about-container {
             max-width: 900px;
             margin: 0 auto;
-            padding:
+            padding: 1.5rem;
         }
 
         .about-container h2 {
             color: #1f2937;
             margin-top: 1.5rem;
             margin-bottom: 0.75rem;
             font-weight: 600;
         }
 
         .feature-grid {
             display: grid;
-            grid-template-columns: repeat(auto-fit, minmax(
-            gap:
+            grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
+            gap: 1rem;
             margin: 1.5rem 0;
         }
 
         .feature-card {
-            background:
-            padding:
-            border-radius:
+            background: #ffffff;
+            padding: 1rem;
+            border-radius: 6px;
             border: 1px solid #e5e7eb;
-            transition: all 0.3s ease;
         }
 
         .feature-card:hover {
-
-            border-color: #2563eb;
-            transform: translateY(-2px);
+            border-color: #d1d5db;
         }
 
         .feature-card h3 {
             margin: 0 0 0.5rem 0;
             color: #1f2937;
-            font-size:
+            font-size: 1rem;
         }
 
         .feature-card p {
             margin: 0;
             color: #6b7280;
-            font-size: 0.
-            line-height: 1.
+            font-size: 0.85rem;
+            line-height: 1.4;
         }
 
         /* Row and column adjustments */
         .gr-row {
             gap: 1rem;
         }
 
         /* Loading state */
         .gr-loading {
-            opacity: 0.
+            opacity: 0.8;
         }
 
         /* Markdown improvements */
         .gr-markdown {
             line-height: 1.6;
         }
 
         .gr-markdown strong {
             font-weight: 600;
             color: #1f2937;
         }
 
         .gr-markdown code {
             background: #f3f4f6;
-            padding: 0.
-            border-radius:
+            padding: 0.1rem 0.3rem;
+            border-radius: 3px;
             font-family: 'Courier New', monospace;
-            font-size: 0.
-            color: #
+            font-size: 0.85rem;
+            color: #6b7280;
         }
 
         /* Form spacing */
         .gr-form {
-            gap:
+            gap: 1rem;
         }
 
         /* Responsive adjustments */
         @media (max-width: 768px) {
             .header-container {
-                padding:
+                padding: 1.5rem 1rem;
             }
 
             .header-container h1 {
-                font-size: 1.
+                font-size: 1.5rem;
             }
 
             .tool-section {
-                padding:
+                padding: 1rem;
             }
 
             .feature-grid {
                 grid-template-columns: 1fr;
             }
@@ -297,7 +292,7 @@ def create_app() -> gr.Blocks:
     """
     ) as demo:
 
-        # Header with
+        # Header with minimal styling
         gr.HTML("""
         <div class="header-container">
             <h1>⚡ EcoMCP</h1>
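For context, create_theme() is a factory that create_app() presumably passes to gr.Blocks alongside the custom CSS shown above; a minimal sketch of that wiring (illustrative only, since the full create_app() body is not shown in this diff):

    # sketch: how a theme factory like create_theme() is typically wired into gr.Blocks
    import gradio as gr

    def create_theme() -> gr.themes.Base:
        return gr.themes.Default(primary_hue=gr.themes.colors.neutral)

    with gr.Blocks(theme=create_theme(), css=".header-container { text-align: center; }") as demo:
        gr.HTML('<div class="header-container"><h1>EcoMCP</h1></div>')

    # demo.launch()  # uncomment to serve locally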
src/ui/formatters.py
CHANGED
@@ -6,76 +6,73 @@ from typing import Dict
 
 
 def create_sentiment_chart_html(sentiment_data: Dict[str, int]) -> str:
-    """Create
+    """Create minimal HTML sentiment visualization"""
     positive = sentiment_data.get("positive", 0)
     neutral = sentiment_data.get("neutral", 0)
     negative = sentiment_data.get("negative", 0)
 
     # Determine dominant sentiment
     sentiments = [("positive", positive), ("neutral", neutral), ("negative", negative)]
     dominant = max(sentiments, key=lambda x: x[1])[0]
 
     return f"""
-    <div style="margin:
-        <h3 style="font-size: 1.
+    <div style="margin: 1.5rem 0;">
+        <h3 style="font-size: 1.2rem; font-weight: 600; color: #1f2937; margin-bottom: 1rem;">
             Sentiment Distribution
         </h3>
 
-        <div style="display: grid; grid-template-columns: repeat(3, 1fr); gap:
-            <div style="background: {'#
-                        border-radius:
-                        text-align: center; border:
-                <div style="font-size:
-                <div style="font-size: 0.95rem; font-weight: 500;">Positive</div>
+        <div style="display: grid; grid-template-columns: repeat(3, 1fr); gap: 0.75rem; margin-bottom: 1rem;">
+            <div style="background: {'#f0f0f0' if dominant == 'positive' else '#f8fafc'};
+                        border-radius: 6px; padding: 1rem; color: #374151;
+                        text-align: center; border: 1px solid #e5e7eb;">
+                <div style="font-size: 1.75rem; font-weight: 600; margin-bottom: 0.25rem;">{positive}%</div>
+                <div style="font-size: 0.85rem; font-weight: 500;">Positive</div>
             </div>
 
-            <div style="background: {'#
-                        border-radius:
-                        text-align: center; border:
-                <div style="font-size:
-                <div style="font-size: 0.95rem; font-weight: 500;">Neutral</div>
+            <div style="background: {'#f0f0f0' if dominant == 'neutral' else '#f8fafc'};
+                        border-radius: 6px; padding: 1rem; color: #374151;
+                        text-align: center; border: 1px solid #e5e7eb;">
+                <div style="font-size: 1.75rem; font-weight: 600; margin-bottom: 0.25rem;">{neutral}%</div>
+                <div style="font-size: 0.85rem; font-weight: 500;">Neutral</div>
             </div>
 
-            <div style="background: {'#
-                        border-radius:
-                        text-align: center; border:
-                <div style="font-size:
-                <div style="font-size: 0.95rem; font-weight: 500;">Negative</div>
+            <div style="background: {'#f0f0f0' if dominant == 'negative' else '#f8fafc'};
+                        border-radius: 6px; padding: 1rem; color: #374151;
+                        text-align: center; border: 1px solid #e5e7eb;">
+                <div style="font-size: 1.75rem; font-weight: 600; margin-bottom: 0.25rem;">{negative}%</div>
+                <div style="font-size: 0.85rem; font-weight: 500;">Negative</div>
             </div>
         </div>
 
-        <!-- Progress bars for
-        <div style="margin-top:
-            <div style="margin-bottom:
-                <div style="display: flex; justify-content: space-between; margin-bottom: 0.
-                    <span style="font-weight: 500;">Positive
-                    <span style="font-weight:
+        <!-- Progress bars for visual representation -->
+        <div style="margin-top: 1rem;">
+            <div style="margin-bottom: 0.75rem;">
+                <div style="display: flex; justify-content: space-between; margin-bottom: 0.25rem; font-size: 0.85rem; color: #6b7280;">
+                    <span style="font-weight: 500;">Positive</span>
+                    <span style="font-weight: 500;">{positive}%</span>
                 </div>
-                <div style="background: #e5e7eb; border-radius:
-                    <div style="background:
+                <div style="background: #e5e7eb; border-radius: 3px; height: 6px;">
+                    <div style="background: #6b7280; height: 100%; width: {positive}%; border-radius: 3px;"></div>
                 </div>
             </div>
 
-            <div style="margin-bottom:
-                <div style="display: flex; justify-content: space-between; margin-bottom: 0.
-                    <span style="font-weight: 500;">Neutral
-                    <span style="font-weight:
+            <div style="margin-bottom: 0.75rem;">
+                <div style="display: flex; justify-content: space-between; margin-bottom: 0.25rem; font-size: 0.85rem; color: #6b7280;">
+                    <span style="font-weight: 500;">Neutral</span>
+                    <span style="font-weight: 500;">{neutral}%</span>
                 </div>
-                <div style="background: #e5e7eb; border-radius:
-                    <div style="background:
+                <div style="background: #e5e7eb; border-radius: 3px; height: 6px;">
+                    <div style="background: #6b7280; height: 100%; width: {neutral}%; border-radius: 3px;"></div>
                 </div>
             </div>
 
             <div>
-                <div style="display: flex; justify-content: space-between; margin-bottom: 0.
-                    <span style="font-weight: 500;">Negative
-                    <span style="font-weight:
+                <div style="display: flex; justify-content: space-between; margin-bottom: 0.25rem; font-size: 0.85rem; color: #6b7280;">
+                    <span style="font-weight: 500;">Negative</span>
+                    <span style="font-weight: 500;">{negative}%</span>
                 </div>
-                <div style="background: #e5e7eb; border-radius:
-                    <div style="background:
+                <div style="background: #e5e7eb; border-radius: 3px; height: 6px;">
+                    <div style="background: #6b7280; height: 100%; width: {negative}%; border-radius: 3px;"></div>
                 </div>
             </div>
         </div>
@@ -84,66 +81,59 @@ def create_sentiment_chart_html(sentiment_data: Dict[str, int]) -> str:
 
 
 def create_pricing_tiers_html(tiers: Dict[str, float]) -> str:
-    """Create
+    """Create minimal HTML pricing tier visualization"""
     html = """
-    <div style="margin:
-        <h3 style="font-size: 1.
+    <div style="margin: 1.5rem 0;">
+        <h3 style="font-size: 1.2rem; font-weight: 600; color: #1f2937; margin-bottom: 1rem;">
             💰 Pricing Tier Comparison
         </h3>
-        <div style="display: grid; grid-template-columns: repeat(auto-fit, minmax(
+        <div style="display: grid; grid-template-columns: repeat(auto-fit, minmax(180px, 1fr)); gap: 1rem; margin-bottom: 1rem;">
     """
 
     tier_info = {
         "budget": {
-            "name": "Budget
-            "color": "#
-            "
-            "bg": "#fff3e0"
+            "name": "Budget",
+            "color": "#9ca3af",
+            "bg": "#f9fafb"
         },
         "standard": {
             "name": "Standard",
-            "color": "#
-            "
-            "bg": "#f1f8e9"
+            "color": "#9ca3af",
+            "bg": "#f9fafb"
         },
         "premium": {
             "name": "Premium",
-            "color": "#
-            "
-            "bg": "#e3f2fd"
+            "color": "#9ca3af",
+            "bg": "#f9fafb"
         }
     }
 
     # Find highest price to highlight
     max_price = max(tiers.values()) if tiers else 0
 
     for tier, price in tiers.items():
        info = tier_info.get(tier, {
            "name": tier.title(),
-            "color": "#
-            "
-            "bg": "#f5f5f5"
+            "color": "#9ca3af",
+            "bg": "#f9fafb"
        })
 
        is_best = price == max_price
-        highlight = "2px solid "
+        highlight = "2px solid #9ca3af" if is_best else "1px solid #e5e7eb"
 
        html += f"""
-        <div style="background: {info['bg']}; border: {highlight}; border-radius:
-                    text-align: center;
-            <div style="font-size: 0.85rem; color: #666; margin-bottom: 0.75rem; font-weight: 500; text-transform: uppercase; letter-spacing: 0.5px;">
+        <div style="background: {info['bg']}; border: {highlight}; border-radius: 6px; padding: 1rem;
+                    text-align: center;">
+            <div style="font-size: 0.8rem; color: #4b5563; margin-bottom: 0.5rem; font-weight: 500; text-transform: uppercase;">
                {info['name']}
-                {'<span style="margin-left: 0.
+                {'<span style="margin-left: 0.25rem; background: #e5e7eb; color: #4b5563; padding: 0.1rem 0.3rem; border-radius: 3px; font-size: 0.65rem;">RECOMMENDED</span>' if is_best else ''}
            </div>
-            <div style="background:
-                <div style="font-size:
-                <div style="font-size: 0.9rem; opacity: 0.95;">Recommended Price</div>
+            <div style="background: #f3f4f6; color: #1f2937; border-radius: 4px; padding: 0.75rem; margin-bottom: 0.5rem;">
+                <div style="font-size: 1.5rem; font-weight: 600;">${price:.2f}</div>
            </div>
        </div>
        """
 
    html += """
    </div>
    </div>
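Both helpers return plain HTML strings, so the restyled output can be previewed without launching the UI. A quick illustrative call with made-up percentages and prices (the numbers and the output filename are arbitrary, not from the repo):

    # sketch: rendering the two formatters to a standalone HTML file for inspection
    from src.ui.formatters import create_sentiment_chart_html, create_pricing_tiers_html

    html = create_sentiment_chart_html({"positive": 62, "neutral": 23, "negative": 15})
    html += create_pricing_tiers_html({"budget": 19.99, "standard": 29.99, "premium": 49.99})

    with open("preview.html", "w", encoding="utf-8") as f:
        f.write(html)  # open preview.html in a browser to see the cards and bars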