---
title: AnyCoder
emoji: π
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
disable_embedding: false
hf_oauth: true
hf_oauth_expiration_minutes: 43200
hf_oauth_scopes:
- manage-repos
---
> **Note:** This is the Docker Space configuration for the React frontend version.
> For the original Gradio app, see `README_GRADIO.md`.
# AnyCoder - AI Code Generator with React Frontend
AnyCoder is a full-stack AI-powered code generator with a modern React/TypeScript frontend and FastAPI backend. Generate applications by describing them in plain English, with support for multiple AI models and one-click deployment to Hugging Face Spaces.
## Features
- **Modern React UI**: Apple-inspired design with VS Code layout
- **Real-time Streaming**: Server-Sent Events for live code generation (see the sketch after this list)
- **Multi-Model Support**: MiniMax M2, DeepSeek V3, and more via HuggingFace InferenceClient
- **Multiple Languages**: HTML, Gradio, Streamlit, React, Transformers.js, ComfyUI
- **Authentication**: HuggingFace OAuth + Dev mode for local testing
- **One-Click Deployment**: Deploy generated apps directly to HF Spaces
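
To make the streaming feature above concrete, here is a minimal sketch of how a Server-Sent Events endpoint can be written in FastAPI. The route name, request shape, and the placeholder generator are illustrative assumptions, not the actual `backend_api.py` implementation:

```python
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

class GenerateRequest(BaseModel):
    prompt: str
    model: str = "deepseek-ai/DeepSeek-V3"  # assumed default, for illustration only

async def fake_model_stream(prompt: str):
    # Placeholder generator standing in for the real model call.
    for chunk in ["<html>", "<body>", "Hello!", "</body>", "</html>"]:
        await asyncio.sleep(0.1)
        yield chunk

@app.post("/api/generate")  # hypothetical route name
async def generate(req: GenerateRequest):
    async def event_stream():
        async for chunk in fake_model_stream(req.prompt):
            # Each SSE frame is a "data:" line terminated by a blank line.
            yield f"data: {chunk}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```

The frontend can consume this stream with the browser's `EventSource` or a streaming `fetch` and append chunks to the Monaco editor as they arrive.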
## Architecture
```
anycoder/
βββ backend_api.py # FastAPI backend with streaming
βββ frontend/ # Next.js React frontend
β βββ src/
β β βββ app/ # Pages (page.tsx, layout.tsx, globals.css)
β β βββ components/ # React components
β β βββ lib/ # API client, auth utilities
β β βββ types/ # TypeScript types
β βββ package.json
βββ anycoder_app/ # Original Gradio app modules
β βββ agent.py
β βββ config.py
β βββ deploy.py
β βββ ...
βββ app.py # Original Gradio interface
βββ requirements.txt # Python dependencies
βββ Dockerfile # Docker Space configuration
βββ start_fullstack.sh # Local development script
```
## Quick Start
### Local Development
1. **Backend**:
```bash
export HF_TOKEN="your_huggingface_token"
export GEMINI_API_KEY="your_gemini_api_key"
python backend_api.py
```
2. **Frontend** (new terminal):
```bash
cd frontend
npm install
npm run dev
```
3. Open `http://localhost:3000`
### Using the start script
```bash
export HF_TOKEN="your_token"
export GEMINI_API_KEY="your_gemini_api_key"
./start_fullstack.sh
```
## Docker Space Deployment
This app runs as a Docker Space on HuggingFace. The Dockerfile:
- Builds the Next.js frontend
- Runs the FastAPI backend on port 7860 (see the sketch below)
- Uses proper user permissions (UID 1000)
- Handles environment variables securely
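
For illustration, a minimal sketch of the entrypoint pattern implied by the points above: one FastAPI process serving both the API and the pre-built frontend on port 7860. The `frontend/out` path is an assumption; the real Dockerfile and backend may wire this differently:

```python
import uvicorn
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

# Mount the exported Next.js build so a single container port serves everything.
# "frontend/out" is an assumed export location.
app.mount("/", StaticFiles(directory="frontend/out", html=True), name="frontend")

if __name__ == "__main__":
    # 7860 matches app_port in the Space configuration above.
    uvicorn.run(app, host="0.0.0.0", port=7860)
```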
## Authentication
- **Dev Mode** (localhost): Mock login for testing
- **Production**: HuggingFace OAuth with manage-repos scope
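
A rough sketch of how that dev/production split can look on the backend. The route, the `DEV_MODE` flag, and the mock user are hypothetical; only `huggingface_hub.whoami` is a real API, used here to validate the OAuth token:

```python
import os

from fastapi import FastAPI, Header, HTTPException
from huggingface_hub import whoami

app = FastAPI()
DEV_MODE = os.environ.get("DEV_MODE") == "1"  # assumed flag for local testing

@app.get("/api/user")  # hypothetical route name
def current_user(authorization: str | None = Header(default=None)):
    if DEV_MODE:
        # Mock login so the frontend works on localhost without OAuth.
        return {"name": "dev-user", "is_pro": False}
    if not authorization:
        raise HTTPException(status_code=401, detail="Missing OAuth token")
    token = authorization.removeprefix("Bearer ").strip()
    # whoami() validates the token against the Hugging Face Hub.
    return whoami(token=token)
```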
## Supported Languages
- `html` - Static HTML pages
- `gradio` - Python Gradio apps
- `streamlit` - Python Streamlit apps
- `react` - React/Next.js apps
- `transformers.js` - Browser ML apps
- `comfyui` - ComfyUI workflows
## Available Models
- **Gemini 3 Pro Preview** (Default) - Google's latest with deep thinking & Google Search
- MiniMax M2 (via HF router with Novita)
- DeepSeek V3/V3.1
- DeepSeek R1
- And more via HuggingFace InferenceClient
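
All of these are reached through `huggingface_hub.InferenceClient`. A minimal streaming call might look like the sketch below (assumes a recent `huggingface_hub` release with provider routing; the exact model id is an assumption for illustration):

```python
import os

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="novita",             # MiniMax M2 is listed above as routed via Novita
    api_key=os.environ["HF_TOKEN"],
)

stream = client.chat_completion(
    model="MiniMaxAI/MiniMax-M2",  # model id assumed for illustration
    messages=[{"role": "user", "content": "Write a tiny HTML landing page"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content or ""
    print(delta, end="", flush=True)
```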
## Usage
1. Sign in with HuggingFace (or use Dev Login locally)
2. Select a language and AI model
3. Describe your app in the chat
4. Watch code generate in real-time
5. Click **Deploy** to publish to HF Spaces
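
Under the hood, deployment to HF Spaces can be done with `huggingface_hub`, roughly as sketched below; the actual Space naming and SDK selection in `anycoder_app/deploy.py` may differ:

```python
from huggingface_hub import HfApi

api = HfApi(token="hf_...")                 # the signed-in user's token
repo_id = "your-username/my-generated-app"  # hypothetical Space name

api.create_repo(repo_id=repo_id, repo_type="space", space_sdk="static", exist_ok=True)
api.upload_file(
    path_or_fileobj=b"<html><body>Hello from AnyCoder</body></html>",
    path_in_repo="index.html",
    repo_id=repo_id,
    repo_type="space",
)
print(f"Live at: https://huggingface.co/spaces/{repo_id}")
```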
## Environment Variables
- `HF_TOKEN` - HuggingFace API token (required)
- `GEMINI_API_KEY` - Google Gemini API key (required for Gemini 3 Pro Preview)
- `POE_API_KEY` - Poe API key (optional, for GPT-5 and Claude models)
- `DASHSCOPE_API_KEY` - DashScope API key (optional, for Qwen models)
- `OPENROUTER_API_KEY` - OpenRouter API key (optional, for Sherlock models)
- `MISTRAL_API_KEY` - Mistral API key (optional, for Mistral models)
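
A simple startup check over these variables might look like the following sketch; the enforcement logic is an assumption, only the variable names come from this README:

```python
import os
import sys

REQUIRED = ["HF_TOKEN"]
OPTIONAL = ["GEMINI_API_KEY", "POE_API_KEY", "DASHSCOPE_API_KEY",
            "OPENROUTER_API_KEY", "MISTRAL_API_KEY"]

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    sys.exit(f"Missing required environment variables: {', '.join(missing)}")

if not os.environ.get("GEMINI_API_KEY"):
    print("GEMINI_API_KEY not set: the default Gemini 3 Pro Preview model will be unavailable.")

configured = [name for name in OPTIONAL if os.environ.get(name)]
print(f"Optional provider keys configured: {configured or 'none'}")
```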
## Tech Stack
**Frontend:**
- Next.js 14
- TypeScript
- Tailwind CSS
- Monaco Editor
**Backend:**
- FastAPI
- HuggingFace Hub
- Server-Sent Events (SSE)
## License
MIT