---
title: AnyCoder
emoji: 🚀
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
disable_embedding: false
hf_oauth: true
hf_oauth_expiration_minutes: 43200
hf_oauth_scopes:
  - manage-repos
---
Note: This is the Docker Space configuration for the React frontend version.
For the original Gradio app, see README_GRADIO.md.
# AnyCoder - AI Code Generator with React Frontend
AnyCoder is a full-stack AI-powered code generator with a modern React/TypeScript frontend and FastAPI backend. Generate applications by describing them in plain English, with support for multiple AI models and one-click deployment to Hugging Face Spaces.
## 🎨 Features
- Modern React UI: Apple-inspired design with VS Code layout
- Real-time Streaming: Server-Sent Events for live code generation
- Multi-Model Support: MiniMax M2, DeepSeek V3, and more via HuggingFace InferenceClient
- Multiple Languages: HTML, Gradio, Streamlit, React, Transformers.js, ComfyUI
- Authentication: HuggingFace OAuth + Dev mode for local testing
- One-Click Deployment: Deploy generated apps directly to HF Spaces
## 🏗️ Architecture

```
anycoder/
├── backend_api.py        # FastAPI backend with streaming
├── frontend/             # Next.js React frontend
│   ├── src/
│   │   ├── app/          # Pages (page.tsx, layout.tsx, globals.css)
│   │   ├── components/   # React components
│   │   ├── lib/          # API client, auth utilities
│   │   └── types/        # TypeScript types
│   └── package.json
├── anycoder_app/         # Original Gradio app modules
│   ├── agent.py
│   ├── config.py
│   ├── deploy.py
│   └── ...
├── app.py                # Original Gradio interface
├── requirements.txt      # Python dependencies
├── Dockerfile            # Docker Space configuration
└── start_fullstack.sh    # Local development script
```
## 🚀 Quick Start

### Local Development

1. Backend:

```bash
export HF_TOKEN="your_huggingface_token"
export GEMINI_API_KEY="your_gemini_api_key"
python backend_api.py
```

2. Frontend (new terminal):

```bash
cd frontend
npm install
npm run dev
```

3. Open http://localhost:3000

Using the start script:

```bash
export HF_TOKEN="your_token"
export GEMINI_API_KEY="your_gemini_api_key"
./start_fullstack.sh
```
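To sanity-check the backend locally, you can consume the event stream by hand. The parser below handles the standard `data:` line format; the endpoint path in the comment is an assumption for illustration:

```python
import json

def parse_sse_lines(lines):
    """Yield the JSON payload of each `data:` line in an SSE stream."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            yield json.loads(line[len("data:"):].strip())

# A real run would iterate over e.g.
#   requests.post("http://localhost:7860/api/generate", stream=True).iter_lines()
# (the path is hypothetical); here we parse a captured sample:
raw = [
    'data: {"delta": "<html>"}',
    '',
    'data: {"delta": "</html>"}',
]
code = "".join(evt["delta"] for evt in parse_sse_lines(raw))
```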
## 🐳 Docker Space Deployment
This app runs as a Docker Space on HuggingFace. The Dockerfile:
- Builds the Next.js frontend
- Runs FastAPI backend on port 7860
- Uses proper user permissions (UID 1000)
- Handles environment variables securely
## 🔐 Authentication
- Dev Mode (localhost): Mock login for testing
- Production: HuggingFace OAuth with manage-repos scope
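The dev/production split can be decided from the request host; this is a sketch of that idea, assuming a localhost rule (the function and the exact rule are illustrative, not the app's actual logic):

```python
def is_dev_mode(host: str) -> bool:
    """Treat localhost-style hosts as dev mode, where a mock login is acceptable."""
    hostname = host.split(":")[0].lower()  # drop any :port suffix
    return hostname in ("localhost", "127.0.0.1", "0.0.0.0")
```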
## 📝 Supported Languages

- `html` - Static HTML pages
- `gradio` - Python Gradio apps
- `streamlit` - Python Streamlit apps
- `react` - React/Next.js apps
- `transformers.js` - Browser ML apps
- `comfyui` - ComfyUI workflows
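Internally, a language option like this maps naturally onto a dispatch table keyed by the option string. The entry-point filenames below are assumptions for illustration, not necessarily what AnyCoder emits:

```python
# Assumed mapping from language option to a generated entry-point file.
ENTRY_POINTS = {
    "html": "index.html",
    "gradio": "app.py",
    "streamlit": "app.py",
    "react": "src/app/page.tsx",
    "transformers.js": "index.html",
    "comfyui": "workflow.json",
}

def entry_point(language: str) -> str:
    """Look up the main file for a language, rejecting unknown options early."""
    try:
        return ENTRY_POINTS[language]
    except KeyError:
        raise ValueError(f"unsupported language: {language}")
```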
## 🤖 Available Models
- Gemini 3 Pro Preview (Default) - Google's latest with deep thinking & Google Search
- MiniMax M2 (via HF router with Novita)
- DeepSeek V3/V3.1
- DeepSeek R1
- And more via HuggingFace InferenceClient
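Calls to these models go through `huggingface_hub`'s `InferenceClient`. A hedged sketch of a streaming chat call — the message-assembly helper and prompt wiring are illustrative, only the `InferenceClient`/`chat_completion` API itself is the library's:

```python
import os

def build_messages(system: str, user_prompt: str) -> list:
    """Assemble an OpenAI-style chat message list for chat_completion."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_prompt},
    ]

# Streaming call sketch (needs network access and HF_TOKEN, so commented out):
# from huggingface_hub import InferenceClient
# client = InferenceClient(model="deepseek-ai/DeepSeek-V3", token=os.environ["HF_TOKEN"])
# for chunk in client.chat_completion(
#     build_messages("You generate code.", "A todo app"), stream=True
# ):
#     print(chunk.choices[0].delta.content or "", end="")
```

Each streamed chunk's `delta.content` maps directly onto one SSE `data:` event sent to the frontend.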
## 🎯 Usage

1. Sign in with HuggingFace (or use Dev Login locally)
2. Select a language and AI model
3. Describe your app in the chat
4. Watch code generate in real time
5. Click 🚀 Deploy to publish to HF Spaces
## 🛠️ Environment Variables

- `HF_TOKEN` - HuggingFace API token (required)
- `GEMINI_API_KEY` - Google Gemini API key (required for Gemini 3 Pro Preview)
- `POE_API_KEY` - Poe API key (optional, for GPT-5 and Claude models)
- `DASHSCOPE_API_KEY` - DashScope API key (optional, for Qwen models)
- `OPENROUTER_API_KEY` - OpenRouter API key (optional, for Sherlock models)
- `MISTRAL_API_KEY` - Mistral API key (optional, for Mistral models)
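Since two of these are hard requirements, it is worth failing fast at startup. A minimal sketch whose required/optional split mirrors the list above (the helper itself is illustrative, not code from the repo):

```python
import os

REQUIRED = ("HF_TOKEN", "GEMINI_API_KEY")
OPTIONAL = ("POE_API_KEY", "DASHSCOPE_API_KEY", "OPENROUTER_API_KEY", "MISTRAL_API_KEY")

def missing_required(env=None) -> list:
    """Return the required variable names that are absent or empty (os.environ by default)."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Usage: abort early if anything required is unset.
# missing = missing_required()
# if missing:
#     raise SystemExit(f"Missing env vars: {', '.join(missing)}")
```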
## 📦 Tech Stack
Frontend:
- Next.js 14
- TypeScript
- Tailwind CSS
- Monaco Editor
Backend:
- FastAPI
- HuggingFace Hub
- Server-Sent Events (SSE)
## 📄 License
MIT