title: HuggingClaw
emoji: π₯
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
license: mit
datasets:
- tao-shen/HuggingClaw-data
short_description: Free always-on AI assistant, no hardware required
app_port: 7860
tags:
- huggingface
- openrouter
- chatbot
- llm
- openclaw
- ai-assistant
- whatsapp
- telegram
- text-generation
- openai-api
- huggingface-spaces
- docker
- deployment
- persistent-storage
- agents
- multi-channel
- openai-compatible
- free-tier
- one-click-deploy
- self-hosted
- messaging-bot
Your always-on AI assistant – free, no server needed

WhatsApp · Telegram · 40+ channels · 16 GB RAM · One-click deploy · Auto-persistent
## What you get
In about 5 minutes, you'll have a free, always-on AI assistant connected to WhatsApp, Telegram, and 40+ other channels – no server, no subscription, no hardware required.
| Feature | Details |
|---|---|
| Free forever | HuggingFace Spaces gives you 2 vCPU + 16 GB RAM at no cost |
| Always online | Your conversations, settings, and credentials survive every restart |
| WhatsApp & Telegram | Works reliably, including channels that HF Spaces normally blocks |
| Any LLM | OpenAI, Claude, Gemini, OpenRouter (200+ models, free tier available), or your own Ollama |
| One-click deploy | Duplicate the Space, set two secrets, done |
Powered by OpenClaw, an open-source AI assistant that normally requires your own machine (e.g. a Mac Mini). HuggingClaw makes it run for free on HuggingFace Spaces by solving two Spaces limitations: data loss on restart (fixed via HF Dataset sync) and DNS failures for some domains like WhatsApp (fixed via DNS-over-HTTPS).
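The DNS-over-HTTPS workaround can be sketched roughly as follows. This is an illustrative sketch, not HuggingClaw's actual code: the Cloudflare endpoint choice and the `resolve_via_doh` helper name are assumptions; any DoH server with a JSON API would work.

```python
import json
from urllib.request import Request, urlopen

# Assumption: Cloudflare's public DoH endpoint; any dns-json-speaking resolver works.
DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"

def parse_doh_answer(payload: dict) -> list:
    """Extract A-record IPs from a DoH JSON response (record type 1 = A)."""
    return [a["data"] for a in payload.get("Answer", []) if a.get("type") == 1]

def resolve_via_doh(hostname: str) -> list:
    """Resolve a hostname over HTTPS, bypassing the sandbox's broken DNS resolver."""
    req = Request(f"{DOH_ENDPOINT}?name={hostname}&type=A",
                  headers={"Accept": "application/dns-json"})
    with urlopen(req, timeout=10) as resp:
        return parse_doh_answer(json.load(resp))
```

Because the lookup travels as ordinary HTTPS traffic, it succeeds even when the Space's stub resolver refuses to answer for a blocked domain.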
## Architecture
## Quick Start
### 1. Duplicate this Space
Click **Duplicate this Space** on the HuggingClaw Space page.
After duplicating, edit your Space's README.md and update the `datasets:` field in the YAML header to point to your own dataset repo (e.g. `your-name/YourSpace-data`), or remove it entirely. This prevents your Space from appearing as linked to the original dataset.
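After that edit, the relevant part of the YAML header would look like this (repo name illustrative):

```yaml
datasets:
  - your-name/YourSpace-data
```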
### 2. Set Secrets
Go to Settings → Repository secrets and add the following. The only two you must set are `HF_TOKEN` and one API key.
| Secret | Status | Description | Example |
|---|---|---|---|
| `HF_TOKEN` | Required | HF Access Token with write permission (create one) | `hf_AbCdEfGhIjKlMnOpQrStUvWxYz` |
| `AUTO_CREATE_DATASET` | Recommended | Set to `true` – HuggingClaw will automatically create a private backup dataset on first startup. No manual setup needed. | `true` |
| `GROQ_API_KEY` | Recommended | Groq API key – fastest inference (Llama 3.3 70B) | `gsk_xxxxxxxxxxxx` |
| `OPENROUTER_API_KEY` | Recommended | OpenRouter API key – 200+ models, free tier available. Easiest way to get started. | `sk-or-v1-xxxxxxxxxxxx` |
| `XAI_API_KEY` | Optional | xAI Grok API key – fast inference, Grok-beta model | `xai-xxxxxxxxxxxx` |
| `OPENAI_API_KEY` | Optional | OpenAI (or any OpenAI-compatible) API key | `sk-proj-xxxxxxxxxxxx` |
| `ANTHROPIC_API_KEY` | Optional | Anthropic Claude API key | `sk-ant-xxxxxxxxxxxx` |
| `GOOGLE_API_KEY` | Optional | Google / Gemini API key | `AIzaSyXxXxXxXxXx` |
| `OPENCLAW_DEFAULT_MODEL` | Optional | Default model for new conversations | `groq/llama-3.3-70b-versatile` |
## Data Persistence
HuggingClaw syncs `~/.openclaw` (conversations, settings, credentials) to a private HuggingFace Dataset repo so your data survives every restart.
### Option A – Auto mode (recommended)

- Set `AUTO_CREATE_DATASET=true` in your Space secrets
- Set `HF_TOKEN` with write permission
- Done – on first startup, HuggingClaw automatically creates a private Dataset repo named `your-username/SpaceName-data`. Each duplicated Space gets its own isolated dataset.

(Optional) Set `OPENCLAW_DATASET_REPO=your-name/custom-name` if you prefer a specific repo name.
### Option B – Manual mode

- Go to huggingface.co/new-dataset and create a private Dataset repo (e.g. `your-name/HuggingClaw-data`)
- Set `OPENCLAW_DATASET_REPO=your-name/HuggingClaw-data` in your Space secrets
- Set `HF_TOKEN` with write permission
- Done – HuggingClaw will sync to this repo every 60 seconds
Security note: `AUTO_CREATE_DATASET` defaults to `false` – HuggingClaw will never create repos on your behalf unless you explicitly opt in.
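Under the hood, the sync described above amounts to periodic `huggingface_hub` pushes. A minimal sketch, assuming the standard `create_repo`/`upload_folder` API; the function names and structure are illustrative, not HuggingClaw's actual implementation:

```python
import os
import time

def default_dataset_repo(username: str, space_name: str) -> str:
    """Mirror the auto-mode naming scheme: <username>/<SpaceName>-data."""
    return f"{username}/{space_name}-data"

def sync_loop(repo_id: str,
              data_dir: str = os.path.expanduser("~/.openclaw")) -> None:
    """Create the private Dataset repo if needed, then push the data dir forever."""
    from huggingface_hub import HfApi  # deferred so the naming helper has no deps

    api = HfApi(token=os.environ["HF_TOKEN"])
    # Auto mode: idempotent creation of the private backup repo on first startup.
    api.create_repo(repo_id, repo_type="dataset", private=True, exist_ok=True)
    while True:
        api.upload_folder(folder_path=data_dir, repo_id=repo_id,
                          repo_type="dataset")
        time.sleep(int(os.environ.get("SYNC_INTERVAL", "60")))
```

On the next cold start, restoring is the reverse direction: download the dataset snapshot back into `~/.openclaw` before launching OpenClaw.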
## Running Local Models (CPU-Friendly)
HuggingClaw can run small models (≤1B parameters) locally on CPU – perfect for the HF Spaces free tier!
Supported Models:
- NeuralNexusLab/HacKing (0.6B) – recommended
- TinyLlama-1.1B
- Qwen-1.5B
- Phi-2 (2.7B, may be slower)
Quick Setup:

1. Set these secrets in your Space:

| Secret | Value |
|---|---|
| `LOCAL_MODEL_ENABLED` | `true` |
| `LOCAL_MODEL_NAME` | `neuralnexuslab/hacking` |
| `LOCAL_MODEL_ID` | `neuralnexuslab/hacking` |
| `LOCAL_MODEL_NAME_DISPLAY` | `NeuralNexus HacKing 0.6B` |

2. Wait for startup – the model will be pulled on first startup (~30 seconds for 0.6B)
3. Connect to the Control UI – the local model will appear in the model selector
Performance Expectations:
| Model Size | CPU Speed (tokens/s) | RAM Usage |
|---|---|---|
| 0.6B | 20-50 t/s | ~500 MB |
| 1B | 10-20 t/s | ~1 GB |
| 3B | 3-8 t/s | ~2 GB |
Note: 0.6B models run very smoothly on the HF Spaces free tier (2 vCPU, 16 GB RAM)
## Environment Variables
Fine-tune persistence and performance. Set these as Repository Secrets in HF Spaces, or in `.env` for local Docker.
| Variable | Default | Description |
|---|---|---|
| `GATEWAY_TOKEN` | `huggingclaw` | Gateway token for Control UI access. Override to set a custom token. |
| `AUTO_CREATE_DATASET` | `false` | Set to `true` to auto-create a private Dataset repo on first startup. |
| `SYNC_INTERVAL` | `60` | Backup interval in seconds – how often data syncs to the Dataset repo. |

For the full list (including `OPENAI_BASE_URL`, `OLLAMA_HOST`, proxy settings, etc.), see `.env.example`.
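For local Docker, a `.env` along these lines reproduces the defaults above. All values are placeholders – substitute your own token and keys:

```shell
# .env – illustrative values only
HF_TOKEN=hf_your_token_here
AUTO_CREATE_DATASET=true
SYNC_INTERVAL=60
GATEWAY_TOKEN=change-me
OPENROUTER_API_KEY=sk-or-v1-your_key_here
OPENCLAW_DEFAULT_MODEL=groq/llama-3.3-70b-versatile
```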
### 3. Open the Control UI
Visit your Space URL. Enter the gateway token (default: `huggingclaw`) to connect; customize it via the `GATEWAY_TOKEN` secret.
Messaging integrations (Telegram, WhatsApp) can be configured directly inside the Control UI after connecting.
Telegram note: HF Spaces blocks DNS resolution for `api.telegram.org`. HuggingClaw automatically probes alternative API endpoints at startup and selects one that works – no manual configuration needed.
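The startup probing amounts to a first-match scan over candidate endpoints. A sketch under that assumption; `first_reachable` and `reachable` are hypothetical helper names, not HuggingClaw's actual code:

```python
import urllib.error
import urllib.request
from typing import Callable, Iterable, Optional

def reachable(url: str, timeout: float = 5.0) -> bool:
    """Default probe: any HTTP response (even an error status) means DNS works."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server answered, so the endpoint resolves
    except Exception:
        return False  # DNS failure, timeout, connection refused, ...

def first_reachable(endpoints: Iterable[str],
                    probe: Callable[[str], bool] = reachable) -> Optional[str]:
    """Return the first endpoint the probe accepts, or None if all fail."""
    for url in endpoints:
        if probe(url):
            return url
    return None
```

The injected `probe` parameter keeps the scan logic testable without network access.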
## Configuration
HuggingClaw supports all OpenClaw environment variables – it passes the entire environment to the OpenClaw process (`env=os.environ.copy()`), so any variable from the OpenClaw docs works out of the box in HF Spaces. This includes:

- API keys – `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `MISTRAL_API_KEY`, `COHERE_API_KEY`, `OPENROUTER_API_KEY`, `GROQ_API_KEY`, `XAI_API_KEY`
- Server – `OPENCLAW_API_PORT`, `OPENCLAW_WS_PORT`, `OPENCLAW_HOST`
- Memory – `OPENCLAW_MEMORY_BACKEND`, `OPENCLAW_REDIS_URL`, `OPENCLAW_SQLITE_PATH`
- Network – `OPENCLAW_HTTP_PROXY`, `OPENCLAW_HTTPS_PROXY`, `OPENCLAW_NO_PROXY`
- Ollama – `OLLAMA_HOST`, `OLLAMA_NUM_PARALLEL`, `OLLAMA_KEEP_ALIVE`
- Secrets – `OPENCLAW_SECRETS_BACKEND`, `VAULT_ADDR`, `VAULT_TOKEN`
HuggingClaw adds its own variables for persistence and deployment: `HF_TOKEN`, `OPENCLAW_DATASET_REPO`, `AUTO_CREATE_DATASET`, `SYNC_INTERVAL`, `OPENCLAW_DEFAULT_MODEL`, etc. See `.env.example` for the complete reference.
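The pass-through itself is a few lines of `subprocess` plumbing. A sketch: the `openclaw serve` command name is an assumption, and the helper split exists only so the merging logic is easy to test:

```python
import os
import subprocess
from typing import Optional

def merged_env(extra: Optional[dict] = None) -> dict:
    """Full copy of the Space environment, plus optional per-launch overrides."""
    env = os.environ.copy()
    env.update(extra or {})
    return env

def launch_openclaw() -> subprocess.Popen:
    """Spawn OpenClaw so every API key and OPENCLAW_* variable is visible to it."""
    return subprocess.Popen(["openclaw", "serve"], env=merged_env())
```

Because the child inherits the whole environment, new OpenClaw variables need no changes on the HuggingClaw side.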
## Security
- Token authentication – the Control UI requires a gateway token to connect (default: `huggingclaw`, customizable via `GATEWAY_TOKEN`)
- Secrets stay server-side – API keys and tokens are never exposed to the browser
- Private backups – the Dataset repo is created as private by default
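Gateway-token checks of this kind are typically done with a constant-time comparison so an attacker cannot learn the token from response timing. A sketch, assuming a simple shared-string token (not HuggingClaw's actual code):

```python
import hmac
import os

def token_valid(presented: str,
                expected: str = os.environ.get("GATEWAY_TOKEN", "huggingclaw")) -> bool:
    """Constant-time comparison of the presented gateway token."""
    return hmac.compare_digest(presented.encode(), expected.encode())
```

`hmac.compare_digest` takes time independent of where the first mismatching byte occurs, unlike a plain `==` on strings.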
## License
MIT