---
title: Llama2 Gradio Hugging Face
emoji: 🛩️
colorFrom: gray
colorTo: gray
sdk: docker
app_file: app.py
app_port: 7860
pinned: true
---
|
|
|
## App

See the live app on [Hugging Face](https://huggingface.co/spaces/AdamNovotnyCom/llama2-gradio-huggingface).
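The Space is configured with `sdk: docker` and serves `app_file: app.py` on `app_port: 7860`. A minimal sketch of such an entry point (hypothetical; the actual `app.py` in this repo may differ) could look like:

```python
# Hypothetical minimal app.py for a Docker-based Gradio Space.

def generate(prompt: str) -> str:
    """Placeholder handler; the real app would call a Llama 2 pipeline."""
    return f"You said: {prompt}"

if __name__ == "__main__":
    import gradio as gr  # imported here so the handler stays usable without Gradio installed
    demo = gr.Interface(fn=generate, inputs="text", outputs="text")
    # Spaces route traffic to 0.0.0.0 on the port declared as app_port
    demo.launch(server_name="0.0.0.0", server_port=7860)
```

The port passed to `launch` must match `app_port` in the frontmatter, or the Space will fail its health check.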
|
|
|
## Local container management

Start the container:

    export HF_TOKEN=paste_HF_token && docker-compose -f docker-compose.yml up llama2hf
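Assuming docker-compose forwards `HF_TOKEN` from the host shell into the container environment, the app code can read it like this (the helper name is hypothetical, not part of this repo):

```python
import os

def get_hf_token() -> str:
    """Read the Hugging Face token forwarded into the container environment."""
    token = os.environ.get("HF_TOKEN", "")
    if not token:
        # Fail early with a clear message instead of a cryptic auth error later
        raise RuntimeError("HF_TOKEN not set; export it before `docker-compose up`")
    return token
```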
|
|
|
View the app in a browser at http://0.0.0.0:7860/
|
|
|
Run a command inside the running container, e.g. to pin a torch version:

    docker exec -it llama2hf bash -c 'pip install torch==2.1.*'
|
|
|
## References

- [huggingface.co/llama2](https://huggingface.co/blog/llama2)
- [demo-docker-gradio](https://huggingface.co/spaces/sayakpaul/demo-docker-gradio/tree/main)
- [space config reference](https://huggingface.co/docs/hub/spaces-config-reference)