Instructions for using isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- llama-cpp-python
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with llama-cpp-python:
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp",
    filename="Wan2.2-TI2V-5B-Q2_K.gguf",
)

output = llm(
    "Once upon a time,",
    max_tokens=512,
    echo=True,
)
print(output)
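GGUF repos often publish several quantization variants of the same model (Q2_K, Q4_K_M, and so on). A small helper for picking the filename that matches a quant tag; this helper is hypothetical, not part of llama-cpp-python or huggingface_hub:

```python
def pick_gguf_filename(filenames, quant):
    """Return the first GGUF filename whose stem ends with the given quant tag."""
    for name in filenames:
        if name.endswith(f"-{quant}.gguf") or name.endswith(f".{quant}.gguf"):
            return name
    raise FileNotFoundError(f"no .gguf file with quant tag {quant!r}")

# Example with the filename used above:
files = ["Wan2.2-TI2V-5B-Q2_K.gguf"]
print(pick_gguf_filename(files, "Q2_K"))  # Wan2.2-TI2V-5B-Q2_K.gguf
```

The returned name can then be passed as the `filename=` argument of `Llama.from_pretrained`.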
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with llama.cpp:
Install from brew
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
Install from WinGet (Windows)
winget install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
Use pre-built binary
# Download a pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
./llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
Build from source code
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K

# Run inference directly in the terminal:
./build/bin/llama-cli -hf isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
Use Docker
docker model run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
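Once llama-server is running, it exposes an OpenAI-compatible HTTP API (by default on http://localhost:8080). A minimal sketch of a chat-completion request against it; the helper is illustrative, and the commented-out request assumes a server started as shown above:

```python
import json

def build_chat_request(prompt, max_tokens=512):
    # OpenAI-style chat-completion payload understood by llama-server.
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Once upon a time,")
print(json.dumps(payload))

# With the server running (uncomment to send):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8080/v1/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Any OpenAI-compatible client library can be pointed at the same endpoint by overriding its base URL.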
- LM Studio
- Jan
- Ollama
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Ollama:
ollama run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
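Ollama also serves a local REST API (by default on http://localhost:11434). A sketch of a non-streaming request to its /api/generate endpoint; the model name matches the `ollama run` command above, and the commented-out request assumes a running Ollama server:

```python
import json

# Payload for Ollama's /api/generate endpoint; "stream": False asks
# for a single JSON response instead of a stream of chunks.
payload = {
    "model": "hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K",
    "prompt": "Once upon a time,",
    "stream": False,
}
print(json.dumps(payload))

# With Ollama serving locally (uncomment to send):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```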
- Unsloth Studio
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
Install Unsloth Studio (Windows)
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
Using HuggingFace Spaces for Unsloth
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp to start chatting
- Docker Model Runner
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Docker Model Runner:
docker model run hf.co/isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
- Lemonade
How to use isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp with Lemonade:
Pull the model
# Download Lemonade from https://lemonade-server.ai/
lemonade pull isfs/wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp:Q2_K
Run and chat with the model
lemonade run user.wan-2.2-5b-ti2v-gguf-stable-diffusion-cpp-Q2_K
List all available models
lemonade list