# Lyte/RWKV-6-World-1.6B-GGUF
Text Generation · GGUF · rwkv · rwkv-6 · Inference Endpoints
License: apache-2.0
Branch: main · 1 contributor · 21 commits · latest commit cdfbd47 (verified, 3 months ago): Update README.md
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 2 kB | Upload RWKV-6-World-1.6B-GGUF-F16.gguf with huggingface_hub | 3 months ago |
| README.md | 1.34 kB | Update README.md | 3 months ago |
| RWKV-6-World-1.6B-GGUF-F16.gguf | 3.25 GB (LFS) | Upload RWKV-6-World-1.6B-GGUF-F16.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q2_K.gguf | 676 MB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q2_K.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q3_K.gguf | 823 MB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q3_K.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q4_K_M.gguf | 1.01 GB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q4_K_M.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q5_K.gguf | 1.19 GB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q5_K.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q6_K.gguf | 1.39 GB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q6_K.gguf with huggingface_hub | 3 months ago |
| RWKV-6-World-1.6B-GGUF-Q8_0.gguf | 1.77 GB (LFS) | Upload RWKV-6-World-1.6B-GGUF-Q8_0.gguf with huggingface_hub | 3 months ago |
| convert-model-to-gguf.ipynb | 179 kB | Upload convert-model-to-gguf.ipynb | 3 months ago |
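The repository ships a full-precision F16 file plus six quantizations (Q2_K through Q8_0), all named with the pattern `RWKV-6-World-1.6B-GGUF-<QUANT>.gguf`. A minimal sketch of fetching one with `huggingface_hub` follows; the choice of Q4_K_M as a default and the use of `hf_hub_download` as the fetch path are assumptions, and actually running the file requires a llama.cpp build recent enough to support RWKV-6 architectures:

```python
# Sketch: download one quantization of Lyte/RWKV-6-World-1.6B-GGUF.
# Assumption: huggingface_hub is installed; the import is deferred so the
# filename helper works without it.

REPO_ID = "Lyte/RWKV-6-World-1.6B-GGUF"


def gguf_filename(quant: str) -> str:
    """Build the repo filename for a quantization tag, e.g. 'Q4_K_M' or 'F16'."""
    return f"RWKV-6-World-1.6B-GGUF-{quant}.gguf"


def fetch(quant: str = "Q4_K_M") -> str:
    """Download the chosen quantization and return its local cache path."""
    from huggingface_hub import hf_hub_download  # deferred network dependency

    return hf_hub_download(repo_id=REPO_ID, filename=gguf_filename(quant))


if __name__ == "__main__":
    path = fetch("Q4_K_M")  # ~1.01 GB per the file listing above
    # The downloaded file can then be passed to llama.cpp, e.g.:
    #   llama-cli -m <path> -p "User: hi\n\nAssistant:"
    print(path)
```

Q4_K_M is a common size/quality middle ground; the Q2_K file is the smallest at 676 MB, while Q8_0 at 1.77 GB stays closest to the F16 original.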