parinzee/SeaLLM-7B-Chat-GGUF
GGUF · 10 languages · License: seallms
Files (branch: main) · 1 contributor · History: 22 commits
Latest commit: parinzee, "Update README.md" (a027d14), 12 months ago
| File | Size | LFS | Last commit message | Last updated |
|---|---|---|---|---|
| .gitattributes | 2.63 kB | | Upload seallm-7b-chat.q4_k.gguf with huggingface_hub | 12 months ago |
| README.md | 6.53 kB | | Update README.md | 12 months ago |
| seallm-7b-chat.f16.gguf | 13.7 GB | LFS | Upload seallm-7b-chat.f16.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q2_k.gguf | 2.9 GB | LFS | Upload seallm-7b-chat.q2_k.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q3_k.gguf | 3.38 GB | LFS | Upload seallm-7b-chat.q3_k.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q3_k_l.gguf | 3.68 GB | LFS | Upload seallm-7b-chat.q3_k_l.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q3_k_m.gguf | 3.38 GB | LFS | Upload seallm-7b-chat.q3_k_m.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q3_k_s.gguf | 3.03 GB | LFS | Upload seallm-7b-chat.q3_k_s.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q4_0.gguf | 3.92 GB | LFS | Upload seallm-7b-chat.q4_0.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q4_1.gguf | 4.34 GB | LFS | Upload seallm-7b-chat.q4_1.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q4_k.gguf | 4.17 GB | LFS | Upload seallm-7b-chat.q4_k.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q4_k_m.gguf | 4.17 GB | LFS | Upload seallm-7b-chat.q4_k_m.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q4_k_s.gguf | 3.95 GB | LFS | Upload seallm-7b-chat.q4_k_s.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q5_0.gguf | 4.75 GB | LFS | Upload seallm-7b-chat.q5_0.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q5_1.gguf | 5.17 GB | LFS | Upload seallm-7b-chat.q5_1.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q5_k.gguf | 4.89 GB | LFS | Upload seallm-7b-chat.q5_k.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q5_k_m.gguf | 4.89 GB | LFS | Upload seallm-7b-chat.q5_k_m.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q5_k_s.gguf | 4.75 GB | LFS | Upload seallm-7b-chat.q5_k_s.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q6_k.gguf | 5.64 GB | LFS | Upload seallm-7b-chat.q6_k.gguf with huggingface_hub | 12 months ago |
| seallm-7b-chat.q8_0.gguf | 7.31 GB | LFS | Upload seallm-7b-chat.q8_0.gguf with huggingface_hub | 12 months ago |
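
Since the files above were uploaded with huggingface_hub, a minimal sketch of fetching and running one of them is shown below. The choice of the q4_k_m quantization and the use of llama-cpp-python as the GGUF runtime are illustrative assumptions, not instructions taken from this repository's README; any llama.cpp-compatible runtime should also work.

```python
# Minimal sketch: download one quantized GGUF file from this repo and run it
# locally. The q4_k_m file and llama-cpp-python loader are assumptions for
# illustration, not the repo author's documented usage.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="parinzee/SeaLLM-7B-Chat-GGUF",
    filename="seallm-7b-chat.q4_k_m.gguf",  # ~4.17 GB, per the listing above
)

# llama-cpp-python is one common way to load a GGUF model in Python.
from llama_cpp import Llama

llm = Llama(model_path=model_path, n_ctx=2048)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```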