apresence/internlm2_5-7b-chat-GGUF_with-tool-fix

Text Generation · GGUF · English · chat
License: apache-2.0
2 contributors · History: 10 commits
Latest commit: eae4beb (verified) by apresence, 3 days ago — "Add RoPE scaling issue to model card"
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 1.56 kB | | Initial commit | 4 days ago |
| .gitignore | 14 Bytes | | Updated README.md | 4 days ago |
| README.md | 13.8 kB | | Add RoPE scaling issue to model card | 3 days ago |
| internlm2_5-7b-chat-IQ2_M.gguf | 2.78 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ2_S.gguf | 2.59 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ2_XS.gguf | 2.45 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ3_M.gguf | 3.6 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ3_XS.gguf | 3.33 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ3_XXS.gguf | 3.11 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-IQ4_XS.gguf | 4.24 GB | LFS | Added IMATRIX quants | 4 days ago |
| internlm2_5-7b-chat-Q2_K.gguf | 3.01 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q3_K_L.gguf | 4.13 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q3_K_M.gguf | 3.83 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q3_K_XL.gguf | 5.18 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q4_K_L.gguf | 5.7 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q4_K_M.gguf | 4.71 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q4_K_S.gguf | 4.48 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q5_K_L.gguf | 6.45 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q5_K_M.gguf | 5.51 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q5_K_S.gguf | 5.37 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q6_K.gguf | 6.35 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q6_K_L.gguf | 7.24 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q8_0.gguf | 8.22 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-Q8_0_L.gguf | 8.93 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
| internlm2_5-7b-chat-f32.gguf | 31 GB | LFS | Updated README.md & fixed gguf naming | 4 days ago |
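The GGUF files listed above are stored via Git LFS and can be fetched individually rather than cloning the whole repository. A minimal sketch, assuming the repo stays at this path on the Hub, that builds the direct-download URL for one quant (Hugging Face serves raw files at `/{repo_id}/resolve/{revision}/{filename}`); the choice of Q4_K_M here is only an example:

```python
# Build the direct-download URL for a single GGUF quant from this repo.
# Hugging Face exposes raw files at /{repo_id}/resolve/{revision}/{filename}.
REPO_ID = "apresence/internlm2_5-7b-chat-GGUF_with-tool-fix"
REVISION = "main"
FILENAME = "internlm2_5-7b-chat-Q4_K_M.gguf"  # 4.71 GB per the table above

url = f"https://huggingface.co/{REPO_ID}/resolve/{REVISION}/{FILENAME}"
print(url)
```

In practice, `huggingface_hub.hf_hub_download(repo_id=REPO_ID, filename=FILENAME)` fetches the same file with local caching, which is preferable for repeated use.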