mmnga/ELYZA-japanese-Llama-2-7b-fast-gguf
Tags: GGUF, Japanese, llama2, Inference Endpoints
arXiv: 2307.09288
License: llama2
1 contributor · History: 18 commits
Latest commit 18ab9e0 by mmnga, about 1 year ago: Upload ELYZA-japanese-Llama-2-7b-fast-q8_0.gguf with huggingface_hub
Files at revision 18ab9e0 (all entries are marked Safe by the Hub's file scanner):

| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 2.46 kB | No | Upload ELYZA-japanese-Llama-2-7b-fast-q6_K.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q2_K.gguf | 2.89 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q2_K.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q3_K_L.gguf | 3.66 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q3_K_L.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q3_K_M.gguf | 3.37 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q3_K_M.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q3_K_S.gguf | 3.02 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q3_K_S.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q4_0.gguf | 3.9 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q4_0.gguf with huggingface_hub | about 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q4_K_M.gguf | 4.16 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q4_K_M.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q4_K_S.gguf | 3.93 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q4_K_S.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q5_0.gguf | 4.73 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q5_0.gguf with huggingface_hub | about 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q5_K_M.gguf | 4.86 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q5_K_M.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q5_K_S.gguf | 4.73 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q5_K_S.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q6_K.gguf | 5.62 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q6_K.gguf with huggingface_hub | over 1 year ago |
| ELYZA-japanese-Llama-2-7b-fast-q8_0.gguf | 7.27 GB | Yes | Upload ELYZA-japanese-Llama-2-7b-fast-q8_0.gguf with huggingface_hub | about 1 year ago |
| README.md | 3.21 kB | No | Update README.md | over 1 year ago |
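
The quantized files above can be fetched individually with huggingface_hub and loaded with any GGUF-compatible runtime. Below is a minimal sketch assuming llama-cpp-python as the runtime; the chosen quantization (q4_K_M), the context size, and the prompt are illustrative and not taken from this repository's README.

```python
# Minimal sketch: download one GGUF quantization from this repo and run a short
# completion with llama-cpp-python. Assumes `pip install huggingface_hub llama-cpp-python`.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a single quantized file; the Hub client caches it locally.
model_path = hf_hub_download(
    repo_id="mmnga/ELYZA-japanese-Llama-2-7b-fast-gguf",
    filename="ELYZA-japanese-Llama-2-7b-fast-q4_K_M.gguf",
)

# Load the GGUF file and generate. The plain-text prompt below is illustrative,
# not the model's official instruction template.
llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("Question: What is the capital of Japan? Answer:", max_tokens=64)
print(out["choices"][0]["text"])
```

Larger quantizations (q6_K, q8_0) trade disk and memory for quality; the table's file sizes give a rough lower bound on the RAM needed to load each variant.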