Hugging Face model repository: DavidAU / Psyonic-Cetacean-EXP
Tags: Text Generation | GGUF | Inference Endpoints | imatrix | License: apache-2.0
Branch: main | 1 contributor | History: 18 commits
Latest commit: DavidAU, "Update README.md" (a5faa44, verified, 4 months ago)
File | Size | Last commit | Last updated
.gitattributes | 2.47 kB | Upload Space-Whale-V5-f32-f32-PASS-20B-Q6_k.gguf with huggingface_hub | 10 months ago
Psyonic-Cetacean-Ultra-Q4_K_M-imat2.gguf (LFS) | 12 GB | Upload with huggingface_hub | 10 months ago
Psyonic-Cetacean-Ultra-Q4_K_M-imat3.gguf (LFS) | 12 GB | Upload with huggingface_hub | 10 months ago
Psyonic-Cetacean-Ultra-Q4_K_M-imat4.gguf (LFS) | 12 GB | Upload with huggingface_hub | 10 months ago
Psyonic-Cetacean-Ultra-Q4_K_M-imat5.gguf (LFS) | 12 GB | Upload with huggingface_hub | 10 months ago
Psyonic-Cetacean-Ultra-Quality-PLUS-20b-Q4_k_m.gguf (LFS) | 13.1 GB | Upload with huggingface_hub | 10 months ago
README.md | 1.2 kB | Update README.md | 4 months ago
Space-Whale-V3-f32-f32-20B-Q4_k_m.gguf (LFS) | 12 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V3-f32-f32-PLUS-20B-Q4_k_m.gguf (LFS) | 13.1 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V3-f32-f32-PURE-20B-Q5_k_s.gguf (LFS) | 13.7 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V4-f32-f32-PASS-20B-Q4_k_m.gguf (LFS) | 12.4 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V4-f32-f32-PASS-PLUS-20B-Q4_k_m.gguf (LFS) | 13.5 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V5-f32-f32-PASS-20B-Q4_k_m.gguf (LFS) | 12.4 GB | Upload with huggingface_hub | 10 months ago
Space-Whale-V5-f32-f32-PASS-20B-Q6_k.gguf (LFS) | 16.9 GB | Upload with huggingface_hub | 10 months ago
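The commit messages show these GGUF files were uploaded with the huggingface_hub client, and the same library can fetch them. A minimal sketch, assuming the repo id "DavidAU/Psyonic-Cetacean-EXP" (taken from the page breadcrumb) and one filename from the listing; the actual download call is left commented out because these quants are 12 GB and larger:

```python
# Sketch: locate one of the quants listed above on the Hugging Face Hub.
# Assumptions: repo id from the page breadcrumb; filename from the file list.
repo_id = "DavidAU/Psyonic-Cetacean-EXP"
filename = "Psyonic-Cetacean-Ultra-Q4_K_M-imat2.gguf"  # 12 GB quant

# Direct-download URL following the Hub's resolve/<revision> scheme.
url = f"https://huggingface.co/{repo_id}/resolve/main/{filename}"
print(url)

# To download through the hub client instead (large file, ~12 GB):
# from huggingface_hub import hf_hub_download
# local_path = hf_hub_download(repo_id=repo_id, filename=filename)
```

The resolve URL can be passed straight to a downloader such as wget or curl; `hf_hub_download` additionally caches the file locally and supports resuming interrupted transfers.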