TheDrummer / Nautilus-70B-v0.1-GGUF
Tags: GGUF · Inference Endpoints · conversational
License: other
1 contributor · History: 15 commits
Latest commit: 2edbfa9 (verified) by TheDrummer, 3 months ago: Rename Nautilus-70B-v1a-Q6_K-00001-of-00002.gguf to Nautilus-70B-v0.1-Q6_K-00001-of-00002.gguf
| File | Size | Last commit message | Updated |
| --- | --- | --- | --- |
| .gitattributes | 3.25 kB | Rename Nautilus-70B-v1a-Q6_K-00001-of-00002.gguf to Nautilus-70B-v0.1-Q6_K-00001-of-00002.gguf | 3 months ago |
| Nautilus-70B-v0.1-BF16-00001-of-00004.gguf | 44.9 GB (LFS) | Rename Nautilus-70B-v1a-BF16-00001-of-00004.gguf to Nautilus-70B-v0.1-BF16-00001-of-00004.gguf | 3 months ago |
| Nautilus-70B-v0.1-BF16-00002-of-00004.gguf | 45 GB (LFS) | Rename Nautilus-70B-v1a-BF16-00002-of-00004.gguf to Nautilus-70B-v0.1-BF16-00002-of-00004.gguf | 3 months ago |
| Nautilus-70B-v0.1-BF16-00003-of-00004.gguf | 44.7 GB (LFS) | Rename Nautilus-70B-v1a-BF16-00003-of-00004.gguf to Nautilus-70B-v0.1-BF16-00003-of-00004.gguf | 3 months ago |
| Nautilus-70B-v0.1-BF16-00004-of-00004.gguf | 6.6 GB (LFS) | Rename Nautilus-70B-v1a-BF16-00004-of-00004.gguf to Nautilus-70B-v0.1-BF16-00004-of-00004.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q2_K.gguf | 26.4 GB (LFS) | Rename Nautilus-70B-v1a-Q2_K.gguf to Nautilus-70B-v0.1-Q2_K.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q3_K_M.gguf | 34.3 GB (LFS) | Rename Nautilus-70B-v1a-Q3_K_M.gguf to Nautilus-70B-v0.1-Q3_K_M.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q4_K_M.gguf | 42.5 GB (LFS) | Rename Nautilus-70B-v1a-Q4_K_M.gguf to Nautilus-70B-v0.1-Q4_K_M.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q5_K_M-00001-of-00002.gguf | 44.9 GB (LFS) | Rename Nautilus-70B-v1a-Q5_K_M-00001-of-00002.gguf to Nautilus-70B-v0.1-Q5_K_M-00001-of-00002.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q5_K_M-00002-of-00002.gguf | 5.01 GB (LFS) | Rename Nautilus-70B-v1a-Q5_K_M-00002-of-00002.gguf to Nautilus-70B-v0.1-Q5_K_M-00002-of-00002.gguf | 3 months ago |
| Nautilus-70B-v0.1-Q6_K-00001-of-00002.gguf | 45 GB (LFS) | Rename Nautilus-70B-v1a-Q6_K-00001-of-00002.gguf to Nautilus-70B-v0.1-Q6_K-00001-of-00002.gguf | 3 months ago |
| Nautilus-70B-v1a-Q6_K-00002-of-00002.gguf | 12.9 GB (LFS) | Upload folder using huggingface_hub | 3 months ago |
| Nautilus-70B-v1a-Q8_0-00001-of-00002.gguf | 44.8 GB (LFS) | Upload folder using huggingface_hub | 3 months ago |
| Nautilus-70B-v1a-Q8_0-00002-of-00002.gguf | 30.2 GB (LFS) | Upload folder using huggingface_hub | 3 months ago |
| README.md | 1.07 kB | Create README.md | 3 months ago |