Arki05/Grok-1-GGUF
Tags: Transformers · GGUF · Grok · Inference Endpoints
License: apache-2.0
Grok-1-GGUF / Q3_K_S
2 contributors · History: 1 commit

Arki05: more quants (from f32) with ggerganov's IQ3_S imatrix (#17)
commit d4359f5 (verified) · 7 months ago
File                                 Size      Storage   Last commit
grok-1-Q3_K_S-00001-of-00009.gguf    16.2 GB   LFS       7 months ago
grok-1-Q3_K_S-00002-of-00009.gguf    15.7 GB   LFS       7 months ago
grok-1-Q3_K_S-00003-of-00009.gguf    15.6 GB   LFS       7 months ago
grok-1-Q3_K_S-00004-of-00009.gguf    15.1 GB   LFS       7 months ago
grok-1-Q3_K_S-00005-of-00009.gguf    15.7 GB   LFS       7 months ago
grok-1-Q3_K_S-00006-of-00009.gguf    15.7 GB   LFS       7 months ago
grok-1-Q3_K_S-00007-of-00009.gguf    15.4 GB   LFS       7 months ago
grok-1-Q3_K_S-00008-of-00009.gguf    15.3 GB   LFS       7 months ago
grok-1-Q3_K_S-00009-of-00009.gguf    12.4 GB   LFS       7 months ago

All nine shards were added in commit d4359f5: more quants (from f32) with ggerganov's IQ3_S imatrix (#17).
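
The nine .gguf files above are one Q3_K_S quantization split into shards. A minimal sketch of fetching just this folder with the huggingface_hub Python client (an assumption about tooling, not part of the listing; repository and folder names are taken from above):

```python
# Sketch: download only the Q3_K_S shards of Arki05/Grok-1-GGUF.
# Assumes `pip install huggingface_hub`; pattern mirrors the folder shown above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Arki05/Grok-1-GGUF",
    allow_patterns=["Q3_K_S/*"],  # skip the repository's other quantization folders
)
print(local_dir)  # the nine grok-1-Q3_K_S-0000N-of-00009.gguf files land under Q3_K_S/ here
```

Recent llama.cpp builds that support split GGUF can typically be pointed at the first shard (grok-1-Q3_K_S-00001-of-00009.gguf) and will pick up the remaining parts from the same directory; treat this as an assumption about your local tooling rather than something stated in this listing.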