# ianZzzzzz/GLM-130B-quant-int4-4gpu
An int4-quantized version of the GLM-130B model that can run inference on four NVIDIA RTX 3090 Ti GPUs.
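A back-of-the-envelope check of why four 3090 Ti cards suffice (a sketch, not a measurement: the parameter count and 24 GiB VRAM figure are public specs, and the calculation ignores activation and KV-cache overhead, which consume additional memory at runtime):

```python
# Illustrative memory arithmetic for int4 GLM-130B on 4 GPUs.
N_PARAMS = 130e9          # GLM-130B parameter count
BYTES_PER_PARAM = 0.5     # int4 quantization: 4 bits per weight
N_GPUS = 4
GPU_MEM_GIB = 24          # RTX 3090 Ti VRAM

weights_gib = N_PARAMS * BYTES_PER_PARAM / 2**30   # ~60.5 GiB of weights
per_gpu_gib = weights_gib / N_GPUS                  # ~15.1 GiB per card

print(f"total weights: {weights_gib:.1f} GiB, per GPU: {per_gpu_gib:.1f} GiB")
assert per_gpu_gib < GPU_MEM_GIB  # weights fit, leaving headroom for activations
```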
License: apache-2.0
Contact: iannobug@gmail.com