
Why is the weight file of the 6.7B model as large as 27 GB?

#6
by wangdafa - opened

The 6 safetensors files add up to 27 GB, but as far as I know, the weights of DeepSeek-Coder-6.7B should only be about 13.5 GB. Why?

Intelligent Software Engineering (iSE) org

Hi @wangdafa, the reason is that our uploaded model was saved as fp32 safetensors by default, so it is 2x larger than DeepSeek-Coder-6.7B, which was released as bf16 safetensors.
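For reference, a quick sketch of the arithmetic and of one way to shrink the checkpoint yourself: 6.7B parameters at 4 bytes each (fp32) is roughly 27 GB, while 2 bytes each (bf16) is roughly 13.5 GB. The model id "iSE/deepseek-coder-6.7b-finetuned" below is a placeholder, not the actual repo name; this is not the maintainers' procedure, just one option using standard Transformers calls.

```python
import torch
from transformers import AutoModelForCausalLM

# Rough size estimate: parameter count times bytes per parameter.
params = 6.7e9
print(f"fp32: ~{params * 4 / 1e9:.1f} GB, bf16: ~{params * 2 / 1e9:.1f} GB")

# Load the fp32 checkpoint directly in bf16 and re-save it; the resulting
# safetensors files are about half the size on disk.
model = AutoModelForCausalLM.from_pretrained(
    "iSE/deepseek-coder-6.7b-finetuned",  # placeholder model id
    torch_dtype=torch.bfloat16,           # cast weights to bf16 on load
)
model.save_pretrained("model-bf16", safe_serialization=True)
```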

wangdafa changed discussion status to closed
