
Why is a 3B model using 16GB of memory?

#5 · opened by abacaj

Any ideas what's wrong with the loading of the model?

[two screenshots attached]
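For reference, the usual cause: transformers loads checkpoints in float32 unless a dtype is requested, and a 3B-parameter model in fp32 needs roughly 3e9 × 4 bytes ≈ 12 GB for the weights alone, plus temporary buffers while the state dict is copied in, which can push peak usage to around 16 GB. A minimal sketch of loading in half precision instead (the model id below is an assumption, not taken from the thread):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed model id for illustration; substitute the actual 3B checkpoint.
model_id = "stabilityai/stablelm-3b-4e1t"

# Default load (fp32): ~4 bytes/param -> ~12 GB of weights for 3B params,
# plus temporary buffers while the checkpoint is materialised.
# model = AutoModelForCausalLM.from_pretrained(model_id)

# Half-precision load: ~2 bytes/param -> ~6 GB of weights.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # or torch.float16
    device_map="auto",           # optional, requires accelerate; places shards on device as they load
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```

bfloat16 halves the weight footprint, and `device_map="auto"` additionally loads the checkpoint shard by shard, so a full fp32 copy never sits in host memory at once.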

abacaj changed discussion status to closed
