How much GPU memory is required for deployment?

#3 opened by chenfeicqq

I want to deploy the 33B model and need to estimate how much GPU memory it requires.

Thanks for your help.

Hi, Vicuna-13B needs about 28GB of GPU memory (roughly 2.2GB per billion parameters), so extrapolating linearly, a 33B model would need around 70GB of GPU memory:
https://github.com/lm-sys/FastChat#vicuna-weights
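
For reference, here is a minimal sketch of that linear extrapolation. The 28GB / 13B reference figures come from the FastChat README linked above; the helper function name and the assumption that memory scales roughly linearly with parameter count are mine, so treat the result as a rough ballpark rather than a guarantee.

```python
# Back-of-the-envelope GPU memory estimate, assuming memory scales roughly
# linearly with parameter count. Reference point: Vicuna-13B ~= 28 GB.

def estimate_gpu_memory_gb(params_billion: float,
                           reference_params_billion: float = 13.0,
                           reference_memory_gb: float = 28.0) -> float:
    """Scale a known model's memory footprint to another model size."""
    return params_billion * (reference_memory_gb / reference_params_billion)

if __name__ == "__main__":
    # ~71 GB for a 33B model under this linear-scaling assumption.
    print(f"Estimated GPU memory for 33B: {estimate_gpu_memory_gb(33):.0f} GB")
```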
