WizardLM/WizardCoder-Python-7B-V1.0 Memory Requirements

#25
by VivekChauhan06 - opened

Can anyone tell me the memory requirements for WizardLM/WizardCoder-Python-7B-V1.0?

Here's the result from the Model Memory Calculator, a handy tool for estimating the memory requirements of a given LLM:

| dtype | Largest Layer or Residual Group | Total Size | Training using Adam |
|---|---|---|---|
| float32 | 788.03 MB | 25.11 GB | 100.46 GB |
| float16/bfloat16 | 394.02 MB | 12.56 GB | 50.23 GB |
| int8 | 197.01 MB | 6.28 GB | 25.11 GB |
| int4 | 98.5 MB | 3.14 GB | 12.56 GB |
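The "Total Size" column is roughly just parameter count times bytes per parameter, and the Adam column is about 4x that (weights + gradients + two optimizer moments). A minimal sketch of that arithmetic, assuming a ~6.74B parameter count (the Llama-2-7B base that this model builds on; not stated in the thread):

```python
# Rough memory estimate mirroring the Model Memory Calculator's columns.
# N = 6.74e9 is an assumed parameter count (Llama-2-7B base), not from the thread.

def total_size_gib(n_params: float, bytes_per_param: float) -> float:
    """Weights-only footprint in GiB for a given dtype width."""
    return n_params * bytes_per_param / 2**30

def adam_training_gib(n_params: float, bytes_per_param: float) -> float:
    """Training with Adam needs roughly 4x the weights:
    weights + gradients + two optimizer moment buffers."""
    return 4 * total_size_gib(n_params, bytes_per_param)

N = 6.74e9  # assumed parameter count
print(f"float32: {total_size_gib(N, 4):.2f} GiB")    # ~25.11 GiB
print(f"float16: {total_size_gib(N, 2):.2f} GiB")    # ~12.55 GiB
print(f"int8:    {total_size_gib(N, 1):.2f} GiB")    # ~6.28 GiB
print(f"int4:    {total_size_gib(N, 0.5):.2f} GiB")  # ~3.14 GiB
print(f"Adam/fp32: {adam_training_gib(N, 4):.2f} GiB")  # ~100.4 GiB
```

The small differences from the calculator's exact numbers come from per-layer buffers and rounding, but the 4-2-1-0.5 bytes-per-parameter scaling explains the whole table.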

model = AutoModelForCausalLM.from_pretrained("WizardLM/WizardCoder-Python-7B-V1.0", quantization_config=bnb_config, device_map={"": 0})

When I run the code above, the model fills the entire CPU RAM and is never placed on the Colab GPU, so I cannot use this model in Colab. The "codellama/CodeLlama-7b-Instruct-hf" model works fine with the same call: AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf", quantization_config=bnb_config, device_map={"": 0})
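For reference, a minimal 4-bit loading sketch. The bnb_config here is an assumption (the original config is not shown in the thread), and it assumes bitsandbytes is installed and a CUDA GPU is visible to the runtime:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Assumed 4-bit quantization config -- the thread's actual bnb_config is not shown.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

# device_map={"": 0} pins every module to GPU 0; if torch.cuda.is_available()
# is False (e.g. a CPU-only Colab runtime), loading falls back to CPU RAM.
model = AutoModelForCausalLM.from_pretrained(
    "WizardLM/WizardCoder-Python-7B-V1.0",
    quantization_config=bnb_config,
    device_map={"": 0},
)
```

Also note that in the snippet as originally pasted, a stray `)` after the model name would close the `from_pretrained(...)` call early, so `quantization_config` and `device_map` would never reach the loader at all.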

Can anyone explain the reason behind this and suggest a fix?

