Hardware spec requirement?

#14
by its-eric-liu - opened

What is the hardware spec requirement for 7B model?
How about 13B and 34B?

The announcement briefly mentions that the 7B model runs on a single GPU, but doesn't say how much is needed for the other versions:
https://about.fb.com/news/2023/08/code-llama-ai-for-coding/

At 16-bit precision, the 7B model (this one) should take roughly 13–14 GB of VRAM. The 13B model should be around 25 GB, and the 34B model should take around 68 GB.

A general rule of thumb for Llama models at 16-bit precision: VRAM usage in GB is roughly double the parameter count in billions (2 bytes per parameter).
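The rule of thumb can be sketched in a few lines. This is only an illustrative back-of-the-envelope estimate for the weights themselves; real usage is higher because of activations, the KV cache, and framework overhead:

```python
def estimate_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory to hold just the model weights, in GB.

    At 16-bit precision each parameter takes 2 bytes, so the estimate
    is ~2x the parameter count in billions. Rough sketch only.
    """
    return params_billions * bytes_per_param

for size in (7, 13, 34):
    print(f"{size}B model: ~{estimate_vram_gb(size):.0f} GB for weights alone")
```

This matches the figures above: ~14 GB for 7B and ~68 GB for 34B (the 13B estimate of ~25 GB is slightly under the 2x rule's 26 GB).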

Code Llama org

You can also check #13!