Training GPU hours

#9 opened by dragon0116

Thanks for the great work! The model card mentions that the 34B Code-LLM was fine-tuned with QLoRA. Could you share the hardware configuration used (e.g., 4x A100-80GB GPUs?) and the total training time in GPU hours?
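
For reference, here's a minimal sketch of what I'd assume the QLoRA setup looks like for a 34B model. The model id, LoRA rank, and target modules below are my guesses for illustration, not necessarily your actual configuration:

```python
# Hypothetical QLoRA fine-tuning setup for a 34B causal LM.
# Model id and all hyperparameters here are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "codellama/CodeLlama-34b-hf"  # placeholder 34B base model

# 4-bit NF4 quantization is what lets a 34B model fit on a few 80GB GPUs.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # shards the quantized weights across available GPUs
)
model = prepare_model_for_kbit_training(model)

# Low-rank adapters on the attention projections; only these are trained.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of 34B params
```

Back-of-the-envelope: 34B parameters at 4 bits is roughly 17 GB for the base weights, so even a single A100-80GB (or a small multi-GPU node for larger batches) seems plausible, but I'd love to know the actual hardware and hour count.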
