How to finetune this beauty?

#3
by mandeepbagga - opened

How can we fine-tune this? And what would the resource requirements be?

You fine-tune it like a normal LLM, and keep in mind https://twitter.com/moinnadeem/status/1681393075367841792

And you will need around 4,400 GB of VRAM (with no memory-saving techniques applied to lower it).
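
As a rough back-of-envelope check (a sketch of the usual rule of thumb, not an exact breakdown of that 4,400 GB figure): full fine-tuning with Adam in mixed precision needs on the order of 16 bytes per parameter just for the weights, gradients, fp32 master copy, and optimizer states, before counting activations, which grow with batch size and sequence length.

```python
# Back-of-envelope VRAM estimate for full fine-tuning with Adam (mixed precision).
# Rule-of-thumb byte counts only; activation memory is NOT included, so real
# totals (like the 4,400 GB quoted above) can be considerably higher.

def full_finetune_vram_gb(n_params_billion: float) -> float:
    n_params = n_params_billion * 1e9
    bytes_per_param = (
        2    # bf16/fp16 weights
        + 2  # bf16/fp16 gradients
        + 4  # fp32 master copy of the weights
        + 8  # Adam first and second moments in fp32
    )
    return n_params * bytes_per_param / 1e9  # decimal GB

for size in (7, 13, 70):
    print(f"{size}B params -> ~{full_finetune_vram_gb(size):,.0f} GB before activations")
# 7B -> ~112 GB, 13B -> ~208 GB, 70B -> ~1,120 GB
```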

But why? I thought LLAMA 2 could be fine-tuned on a single A100.

The 7B model, yes, and the 13B model too with QLoRA, but the 70B model can't be fine-tuned on a single GPU even with QLoRA.
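
For the checkpoints that do fit on one GPU, the usual route is QLoRA: load the frozen base model in 4-bit and train small LoRA adapters on top. Here is a minimal sketch with transformers, peft, and bitsandbytes; the model ID, rank, and target modules are illustrative assumptions, not a prescribed recipe.

```python
# Minimal QLoRA setup: 4-bit quantized base model + trainable LoRA adapters.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Llama-2-7b-hf"  # assumption: a 7B checkpoint that fits on one A100

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                 # quantize the frozen base weights to 4-bit NF4
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)  # upcast norms, enable gradient checkpointing

lora_config = LoraConfig(
    r=16,                              # adapter rank (illustrative choice)
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()     # only the LoRA adapters are updated during training
```

Even in 4-bit, the 70B weights alone are roughly 35 GB, which is why that checkpoint generally has to be sharded across several GPUs rather than trained on a single card.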
