Hardware requirements for Falcon models: 7B, 40B, 180B

#83 · opened by its-eric-liu

Looking for information on the hardware requirements to run the Falcon models: 7B, 40B, 180B.
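
For a rough sense of scale, weight memory alone is roughly parameter count × bytes per parameter; activations, KV cache, and any optimizer state come on top of that. A quick back-of-the-envelope estimate:

```python
# Back-of-the-envelope VRAM needed just to hold the weights; this ignores
# activations, KV cache, optimizer state, and framework overhead.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params in [("7B", 7), ("40B", 40), ("180B", 180)]:
    fp16 = weight_memory_gb(params, 2.0)   # float16 / bfloat16
    int8 = weight_memory_gb(params, 1.0)   # 8-bit quantization
    int4 = weight_memory_gb(params, 0.5)   # 4-bit quantization (GPTQ / NF4)
    print(f"Falcon-{name}: ~{fp16:.0f} GB fp16, ~{int8:.0f} GB int8, ~{int4:.0f} GB int4")
```

That works out to roughly 13 / 75 / 335 GB of weights in fp16 for 7B / 40B / 180B before any overhead.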

I've trained 7B-Instruct with 9–36 GB of VRAM; I'm currently trying 7B.
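
In case it helps anyone in that VRAM range, here is a minimal sketch of loading 7B-Instruct with 4-bit weights via bitsandbytes so the base model fits well under 9 GB. The quantization settings are just common defaults, not necessarily the exact setup I used:

```python
# Minimal sketch: Falcon-7B-Instruct with 4-bit (NF4) weights via bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-7b-instruct"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",        # spread layers across available GPUs / CPU
    trust_remote_code=True,   # Falcon checkpoints originally shipped custom modeling code
)
```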

40B needs roughly 96 GB of VRAM. From what I've read, someone trained 40B-Instruct with 48 GB of VRAM using something other than LoRA, but even then there seems to be more involved with the GPU configuration.
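
From what I've seen, the usual way to squeeze 40B fine-tuning onto a single ~48 GB card is 4-bit base weights plus LoRA-style adapters (QLoRA), though that may not be exactly what that post used. A rough sketch, with illustrative hyperparameters:

```python
# QLoRA-style sketch for Falcon-40B fine-tuning on ~48 GB: 4-bit base weights
# plus small trainable LoRA adapters. Whether it actually fits still depends
# on batch size, sequence length, and gradient checkpointing.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-40b-instruct",
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    ),
    device_map="auto",
    trust_remote_code=True,
)
model = prepare_model_for_kbit_training(model)  # prepare the 4-bit model for training

lora_config = LoraConfig(
    r=16,                                # illustrative values, not tuned
    lora_alpha=32,
    target_modules=["query_key_value"],  # Falcon's fused attention projection
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```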

If anyone has more concrete details on the hardware requirements, please share.

I've come across quite a few links that I'll post when I find them again.

Resources:
https://www.reddit.com/r/LocalLLaMA/comments/13wutj4/getting_falcon_40b_to_work/
https://huggingface.co/TheBloke/falcon-40b-instruct-GPTQ/discussions
