---
license: other
---

5-bit quantization of airoboros 70b 1.4.1 (https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-1.4.1), made with exllamav2.

On 2x RTX 4090, a 3072-token context seems to work fine with a gpu_split of 21.5,22.5 and max_attention_size = 1024 ** 2 instead of the default 2048 ** 2 (see the loading sketch below).
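
A minimal loading sketch using the exllamav2 Python API with those settings. This is only an assumption of how you might wire it up (attribute and method names can differ between exllamav2 versions, and the model directory path is a placeholder):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path to the downloaded 5-bit quantized weights.
config = ExLlamaV2Config()
config.model_dir = "./airoboros-l2-70b-gpt4-1.4.1-5bpw-exl2"
config.prepare()
config.max_seq_len = 3072               # context length that seems to fit on 2x4090
config.max_attention_size = 1024 ** 2   # smaller attention chunk than the 2048 ** 2 default

model = ExLlamaV2(config)
# Split the weights across the two GPUs (approximate GB per device).
model.load(gpu_split=[21.5, 22.5])

tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

print(generator.generate_simple("USER: Hello\nASSISTANT:", settings, num_tokens=64))
```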

A 4096-token context may be feasible on a single 48GB VRAM GPU (like an A6000).

Tests are welcome.