Updated EXL2 Quants

#3 opened by OrangeApples

@LoneStriker could you please make updated ExLlamaV2 quants of this model? I've been using an old 5bpw exl2 quant of this, and it's probably the best 34B I've used for RP. Your comment here https://huggingface.co/LoneStriker/dolphin-2.2-70b-6.0bpw-h6-exl2/discussions/1#657caac2416635415fc934b0 made me wonder just how good it would be when quantized using Turbo's new method :)
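For anyone finding this thread later: producing an EXL2 quant like the ones being requested is normally done with the convert.py script in turboderp's exllamav2 repo. Below is a minimal sketch of such a run; the model paths, the 5.0 bpw target, and the head-bits setting are my assumptions for illustration, not LoneStriker's exact recipe, so check the exllamav2 repo for the current arguments.

```python
# Rough sketch of an EXL2 quantization run via exllamav2's convert.py.
# Paths and settings below are placeholders, not the actual ones used here.
import subprocess

MODEL_DIR = "models/source-34b-fp16"        # hypothetical: unquantized source model
WORK_DIR = "work/exl2-tmp"                  # scratch dir for the measurement/quant passes
OUT_DIR = "models/source-34b-5.0bpw-exl2"   # finished quant, ready to load

subprocess.run(
    [
        "python", "convert.py",
        "-i", MODEL_DIR,   # input FP16 model
        "-o", WORK_DIR,    # working directory
        "-cf", OUT_DIR,    # write the compiled quant here
        "-b", "5.0",       # target bits per weight (matches the 5bpw quant mentioned above)
        "-hb", "6",        # head bits; the "h6" in the repo names
    ],
    check=True,
)
```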

I'll put it on the list.

Just saw the uploaded quants on your profile. Thanks again!

OrangeApples changed discussion status to closed
