6.5 bpw EXL2 quant of Acolyte-22B


A LoRA trained on a bunch of random datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size. Check the LoRA repo for dataset info.
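For reference, SLERP merging interpolates along the arc between two weight tensors rather than along the straight line a plain average takes. A minimal NumPy sketch of the operation (illustrative only; actual merges are typically done per-tensor with a tool like mergekit, and the `slerp` helper below is not from any particular library):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; t=0.5 matches the merge weight
    described above. Falls back to linear interpolation when the
    vectors are nearly parallel (the angle between them is ~0).
    """
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)          # angle between the two vectors
    if omega < eps:                 # nearly parallel: plain lerp is fine
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

At `t=0.5` this lands halfway along the arc between the base and the LoRA-tuned weights, which is what "SLERPed onto base at 0.5" refers to.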

Use the Mistral V2 & V3 chat template.
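The Mistral instruct format wraps each user turn in `[INST] ... [/INST]` tags. A rough sketch of the shape (illustrative only; in practice use the tokenizer's `apply_chat_template`, which handles BOS/EOS and exact whitespace for you, and note `build_prompt` is a hypothetical helper, not a library function):

```python
def build_prompt(turns):
    """Format (user, assistant) turns in Mistral-style [INST] tags.

    Pass None as the assistant reply for the final turn to leave the
    prompt open for generation.
    """
    out = "<s>"
    for user, assistant in turns:
        out += f"[INST] {user} [/INST]"
        if assistant is not None:
            out += f" {assistant}</s>"
    return out
```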

Model tree for Brioch/Acolyte-22B-6.5bpw-exl2