Can you produce a 2.4bpw exl2 quantisation of this model?
#2 · opened by xldistance
Models are up; a couple more quants coming shortly:
https://huggingface.co/models?search=LoneStriker/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16
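For reference, a quant like this is typically produced with the `convert.py` script from the exllamav2 repository. A minimal sketch, assuming the fp16 model has already been downloaded locally (the directory paths and calibration settings here are placeholders, not the exact command used for these uploads):

```shell
# Sketch: build a 2.4 bits-per-weight EXL2 quant with exllamav2's converter.
# Paths below are hypothetical examples.
git clone https://github.com/turboderp/exllamav2
cd exllamav2

python convert.py \
    -i /models/TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO_f16 \  # input fp16 model dir
    -o /tmp/exl2-work \                                   # scratch/working dir
    -cf /models/FusionNet_34Bx2_MoE-2.4bpw-exl2 \         # compiled output dir
    -b 2.4                                                # target bits per weight
```

At 2.4bpw, a 34Bx2 MoE should fit in roughly 24 GB of VRAM, which is the usual motivation for requesting this particular bitrate.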