there is a dolphin 2.6-phi that is 3b, possible to make a mixtral 8x3b using this as a base? Maybe even with using qlora adapters?

#3 opened by LaferriereJC
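For reference, the per-expert QLoRA setup would look roughly like this with peft + bitsandbytes (just a sketch; the model id, rank, and target modules below are placeholders to adjust for the phi architecture):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit (QLoRA-style) quantization of the shared 3B base model
bnb = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "cognitivecomputations/dolphin-2_6-phi-2",  # placeholder model id
    quantization_config=bnb,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)

# One LoRA like this would be trained per "expert" domain (code, math, ...)
lora = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "dense"],  # adjust for phi
    task_type="CAUSAL_LM",
)
expert = get_peft_model(base, lora)
expert.print_trainable_parameters()
```

Eight adapters on one frozen 4-bit base is far cheaper than eight full 3B experts, which is what makes the 8x3b idea plausible at all.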

I've read about the LoRA-MoE idea (one base model, many LoRA "experts"; rough sketch below):
https://www.reddit.com/r/LocalLLaMA/comments/18vppf5/one_model_many_loras_theoretically_possible/
https://github.com/S-LoRA/S-LoRA
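The "one model, many LoRAs" pattern is basically this in peft (sketch; the adapter paths and names are placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("cognitivecomputations/dolphin-2_6-phi-2")

# Attach several LoRA "experts" to one frozen base and switch per request
model = PeftModel.from_pretrained(base, "adapters/coding", adapter_name="coding")
model.load_adapter("adapters/writing", adapter_name="writing")

model.set_adapter("coding")   # route a coding request to the coding adapter
# ... generate ...
model.set_adapter("writing")  # swap experts without reloading the base weights
```

S-LoRA takes the same idea to the serving side and batches requests across many adapters at once.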

AFAIK they are trying to merge S-LoRA into LightLLM (which is also a tragically under-the-radar project).

The ML space is unfortunately full of great ideas that never get the attention they need to gain traction.

https://github.com/punica-ai/punica

https://github.com/jondurbin/airoboros#lmoe


LoRAX can do this (serve many LoRA adapters on one base model): https://github.com/predibase/lorax
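Rough idea of what a request looks like once a LoRAX server is up (sketch only; the URL, adapter id, and payload fields are assumptions, check the repo docs for the exact schema):

```python
import requests

# Ask one running base model to answer with a specific LoRA adapter applied
resp = requests.post(
    "http://localhost:8080/generate",  # placeholder server address
    json={
        "inputs": "Write a haiku about dolphins.",
        "parameters": {
            "adapter_id": "my-org/dolphin-coding-lora",  # placeholder adapter id
            "max_new_tokens": 64,
        },
    },
    timeout=60,
)
print(resp.json())
```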

LoRAHub (composing existing LoRA modules for new tasks; toy sketch below): https://arxiv.org/pdf/2307.13269.pdf

https://arxiv.org/pdf/2208.03306.pdf
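The LoRAHub idea in toy numpy terms: reuse existing LoRA deltas and mix them with learned weights instead of training a new adapter (sketch; the dimensions and weights below are made up, and the paper finds the weights with gradient-free few-shot optimization):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 512, 512, 8                 # placeholder hidden sizes and LoRA rank

# Three existing LoRA adapters for one weight matrix, each stored as (B, A)
adapters = [(rng.standard_normal((d, r)), rng.standard_normal((r, k)))
            for _ in range(3)]

# Mixing weights; LoRAHub would optimize these on a few examples of the new task
w = np.array([0.5, 0.3, 0.2])

# Composed delta applied on top of the frozen base weight W
delta = sum(wi * (B @ A) for wi, (B, A) in zip(w, adapters))
W_base = rng.standard_normal((d, k))
W_new = W_base + delta
print(W_new.shape)
```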
