Na0s/sft-ready-Text-Generation-Augmented-Data-Alpaca-Format Dataset (viewer) • Updated Dec 13, 2024 • 7.67M • 42
Pruned MoEs (Mixtral-8x7B-Instruct-v0.1) Collection: experts pruned from Mixtral-8x7B-Instruct-v0.1 following the paper "A Provably Effective Method for Pruning Experts in Fine-tuned Sparse MoEs" • 15 items • Updated Nov 18, 2024
Na0s/Mixtral-8x7B-Instruct-v0.1-exhaustive-LoRA-SFT-pruned-1-expert Text Generation • Updated Nov 18, 2024 • 6