Air-Striker-Mixtral-8x7B-ZLoss

Experimental model, trained using the config and the Transformers/Axolotl forks provided by Doctor-Shotgun.

The model was fine-tuned from Mixtral-8x7B-v0.1 on the airoboros-3.2 dataset for 4 epochs, using the ChatML prompt format at 8K context length.
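Since the model was trained with the ChatML prompt format, prompts should be rendered into that template before inference. A minimal sketch of the rendering (the helper name and message structure are illustrative, not part of this repo):

```python
def chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts into ChatML.

    Each message becomes an <|im_start|>role ... <|im_end|> block;
    the trailing '<|im_start|>assistant\n' cues the model to respond.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

The same string can then be passed to any GGUF-capable runtime (e.g. llama.cpp) as the raw prompt.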

Format: GGUF
Model size: 46.7B params
Architecture: llama

Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit
Inference API (serverless) has been turned off for this model.

Dataset used to train LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss-GGUF: airoboros-3.2