Air-Striker-Mixtral-8x7B-ZLoss

Experimental model, trained using a config and Transformers/Axolotl forks provided by Doctor-Shotgun.

The model was fine-tuned from Mixtral-8x7B-v0.1 on the airoboros-3.2 dataset for 4 epochs, using the ChatML prompt format at 8K context length.
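Since training used the ChatML format, prompts should wrap each turn in `<|im_start|>`/`<|im_end|>` markers and leave an open assistant turn for generation. A minimal sketch (the helper name and the system message are illustrative, not taken from the training config):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for msg in messages:
        # Each turn: <|im_start|>role\ncontent<|im_end|>
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Open an assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

The resulting string can be passed to any backend that accepts raw prompts; stop generation on `<|im_end|>`.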
