
Crunchy-onion

This model was created by training the Mixtral base model on LimaRP (ShareGPT format provided by SAO), theory of mind, and gnosis (provided by jeiku) datasets.

The resulting 4-bit QLoRA adapter was then merged into Mixtral Instruct, producing the model you see here.
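
For reference, a minimal sketch of that merge step using PEFT is shown below. The adapter path and output directory are illustrative assumptions; the exact training and merge configuration used for Crunchy-onion is not documented here.

```python
# Sketch of merging a QLoRA adapter into Mixtral Instruct.
# Paths and settings are assumptions for illustration, not the exact
# configuration used to build Crunchy-onion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
adapter_path = "./crunchy-onion-qlora"  # hypothetical adapter directory

# Load the instruct base model in bf16, then attach the trained adapter.
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, adapter_path)

# Fold the LoRA weights into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()

# Save the merged model and tokenizer as a standalone checkpoint.
merged.save_pretrained("./crunchy-onion-merged", safe_serialization=True)
AutoTokenizer.from_pretrained(base_id).save_pretrained("./crunchy-onion-merged")
```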

Works best with the Alpaca Instruct prompt format; see the usage sketch below.
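
A minimal usage sketch with an Alpaca-style prompt follows. The system line, sample instruction, and generation settings are illustrative assumptions, not values documented on this card.

```python
# Example: prompting Epiculous/Crunchy-onion with an Alpaca-style template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/Crunchy-onion"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Alpaca Instruct template: a preamble, an "### Instruction:" block,
# and an empty "### Response:" block for the model to complete.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short greeting in the voice of a friendly pirate.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=0.8
)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```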

Model size: 46.7B params · Tensor type: BF16 · Format: Safetensors
