---
inference: false
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- mixtral
license: apache-2.0
datasets:
- jondurbin/airoboros-3.2
---
# Air-Striker-Mixtral-8x7B-ZLoss
Experimental model, trained using a config and [Transformers/Axolotl](https://github.com/DocShotgun/axolotl) forks provided by [Doctor-Shotgun](https://huggingface.co/Doctor-Shotgun).

The model was fine-tuned from [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) on the airoboros-3.2 dataset for 4 epochs, using the ChatML prompt format at 8K context length.
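
Below is a minimal usage sketch with 🤗 Transformers. It assumes the model is published under the repo id `LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss` and builds the ChatML prompt by hand rather than relying on a tokenizer-configured chat template; adjust the repo id, dtype, and sampling settings for your setup.

```python
# Usage sketch (assumption: repo id "LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss";
# the ChatML prompt is constructed manually to match the fine-tuning format).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss"  # assumed repo path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# ChatML prompt format used during fine-tuning.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nExplain mixture-of-experts routing in one paragraph.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```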