---
inference: false
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- mixtral
license: apache-2.0
datasets:
- jondurbin/airoboros-3.2
---

# Air-Striker-Mixtral-8x7B-ZLoss

Experimental model, trained using a config and the [Transformers/Axolotl](https://github.com/DocShotgun/axolotl) forks provided by [Doctor-Shotgun](https://huggingface.co/Doctor-Shotgun).

The model was trained on the airoboros-3.2 dataset for 4 epochs, using the ChatML prompt format at 8K context length.
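
Because the model was trained with the ChatML prompt format, each turn should be wrapped in `<|im_start|>` / `<|im_end|>` tags at inference time. Below is a minimal usage sketch with `transformers`; the repo id, system message, and sampling settings are illustrative placeholders rather than values taken from the training config, and `device_map="auto"` assumes `accelerate` is installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hugging Face repo for this model.
model_id = "Air-Striker-Mixtral-8x7B-ZLoss"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",      # requires accelerate
    torch_dtype="auto",
)

# ChatML prompt format used during training.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Write a short haiku about the ocean.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```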