---
license: apache-2.0
library_name: transformers
---
This model is based on Mixtral-8x7b.
The model is fine-tuned with a proprietary alignment technique called MPO.

The model was trained on 8x A100s using LoRA.
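
The exact LoRA hyperparameters are not published. As a rough illustration only, a typical adapter setup with PEFT might look like the sketch below; the rank, alpha, and target modules are assumptions, not the configuration actually used for this model.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Base model; LoRA trains small adapter matrices instead of the full weights.
base = AutoModelForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1")

lora_config = LoraConfig(
    r=16,               # adapter rank (assumed, not the published value)
    lora_alpha=32,      # scaling factor (assumed)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections (assumed)
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights are trainable
```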
Prompt format: This model uses the ChatML prompt format.

```
<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
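
Since the card declares `library_name: transformers`, the template can also be applied programmatically. A minimal sketch, assuming the published tokenizer ships a ChatML chat template; the model ID below is a placeholder, not the actual repository name.

```python
from transformers import AutoTokenizer

# Placeholder model ID; substitute the actual repository name.
tokenizer = AutoTokenizer.from_pretrained("your-org/dolphin-mixtral")

prompt = "Why is the sky blue?"  # user message goes here

messages = [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": prompt},
]

# Renders the messages into the ChatML layout shown above and
# appends the opening assistant tag so generation starts there.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(text)
```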
I'll provide a detailed article on training and data in the near future.