---
license: apache-2.0
datasets:
- Open-Orca/OpenOrca
- jondurbin/airoboros-2.2.1
language:
- en
---
This Mistral 7B model was trained on a mix of datasets filtered for higher quality and longer outputs, composed to improve reasoning and creativity.
Datasets:
The training mix combines filtered versions of the OpenOrca and Airoboros 2.2.1 datasets.
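
As a minimal sketch of what a length-based quality filter could look like, the snippet below drops short responses from OpenOrca using the `datasets` library. The column name `response`, the character threshold, and the filtering criterion are illustrative assumptions, not the actual filtering used for this model.

```python
# Hypothetical sketch: filter OpenOrca-style rows by response length.
# The 750-character threshold is an assumption, not the author's criterion.
from datasets import load_dataset

orca = load_dataset("Open-Orca/OpenOrca", split="train")

def long_enough(example, min_chars=750):
    # Keep only samples with reasonably long responses,
    # approximating a "higher output length" filter.
    return len(example["response"]) >= min_chars

filtered = orca.filter(long_enough)
print(f"kept {len(filtered)} of {len(orca)} rows")
```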
Training:
Full model training took 17 hours for 4 epochs on 8x A100 GPUs.
Prompt format: This model uses the ChatML prompt format (OpenAI's format).
```
<|im_start|>system
You are a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
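
A minimal usage sketch for building this prompt with `transformers` is shown below. The repository id is a placeholder, and it assumes the tokenizer ships a ChatML chat template matching the format above.

```python
# Sketch: build a ChatML prompt via the tokenizer's chat template.
# "your-org/your-mistral-finetune" is a placeholder repo id.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("your-org/your-mistral-finetune")

messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "Explain ChatML in one sentence."},
]

# add_generation_prompt appends the trailing "<|im_start|>assistant" turn.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```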