---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- pytorch
- mixtral
- fine-tuned
- moe
---
# Mixtral 8x7B - Holodeck
## Model Description
Mixtral 8x7B-Holodeck is a fine-tune of Mistral AI's Mixtral 8x7B model.
## Training data
The training data consists of around 3,000 ebooks across a variety of genres.
Most of the dataset has been prepended with a genre tag in the following format: `[Genre: <genre1>, <genre2>]`
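Prompts that reproduce this tag format should steer generation toward the tagged genres. Below is a minimal sketch using the Hugging Face `transformers` API; the repo id is a placeholder (not stated in this card), and the genres and opening line are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the model's actual Hugging Face id.
model_id = "your-org/Mixtral-8x7B-Holodeck"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # reduce memory for the large 8x7B MoE
    device_map="auto",          # shard across available devices
)

# Match the training format: a genre tag, then the story text.
prompt = "[Genre: horror, mystery]\nThe lighthouse keeper had not slept in three days."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=120, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```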
***
### Limitations and Biases
Based on known problems with NLP technology, potentially relevant factors include biases relating to gender, profession, race, and religion.