
Mixtral 8x7B - Holodeck

Model Description

Mixtral 8x7B-Holodeck is a fine-tune of Mistral AI's Mixtral 8x7B model.

Training data

The training data contains around 3,000 ebooks across various genres. Most of the dataset has been prepended with a genre tag of the following form: [Genre: <genre1>, <genre2>]
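Because the training data was tagged this way, prompts that begin with the same genre header tend to steer generation toward the desired style. A minimal sketch in Python (the helper name and example genres are illustrative, not part of the model card):

```python
def make_prompt(genres, opening):
    """Build a prompt that mirrors the training data's genre header.

    `genres` is a list of genre names; `opening` is the story text
    the model should continue.
    """
    tag = "[Genre: " + ", ".join(genres) + "]"
    return tag + "\n" + opening

# Hypothetical usage: the resulting string is passed to the model as-is.
prompt = make_prompt(["horror", "thriller"], "The lights went out at midnight.")
print(prompt)
```

The exact genre vocabulary used during training is not listed on the card, so treat the genre names above as placeholders.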


Limitations and Biases

As with other language models trained on large text corpora, this model may reproduce biases present in its training data, including biases related to gender, profession, race, and religion.

Model size: 46.7B parameters (Safetensors, FP16)