Mixtral 8x7B - Holodeck

This is the GGUF version of the model, intended for use with KoboldCpp.

Model Description

Mixtral 8x7B-Holodeck is a finetune created using Mistral AI's Mixtral 8x7B model.

Training data

The training data contains around 3,000 ebooks in various genres. Most documents in the dataset have been prepended with the following text: `[Genre: <genre1>, <genre2>]`
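As a rough illustration of that preprocessing step, the sketch below prepends a genre tag line to a document's text. The function name and the exact separator/newline handling are assumptions for illustration; the model card only specifies the `[Genre: <genre1>, <genre2>]` prefix format.

```python
def prepend_genre_tag(text: str, genres: list[str]) -> str:
    """Prepend a genre tag in the card's stated format to a document.

    Hypothetical helper: the actual pipeline used for training is not
    published; only the resulting prefix format is known.
    """
    tag = f"[Genre: {', '.join(genres)}]"
    return f"{tag}\n{text}"


# Example: tagging an ebook excerpt with two genres.
example = prepend_genre_tag("It was a dark and stormy night...",
                            ["horror", "mystery"])
print(example)
```

At inference time, starting a prompt with the same `[Genre: ...]` pattern should steer generation toward those genres, since the model saw that prefix during finetuning.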

Limitations and Biases

Based on known problems with NLP technology, potentially relevant factors include biases related to gender, profession, race, and religion.
