---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- pytorch
- mistral
- finetuned
- not-for-all-audiences
---

This is an ExLlamaV2 quantized model (4bpw) of [KoboldAI/Mistral-7B-Erebus-v3](https://huggingface.co/KoboldAI/Mistral-7B-Erebus-v3), quantized using the default calibration dataset.

# Original model card: Mistral-7B-Erebus

## Model description

This is the third generation of the original Shinen made by Mr. Seeker. The full dataset consists of 8 different sources, all surrounding the "Adult" theme. The name "Erebus" comes from Greek mythology and means "darkness", in line with Shin'en, or "deep abyss". For inquiries, please contact the KoboldAI community.

**Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.**

## Training procedure

Mistral-7B-Erebus was trained on 8x A6000 Ada GPUs for a single epoch. No special frameworks were used.

## Training data

The data can be divided into 8 different datasets:

- Literotica (everything rated 3.0/5 or higher)
- Sexstories (everything scored 70 or higher)
- Dataset-G (private dataset of X-rated stories)
- Doc's Lab (all stories)
- Lushstories (Editor's Picks)
- Swinglifestyle (all stories)
- Pike-v2 Dataset (novels with an "adult" rating)
- SoFurry (collection of various animals)

The dataset uses `[Genre: ]` for tagging (see the usage sketch at the end of this card). The full dataset is 2.3B tokens in size.

## Limitations and biases

Based on known problems with NLP technology, potentially relevant factors include bias (gender, profession, race, and religion).

**Warning: This model has a very strong NSFW bias!**
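
## Usage sketch (ExLlamaV2)

A minimal sketch of loading this 4bpw quant with the `exllamav2` Python package and prompting with a `[Genre: ]` tag. The local path, sampler settings, and genre string are placeholder assumptions, not part of the original card, and exact API details may vary with the `exllamav2` version.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Path to the downloaded 4bpw quant (placeholder; adjust to your local copy)
model_dir = "./Mistral-7B-Erebus-v3-exl2-4bpw"

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPU memory

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
generator.warmup()

# Sampler settings are illustrative defaults, not recommendations from the card
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

# The dataset tags stories with "[Genre: ]"; the genre list here is an example
prompt = "[Genre: adventure, romance]\nThe tavern door creaked open and"
output = generator.generate_simple(prompt, settings, num_tokens=200)
print(output)
```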