---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---

Mixtral-8x7B-MoE-RP-Story is a model made primarily for chatting, RP (roleplay), and storywriting.
Two RP models, two chat models, one occult model, one storywriting model, one mathematics model, and one DPO model were combined into a MoE, with Bagel as the base.

The DPO chat model is included to help produce more human-like replies.

This is my first try at doing this, so don't hesitate to give feedback!

WARNING: ALL THE "K" GGUF QUANTS OF MIXTRAL MODELS SEEM TO BE [BROKEN](https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/TvjEP14ps7ZUgJ-0-mhIX.png); PREFER Q4_0, Q5_0, OR Q8_0!

<!-- description start -->
## Description

This repo contains fp16 files of Mixtral-8x7B-MoE-RP-Story.

<!-- description end -->
<!-- models start -->
## Models used

The list of models used and their activators/themes can be found [here](https://huggingface.co/Undi95/Mixtral-8x7B-MoE-RP-Story/blob/main/config.yaml).

<!-- models end -->
<!-- prompt-template start -->
## Prompt template: Custom

Using Bagel as the base theoretically gives us access to many different prompting systems; you can see all the available prompt formats [here](https://huggingface.co/jondurbin/bagel-7b-v0.1#prompt-formatting).
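As a minimal sketch, one of the formats the Bagel card documents is Alpaca-style instruction prompting. The `build_alpaca_prompt` helper below is hypothetical (not part of this repo); check the linked Bagel card for the exact formats this merge inherits.

```python
# Hypothetical helper that assembles an Alpaca-style prompt string.
# Assumption: this merge inherits Bagel's Alpaca prompt format; verify
# against the Bagel card linked above before relying on it.
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    parts = [
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.",
        f"### Instruction:\n{instruction}",
    ]
    if user_input:
        # The optional "### Input:" section carries extra context.
        parts.append(f"### Input:\n{user_input}")
    parts.append("### Response:\n")
    return "\n\n".join(parts)

print(build_alpaca_prompt("Write a short fantasy scene."))
```

Feed the resulting string to your backend of choice as the raw prompt; for chat-style use, the Bagel card also lists Vicuna and ChatML variants.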

If you want to support me, you can do so [here](https://ko-fi.com/undiai).