---
license: cc-by-nc-4.0
---
# PiVoT-MoE
![img](./PiVoT-MoE.png)

## Model Description

PiVoT-MoE is an advanced AI model designed specifically for roleplaying. It was trained by combining four 10.7B-parameter experts, each with its own specialized characteristic, all fine-tuned to deliver a unique and diverse roleplaying experience.

The model uses the Mixture of Experts (MoE) technique, which lets the experts work together and produces a more cohesive, natural conversation flow. The MoE architecture also provides the flexibility and adaptability that allow PiVoT-MoE to handle a wide variety of roleplaying scenarios and characters.

Built on the PiVoT-10.7B-Mistral-v0.2-RP model, PiVoT-MoE takes it a step further by incorporating the MoE technique: the model not only retains an expansive knowledge base but can also mix and match its expertise to suit the specific roleplaying scenario.
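
For a rough intuition of how an MoE layer combines experts, below is a minimal top-2 routing sketch in PyTorch. Everything here is an illustrative assumption (layer sizes, the top-2 choice, the `MoELayer` name); it is not taken from PiVoT-MoE's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy MoE feed-forward layer: a router sends each token to its
    top-k experts and mixes their outputs by the routing weights."""

    def __init__(self, hidden_size: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One feed-forward "expert" per specialist; four here, echoing
        # the four fine-tuned experts described above.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.SiLU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        )
        # The router scores every token against every expert.
        self.router = nn.Linear(hidden_size, num_experts, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, hidden_size)
        logits = self.router(x)                  # (num_tokens, num_experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, slot] == idx    # tokens routed to this expert
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = MoELayer(hidden_size=64)
print(layer(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```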

## Prompt Template - Alpaca (ChatML also works)
```
{system}
### Instruction:
{instruction}
### Response:
{response}
```
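
Below is a minimal generation sketch that fills in the template above using Hugging Face `transformers`. The repo id, system prompt, and sampling settings are assumptions for illustration; substitute the actual checkpoint path and your own prompt.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "maywell/PiVoT-MoE"  # assumed repo id; replace with your local path if needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Fill the Alpaca-style template from the section above.
prompt = (
    "You are a seasoned innkeeper in a fantasy tavern.\n"
    "### Instruction:\n"
    "Greet the weary traveler who just walked in.\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```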