maywell committed
Commit 5658128
1 Parent(s): 9335412

Update README.md

Files changed (1):
  1. README.md +10 -1
README.md CHANGED
@@ -10,4 +10,13 @@ PiVoT-MoE is an advanced AI model specifically designed for roleplaying purpose
 
 The Mixture of Experts (MoE) technique is utilized in this model, allowing the experts to work together synergistically, resulting in a more cohesive and natural conversation flow. The MoE architecture allows for a higher level of flexibility and adaptability, enabling PiVoT-MoE to handle a wide variety of roleplaying scenarios and characters.
 
-Based on the PiVoT-10.7B-Mistral-v0.2-RP model, PiVoT-MoE takes it a step further with the incorporation of the MoE technique. This means that not only does the model have an expansive knowledge base, but it also has the ability to mix and match its expertise to better suit the specific roleplaying scenario.
+Based on the PiVoT-10.7B-Mistral-v0.2-RP model, PiVoT-MoE takes it a step further with the incorporation of the MoE technique. This means that not only does the model have an expansive knowledge base, but it also has the ability to mix and match its expertise to better suit the specific roleplaying scenario.
+
+## Prompt Template - Alpaca (ChatML works)
+```
+{system}
+### Instruction:
+{instruction}
+### Response:
+{response}
+```
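
For reference, a minimal sketch of filling this template and generating with Hugging Face `transformers`; the repo id `maywell/PiVoT-MoE`, the example strings, and the sampling settings are illustrative assumptions, not taken from this commit.

```
# Sketch: format a prompt with the Alpaca-style template above and generate.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "maywell/PiVoT-MoE"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# {system} comes first, followed by the Instruction and Response sections;
# generation continues from the empty Response slot.
prompt = (
    "You are a fearless knight roleplaying in a medieval tavern.\n"
    "### Instruction:\n"
    "Greet the traveler who just walked in.\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```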