Update README.md
README.md CHANGED
@@ -118,7 +118,7 @@ model-index:
 MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
 This model is mainly used for large model technology experiments, and increasingly perfect iterations will eventually create high-level large language models.
 
-### Prompt Template
+### Prompt Template (Alpaca)
 ```
 ### Instruction:
 <prompt> (without the <>)
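
For reference, a minimal sketch of prompting the model with the Alpaca-style template above via the 🤗 Transformers library. The repository id is a placeholder, and the `### Response:` continuation is assumed from the standard Alpaca format (the hunk truncates the template), so adjust both to match the full model card.

```python
# Minimal sketch: querying MixTAO-7Bx2-MoE with the Alpaca-style prompt template.
# Assumptions: MODEL_ID is a placeholder repo id, and the "### Response:" line
# follows the standard Alpaca format (the diff hunk above cuts the template off).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-namespace/MixTAO-7Bx2-MoE"  # placeholder, replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

instruction = "Explain what a Mixture of Experts (MoE) model is."
# "### Instruction:" followed by the prompt text, without the angle brackets,
# exactly as the template describes.
prompt = f"### Instruction:\n{instruction}\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
# Drop the prompt tokens and decode only the newly generated continuation.
generated = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

The exact stop sequence and any system preamble should be taken from the full README rather than this sketch.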