- **Training Environment**: Axolotl
- **sequence_len**: 4096
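For context on where these settings live, here is a minimal Axolotl QLoRA config sketch in Axolotl's YAML format. Only `sequence_len: 4096` comes from this card; the base model, adapter hyperparameters, and dataset path are illustrative assumptions, not the actual training config:

```yaml
# Hypothetical Axolotl QLoRA config sketch -- only sequence_len is taken
# from this card; everything else is an illustrative placeholder.
base_model: meta-llama/Llama-2-13b-hf   # assumed base model
load_in_4bit: true                      # QLoRA loads the base model in 4-bit
adapter: qlora
sequence_len: 4096                      # value stated in this card

lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_modules:
  - q_proj
  - v_proj

datasets:
  - path: PygmalionAI/PIPPA             # the PIPPA dataset mentioned above
    type: sharegpt                      # converted to the Vicuna-style prompt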

## Instruct Format

ShareGPT data is converted to the Vicuna format. The dataset uses the modified roles `USER` and `CHARACTER` instead of `USER` and `ASSISTANT`.

```
SYSTEM: Enter roleplay mode...
USER: {prompt}
CHARACTER:
```
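As a rough sketch, the conversion described above could look like the following. The role mapping and helper name are assumptions based on the prompt format shown here, not Axolotl's actual implementation:

```python
# Hypothetical sketch of the ShareGPT -> Vicuna-style conversion described
# above; the "from" keys follow the common ShareGPT convention.
ROLE_MAP = {"system": "SYSTEM", "human": "USER", "gpt": "CHARACTER"}

def sharegpt_to_vicuna(conversations):
    """Render a ShareGPT-style list of turns as a single prompt string."""
    lines = []
    for turn in conversations:
        role = ROLE_MAP[turn["from"]]
        lines.append(f"{role}: {turn['value']}")
    # Leave an empty CHARACTER: line for the model to complete.
    lines.append("CHARACTER:")
    return "\n".join(lines)

example = [
    {"from": "system", "value": "Enter roleplay mode..."},
    {"from": "human", "value": "Hello there!"},
]
print(sharegpt_to_vicuna(example))
```

Running this prints the three-line prompt shown in the block above, ending with the bare `CHARACTER:` line the model is expected to complete.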

## Notes

This QLoRA was produced as an experiment to see how the public version of PIPPA affects a model. As a result, I have no idea whether this LoRA is of great quality or absolute garbage.

Thanks to:

- PygmalionAI: The creators of the PIPPA dataset
- Axolotl: Finetuning suite

## Donate?

All my infrastructure and cloud expenses are paid out of pocket. If you'd like to donate, you can do so here: [https://ko-fi.com/kingbri](https://ko-fi.com/kingbri)

You should not feel obligated to donate, but if you do, I'd appreciate it.
53 |
## Axolotl stuff
|
54 |
|
55 |
## Training procedure
|