chrisociepa committed · Commit e4ce93f (parent f1c4afe): Update README.md
- self-instruct
---
This repo contains a low-rank adapter for LLaMA-7B, trained on 55,125 generated (not translated!) [instructions](https://huggingface.co/datasets/chrisociepa/raw-self-generated-instructions-pl) in Polish.
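A minimal sketch of how a low-rank adapter like this one is typically applied on top of the base model with the PEFT library (the repo ids below are illustrative placeholders, not confirmed identifiers; substitute the actual base-model and adapter ids from the model card):

```python
def load_polish_lora(base_model_id: str, adapter_id: str):
    """Load a LLaMA-7B base model and apply a LoRA adapter via PEFT.

    Imports are deferred so that merely defining this helper does not
    require `transformers`/`peft` to be installed or any download to run.
    """
    from transformers import LlamaForCausalLM, LlamaTokenizer
    from peft import PeftModel

    model = LlamaForCausalLM.from_pretrained(base_model_id)
    # Wrap the base model with the low-rank adapter weights.
    model = PeftModel.from_pretrained(model, adapter_id)
    tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
    return model, tokenizer

if __name__ == "__main__":
    # Placeholder ids for illustration only.
    model, tokenizer = load_polish_lora(
        "huggyllama/llama-7b",   # hypothetical base-model repo id
        "chrisociepa/lora-pl",   # hypothetical adapter repo id
    )
```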
The training took almost 16 hours on a single RTX 4090 with the following hyperparameters: