kaiokendev committed
Commit 74f14c3
1 Parent(s): 8236d32
Update README.md
README.md CHANGED
@@ -32,4 +32,5 @@ I trained the LoRA with the following configuration:
 - no dropout
 - weight decay of 0.1
 - AdamW beta1 of 0.9 and beta2 0.99, epsilon of 1e-5
-- Trained on 4-bit base model
+- Trained on 4-bit base model
+- Cutoff of 4096
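
For reference, here is a minimal sketch of how the hyperparameters listed in this diff could be expressed with the Hugging Face transformers/peft/bitsandbytes stack. Only the values shown above (no dropout, weight decay 0.1, AdamW betas 0.9/0.99 with epsilon 1e-5, a 4-bit base model, and a 4096-token cutoff) come from the commit; the base model name, LoRA rank/alpha, target modules, and the choice of libraries are assumptions and not necessarily the stack the author used.

```python
# Sketch only: maps the hyperparameters from the diff onto transformers + peft.
# Values taken from the commit are marked; everything else is an assumption.
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model

base = "huggyllama/llama-7b"  # assumed base model, not stated in this commit

# "Trained on 4-bit base model"
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
model = AutoModelForCausalLM.from_pretrained(
    base, quantization_config=bnb, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(base)

# LoRA adapter with "no dropout"; rank, alpha, and target modules are assumed
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.0,                     # no dropout
    target_modules=["q_proj", "v_proj"],  # assumed
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)

# Optimizer settings from the diff: weight decay 0.1, AdamW betas (0.9, 0.99), eps 1e-5
args = TrainingArguments(
    output_dir="lora-out",
    weight_decay=0.1,
    adam_beta1=0.9,
    adam_beta2=0.99,
    adam_epsilon=1e-5,
)

# "Cutoff of 4096": training examples would be tokenized with a 4096-token limit, e.g.
# tokenizer(text, truncation=True, max_length=4096)
```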