Update README.md
README.md CHANGED
@@ -4,7 +4,7 @@ license_name: yi-license
 license_link: LICENSE
 ---
 
-THIS MODEL IS EXPERIMENTAL AND MIGHT BE BUGGY, I DIDN'T PERFECT THE STRENGTH OF DPO AND SFT YET.
+THIS MODEL IS EXPERIMENTAL AND MIGHT BE BUGGY, I DIDN'T PERFECT THE STRENGTH OF DPO AND SFT YET. \
 Submitting to Open LLM leaderboard with base model yi-34b-200k-llamafied to see whether there's a point in merging a lora over a lora if both have the same lora_r or if it doesn't matter.
 
 Another AEZAKMI v2 finetune over Yi-34B-200K-rawrr-r3. Sequence length 2200
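The "lora over a lora" question in the README can be sketched numerically: merging a LoRA folds its low-rank update `B @ A` into the frozen weight, so merging a second LoRA on top just sums two rank-`lora_r` updates. A toy numpy sketch (all names and sizes hypothetical, not this repo's code):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 2  # hypothetical hidden size and lora_r

W = rng.normal(size=(d, d))  # frozen base weight
B1, A1 = rng.normal(size=(d, r)), rng.normal(size=(r, d))  # first LoRA
B2, A2 = rng.normal(size=(d, r)), rng.normal(size=(r, d))  # second LoRA

# Merging a LoRA adds its low-rank update into the weight matrix:
W_once = W + B1 @ A1        # base + first LoRA merged
W_twice = W_once + B2 @ A2  # second LoRA merged over the first

# The combined update is a sum of two rank-r matrices,
# so its rank can reach 2*r rather than staying at r.
delta = W_twice - W
print(np.linalg.matrix_rank(delta))
```

Since the summed update generically has rank `2 * lora_r`, a second merged LoRA is not automatically redundant just because it shares the same `lora_r`; whether that matters in practice is what the leaderboard submission is meant to probe.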