Muhammad2003 committed • Commit 0a1f33b • Parent(s): d080063

Update README.md

README.md CHANGED
```diff
@@ -9,7 +9,8 @@ tags:
 
 # Llama3-8B-OpenHermes-DPO
 
-
+
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/64fc6d81d75293f417fee1d1/QF2OsDu9DJKP4QYPBu4aK.png)
 
 Llama3-8B-OpenHermes-DPO is DPO-Finetuned model of Llama3-8B, on the OpenHermes-2.5 preference dataset using QLoRA.
 
```