kwchoi committed
Commit
8c2a367
1 Parent(s): 18caba9

Update README.md

Files changed (1)
  1. README.md +61 -0
README.md CHANGED
@@ -1,3 +1,64 @@
  ---
  license: apache-2.0
+ datasets:
+ - argilla/ultrafeedback-binarized-preferences-cleaned
+ language:
+ - en
  ---
+ Testing the Mistral-Instruct model with the Orca DPO dataset.
+ Trying to see the effects of DPO for my own study.
+ Used the Mistral-7B-Instruct-v0.2 model due to its good performance.
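
For context, here is a minimal sketch of what a DPO run like the one described in this card could look like with Hugging Face TRL. This is not the actual training script behind the commit: the base model and preference dataset are taken from the card metadata and prose, the hyperparameters (`beta`, learning rate, batch sizes) and output directory are placeholders, the prose mentions an Orca DPO dataset so the dataset id may need to be swapped, and argument names vary across TRL versions.

```python
# Hypothetical sketch of a DPO fine-tune of Mistral-7B-Instruct-v0.2, not the
# script used for this commit. Assumes a recent TRL release with DPOConfig;
# older versions take `tokenizer=` instead of `processing_class=`.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

BASE = "mistralai/Mistral-7B-Instruct-v0.2"
model = AutoModelForCausalLM.from_pretrained(BASE)
tokenizer = AutoTokenizer.from_pretrained(BASE)

# Preference dataset from the card metadata; DPOTrainer expects
# prompt / chosen / rejected style preference pairs.
train_ds = load_dataset(
    "argilla/ultrafeedback-binarized-preferences-cleaned", split="train"
)

config = DPOConfig(
    output_dir="mistral-7b-instruct-v0.2-dpo",  # placeholder name
    beta=0.1,                    # assumed DPO temperature, not stated in the card
    learning_rate=5e-7,          # assumed; a typical DPO setting
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,              # TRL builds the frozen reference copy itself
    args=config,
    train_dataset=train_ds,
    processing_class=tokenizer,
)
trainer.train()
```

DPO optimizes the policy directly on chosen/rejected preference pairs, so unlike PPO-style RLHF no separate reward model is trained; that is the effect being studied here.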