barthfab committed on
Commit be4f280
1 parent: 50cc49b

clean table

- 5 lang table
- repo_id only as label
- avg. as first column

Files changed (1): README.md (+31 -19)
README.md CHANGED
@@ -92,30 +92,42 @@ Currently, we are working on more suitable benchmarks for Spanish, French, German
  <details>
  <summary>Evaluation results</summary>
 
+ ### All 5 Languages
+
+ | | avg | arc_challenge | belebele | hellaswag | mmlu | truthfulqa |
+ |:---------------------------|---------:|----------------:|-----------:|------------:|---------:|-------------:|
+ | Occiglot-7b-eu5 | 0.516895 | 0.508109 | 0.675556 | 0.718963 | 0.402064 | 0.279782 |
+ | Occiglot-7b-eu5-instruct | 0.537799 | 0.53632 | 0.691111 | 0.731918 | 0.405198 | 0.32445 |
+ | Occiglot-7b-de-en | 0.518337 | 0.496297 | 0.715111 | 0.669034 | 0.412545 | 0.298697 |
+ | Occiglot-7b-de-en-instruct | 0.543173 | 0.530826 | 0.745778 | 0.67676 | 0.411326 | 0.351176 |
+ | Leo-mistral-hessianai-7b | 0.484806 | 0.462103 | 0.653556 | 0.642242 | 0.379208 | 0.28692 |
+ | Mistral-7b-v0.1 | 0.547111 | 0.528937 | 0.768444 | 0.682516 | 0.448253 | 0.307403 |
+ | Mistral-7b-instruct-v0.2 | 0.56713 | 0.547228 | 0.741111 | 0.69455 | 0.422501 | 0.430262 |
+
- ### English
-
- | | arc_challenge | belebele | hellaswag | mmlu | truthfulqa | avg |
- |:-------------------------------------|----------------:|-----------:|------------:|---------:|-------------:|---------:|
- | occiglot/occiglot-7b-eu5 | 0.530717 | 0.726667 | 0.789882 | 0.531904 | 0.403678 | 0.59657 |
- | occiglot/occiglot-7b-eu5-instruct | 0.558874 | 0.746667 | 0.799841 | 0.535109 | 0.449034 | 0.617905 |
- | occiglot/occiglot-7b-de-en | 0.556314 | 0.791111 | 0.803824 | 0.568438 | 0.423251 | 0.628587 |
- | occiglot/occiglot-7b-de-en-instruct | 0.604096 | 0.812222 | 0.80004 | 0.570574 | 0.493807 | 0.656148 |
- | LeoLM/leo-mistral-hessianai-7b | 0.522184 | 0.736667 | 0.777833 | 0.538812 | 0.429248 | 0.600949 |
- | mistralai/Mistral-7B-v0.1 | 0.612628 | 0.844444 | 0.834097 | 0.624555 | 0.426201 | 0.668385 |
- | mistralai/Mistral-7B-Instruct-v0.2 | 0.637372 | 0.824444 | 0.846345 | 0.59201 | 0.668116 | 0.713657 |
+ ### English
+
+ | | avg | arc_challenge | belebele | hellaswag | mmlu | truthfulqa |
+ |:---------------------------|---------:|----------------:|-----------:|------------:|---------:|-------------:|
+ | Occiglot-7b-eu5 | 0.59657 | 0.530717 | 0.726667 | 0.789882 | 0.531904 | 0.403678 |
+ | Occiglot-7b-eu5-instruct | 0.617905 | 0.558874 | 0.746667 | 0.799841 | 0.535109 | 0.449034 |
+ | Occiglot-7b-de-en | 0.628587 | 0.556314 | 0.791111 | 0.803824 | 0.568438 | 0.423251 |
+ | Occiglot-7b-de-en-instruct | 0.656148 | 0.604096 | 0.812222 | 0.80004 | 0.570574 | 0.493807 |
+ | Leo-mistral-hessianai-7b | 0.600949 | 0.522184 | 0.736667 | 0.777833 | 0.538812 | 0.429248 |
+ | Mistral-7b-v0.1 | 0.668385 | 0.612628 | 0.844444 | 0.834097 | 0.624555 | 0.426201 |
+ | Mistral-7b-instruct-v0.2 | 0.713657 | 0.637372 | 0.824444 | 0.846345 | 0.59201 | 0.668116 |
 
  ### German
 
- | | arc_challenge_de | belebele_de | hellaswag_de | mmlu_de | truthfulqa_de | avg |
- |:-------------------------------------|-------------------:|--------------:|---------------:|----------:|----------------:|---------:|
- | occiglot/occiglot-7b-eu5 | 0.493584 | 0.646667 | 0.666631 | 0.483406 | 0.251269 | 0.508311 |
- | occiglot/occiglot-7b-eu5-instruct | 0.529512 | 0.667778 | 0.685205 | 0.488234 | 0.286802 | 0.531506 |
- | occiglot/occiglot-7b-de-en | 0.50556 | 0.743333 | 0.67421 | 0.514633 | 0.26269 | 0.540085 |
- | occiglot/occiglot-7b-de-en-instruct | 0.54491 | 0.772222 | 0.688407 | 0.515915 | 0.310914 | 0.566474 |
- | LeoLM/leo-mistral-hessianai-7b | 0.474765 | 0.691111 | 0.682109 | 0.488309 | 0.252538 | 0.517766 |
- | mistralai/Mistral-7B-v0.1 | 0.476476 | 0.738889 | 0.610589 | 0.529567 | 0.284264 | 0.527957 |
- | mistralai/Mistral-7B-Instruct-v0.2 | 0.485885 | 0.688889 | 0.622438 | 0.501961 | 0.376904 | 0.535215 |
+ | | avg | arc_challenge_de | belebele_de | hellaswag_de | mmlu_de | truthfulqa_de |
+ |:---------------------------|---------:|-------------------:|--------------:|---------------:|----------:|----------------:|
+ | Occiglot-7b-eu5 | 0.508311 | 0.493584 | 0.646667 | 0.666631 | 0.483406 | 0.251269 |
+ | Occiglot-7b-eu5-instruct | 0.531506 | 0.529512 | 0.667778 | 0.685205 | 0.488234 | 0.286802 |
+ | Occiglot-7b-de-en | 0.540085 | 0.50556 | 0.743333 | 0.67421 | 0.514633 | 0.26269 |
+ | Occiglot-7b-de-en-instruct | 0.566474 | 0.54491 | 0.772222 | 0.688407 | 0.515915 | 0.310914 |
+ | Leo-mistral-hessianai-7b | 0.517766 | 0.474765 | 0.691111 | 0.682109 | 0.488309 | 0.252538 |
+ | Mistral-7b-v0.1 | 0.527957 | 0.476476 | 0.738889 | 0.610589 | 0.529567 | 0.284264 |
+ | Mistral-7b-instruct-v0.2 | 0.535215 | 0.485885 | 0.688889 | 0.622438 | 0.501961 | 0.376904 |
 
  </details>
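The `avg` column that this commit moves to the front appears to be the plain unweighted mean of the five task scores in each row (not a weighted or renormalized aggregate). A minimal sketch checking that assumption against the Occiglot-7b-eu5 row of the English table above:

```python
# Per-task English scores for occiglot/occiglot-7b-eu5, copied from the table.
scores = {
    "arc_challenge": 0.530717,
    "belebele": 0.726667,
    "hellaswag": 0.789882,
    "mmlu": 0.531904,
    "truthfulqa": 0.403678,
}

# Unweighted mean over the five tasks; the table reports 0.59657.
avg = sum(scores.values()) / len(scores)
print(round(avg, 6))  # prints 0.59657
```

The same check reproduces the `avg` values in the other rows, which suggests the tables were regenerated mechanically rather than averaged by hand.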