Update README.md
README.md
CHANGED
@@ -56,21 +56,6 @@ Open-source models:
 | MT Bench Ru | **8.7** | 8.15 | 8.39 | 7.96 | <u>8.26</u> |
 | Alpaca Eval Ru | **47.61** | 35.01 | <u>43.15</u> | 38.82 | - |
 
-
-
-| Benchmark | T-pro-it-1.0 | GPT-4o | GPT-4o-mini | Qwen-2.5-32B-Instruct | GigaChat Max 1.0.26.20 | RuAdapt-Qwen-32B-Instruct-v1 | gemma-2-27b-it | Llama-3.3-70B-Instruct |
-|------------------------------------------------|-----------------------|------------------------------|-----------------------|-----------------------|--------------------|------------------------------|-----------------------|------------------------|
-| [MERA](https://mera.a-ai.ru) | <u>0.629</u> | **0.642** | 0.57 | 0.578 | 0.588 | 0.615 | 0.574 | 0.567 |
-| [MaMuRaMu](https://mera.a-ai.ru/ru/tasks/22) | <u>0.841</u> | **0.874** | 0.779 | 0.824 | 0.824 | 0.812 | 0.768 | 0.818 |
-| ruMMLU-PRO | <u>0.665</u> | **0.713** | 0.573 | 0.637 | 0.535 | 0.631 | 0.470 | 0.653 |
-| ruGSM8K | **0.941** | 0.931 | 0.888 | 0.926 | 0.892 | 0.923 | 0.894 | <u>0.934</u> |
-| ruMATH | **0.776** | <u>0.771</u> | 0.724 | 0.727 | 0.589 | 0.742 | 0.538 | 0.636 |
-| ruMBPP | 0.805 | 0.802 | 0.79 | **0.825** | 0.626 | <u>0.813</u> | 0.708 | 0.77 |
-| [ruCodeEval](https://mera.a-ai.ru/ru/tasks/23) | 0.432 / 0.626 / 0.677 | <u>0.529 / 0.649 / 0.683</u> | **0.704 / 0.753 / 0.768** | 0.06 / 0.098 / 0.116 | 0.077 / 0.093 / 0.098 | 0.426 / 0.561 / 0.598 | 0.259 / 0.586 / 0.689 | 0.112 / 0.166 / 0.189 |
-| Arena-Hard-Ru | **90.17** | <u>84.87</u> | 81 | 74.54 | - | 80.23 | 66.4 | 76.51 |
-| MT Bench Ru | <u>8.7</u> | **8.706** | 8.45 | 8.15 | 8.53 | 8.39 | 7.96 | 8.26 |
-| Alpaca Eval Ru | <u>47.61</u> | **50** | 45.51 | 35.01 | 38.13 | 43.15 | 38.82 | - |
-
 Detailed evaluation results can be found in our [habr post](https://habr.com/ru/companies/tbank/articles/865582/)
 
 ## 👨‍💻 Examples of usage
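The hunk's trailing context is the README's "## 👨‍💻 Examples of usage" heading; the examples themselves are untouched by this commit and not shown in the diff. For orientation only, here is a minimal sketch of loading the model with Hugging Face `transformers`. The model id `t-tech/T-pro-it-1.0`, the dtype, and the generation settings are assumptions and are not taken from this diff:

```python
# Illustrative sketch only: loading T-pro-it-1.0 with Hugging Face transformers.
# The model id below is an assumption based on the model name, not part of this diff.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "t-tech/T-pro-it-1.0"  # assumed Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # illustrative default, not a documented setting
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Give a one-sentence summary of the MERA benchmark."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The actual usage section of the README should take precedence over this sketch.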