Update README.md
README.md
```diff
@@ -14277,7 +14277,7 @@ print(util.dot_score(embeddings, embeddings))
 
 Model scores on the [ruMTEB](https://habr.com/ru/companies/sberdevices/articles/831150/) benchmark:
 
-|Model Name | Metric |
+|Model Name | Metric | sbert_large_mt_nlu_ru | sbert_large_nlu_ru | rubert-tiny2 | rubert-tiny-turbo | multilingual-e5-small | multilingual-e5-base | multilingual-e5-large |
 |:----------------------------------|:--------------------|-----------------------:|--------------------:|----------------:|------------------:|----------------------:|---------------------:|----------------------:|
 |CEDRClassification | Accuracy | 0.368 | 0.358 | 0.369 | 0.390 | 0.401 | 0.423 | **0.448** |
 |GeoreviewClassification | Accuracy | 0.397 | 0.400 | 0.396 | 0.414 | 0.447 | 0.461 | **0.497** |
@@ -14297,7 +14297,7 @@ print(util.dot_score(embeddings, embeddings))
 |SensitiveTopicsClassification | Accuracy | **0.285** | 0.280 | 0.220 | 0.244 | 0.228 | 0.234 | 0.257 |
 |TERRaClassification | Average Precision | 0.520 | 0.502 | 0.519 | 0.563 | 0.551 | 0.550 | **0.584** |
 
-|Model Name | Metric |
+|Model Name | Metric | sbert_large_mt_nlu_ru | sbert_large_nlu_ru | rubert-tiny2 | rubert-tiny-turbo | multilingual-e5-small | multilingual-e5-base | multilingual-e5-large |
 |:----------------------------------|:--------------------|-----------------------:|--------------------:|----------------:|------------------:|----------------------:|----------------------:|---------------------:|
 |Classification | Accuracy | 0.554 | 0.552 | 0.514 | 0.535 | 0.551 | 0.561 | **0.588** |
 |Clustering | V-measure | **0.526** | 0.519 | 0.412 | 0.496 | 0.513 | 0.503 | 0.525 |
```
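For context, the `util.dot_score(embeddings, embeddings)` call visible in the hunk headers computes the matrix of pairwise dot products between embedding vectors. A minimal pure-Python sketch of that operation (no `sentence-transformers` dependency; the toy vectors below are made up for illustration, not real model embeddings):

```python
# Toy illustration of what sentence_transformers.util.dot_score computes:
# score[i][j] is the dot product of vector a[i] with vector b[j].
# The example vectors are hypothetical, not outputs of any real model.

def dot_score(a, b):
    """Return the matrix of pairwise dot products between rows of a and b."""
    return [[sum(x * y for x, y in zip(u, v)) for v in b] for u in a]

# Two made-up 3-dimensional "embeddings".
embeddings = [
    [1.0, 0.0, 2.0],
    [0.0, 3.0, 1.0],
]

scores = dot_score(embeddings, embeddings)
print(scores)  # the diagonal holds each vector's squared norm
```

When both arguments are the same matrix, the result is symmetric and its diagonal contains each vector's squared norm, which is why such self-similarity matrices are a quick sanity check on embedding output.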