Huertas97 committed
Commit: 7c1614a
Parent: e102e15

Update README text

Files changed (1): README.md (+5 −5)
README.md CHANGED
@@ -86,16 +86,16 @@ Here we compare the average multilingual semantic textual similairty capabilitie
 
 
 | Model | Average Spearman Cosine Test |
-|---------------------------------------------|------------------------------|
+|:---------------------------------------------:|:------------------------------:|
 | mstsb-paraphrase-multilingual-mpnet-base-v2 | 0.835890 |
 | paraphrase-multilingual-mpnet-base-v2 | 0.818896 |
 
-\
+<br>
 
 The following tables breakdown the performance of `mstsb-paraphrase-multilingual-mpnet-base-v2` according to the different tasks. For the sake of readability tasks have been splitted into monolingual and cross-lingual tasks.
 
 | Monolingual Task | Pearson Cosine test | Spearman Cosine test |
-|------------------|---------------------|-----------------------|
+|:------------------:|:---------------------:|:-----------------------:|
 | en;en | 0.868048310692506 | 0.8740170943535747 |
 | ar;ar | 0.8267139454193487 | 0.8284459741532022 |
 | cs;cs | 0.8466821720942157 | 0.8485417688803879 |
@@ -113,10 +113,10 @@ The following tables breakdown the performance of `mstsb-paraphrase-multilingual
 | zh-CN;zh-CN | 0.826233678946644 | 0.8248515460782744 |
 | zh-TW;zh-TW | 0.8242683809675422 | 0.8235506799952028 |
 
-\
+<br>
 
 | Cross-lingual Task | Pearson Cosine test | Spearman Cosine test |
-|--------------------|---------------------|-----------------------|
+|:--------------------:|:---------------------:|:-----------------------:|
 | en;ar | 0.7990830340462535 | 0.7956792016468148 |
 | en;cs | 0.8381274879061265 | 0.8388713450024455 |
 | en;de | 0.8414439600928739 | 0.8441971698649943 |
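The "Spearman Cosine test" columns in the tables above report the Spearman rank correlation between cosine similarities of sentence-pair embeddings and the gold STS scores. A minimal sketch of how that metric is computed, using `scipy` and toy random vectors standing in for real model encodings (the array shapes and gold scores here are illustrative, not from the evaluation):

```python
import numpy as np
from scipy.stats import spearmanr


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy embeddings standing in for the model's encodings of each
# sentence pair (5 pairs, 8-dimensional vectors for illustration).
rng = np.random.default_rng(0)
emb_first = rng.normal(size=(5, 8))
emb_second = rng.normal(size=(5, 8))

# Illustrative gold STS annotations on the usual 0-5 scale.
gold_scores = [4.5, 3.0, 1.2, 5.0, 2.1]

# Predicted similarity for each pair, then rank-correlate with gold.
pred_sims = [cosine(a, b) for a, b in zip(emb_first, emb_second)]
spearman_cosine, _pvalue = spearmanr(pred_sims, gold_scores)
```

With real data, `emb_first` and `emb_second` would come from `model.encode(...)` on the two sides of each test pair, and `spearman_cosine` is the figure reported per task.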