bobox committed
Commit 3c4dae4
1 Parent(s): 25d3738

train_loss = AdaptiveLayerLoss(
    model=model,
    loss=train_loss,
    n_layers_per_step=-1,
    last_layer_weight=1.5,
    prior_layers_weight=0.15,
    kl_div_weight=2,
    kl_temperature=2,
)

num_epochs = 2
learning_rate = 2e-5
warmup_ratio = 0.25

weight_decay = 5e-7

schedule = "cosine_with_restarts"
num_cycles = 3
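For reference, the learning-rate multiplier this schedule produces can be sketched in plain Python. This is a minimal stand-alone reimplementation of the warmup-plus-cosine-with-hard-restarts rule as implemented in Hugging Face `transformers`, not the trainer's own code; the total of 5604 steps is taken from the training log below, and the derived 1401 warmup steps follow from `warmup_ratio = 0.25`.

```python
import math

def lr_multiplier(step, total_steps, warmup_ratio=0.25, num_cycles=3):
    """Multiplier on the base LR: linear warmup, then cosine with hard restarts."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)  # linear warmup from 0 to 1
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # each of the num_cycles restarts sweeps the cosine from 1 back down toward 0
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

base_lr = 2e-5
# right at the end of warmup (step 1401 of 5604) the LR is back at the base value
print(base_lr * lr_multiplier(1401, 5604))  # 2e-05
```

With `num_cycles = 3`, the multiplier climbs to 1.0 during the first quarter of training and then decays and restarts three times over the remaining steps.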

Files changed (2)
  1. README.md +258 -117
  2. pytorch_model.bin +2 -2
README.md CHANGED
@@ -87,34 +87,124 @@ model-index:
  type: sts-test
  metrics:
  - type: pearson_cosine
- value: 0.7643554891812735
  name: Pearson Cosine
  - type: spearman_cosine
- value: 0.7591947144735277
  name: Spearman Cosine
  - type: pearson_manhattan
- value: 0.769108031897504
  name: Pearson Manhattan
  - type: spearman_manhattan
- value: 0.7590854149926064
  name: Spearman Manhattan
  - type: pearson_euclidean
- value: 0.7608914486061109
  name: Pearson Euclidean
  - type: spearman_euclidean
- value: 0.7488315275075106
  name: Spearman Euclidean
  - type: pearson_dot
- value: 0.6257426306656716
  name: Pearson Dot
  - type: spearman_dot
- value: 0.6045082518573447
  name: Spearman Dot
  - type: pearson_max
- value: 0.769108031897504
  name: Pearson Max
  - type: spearman_max
- value: 0.7591947144735277
  name: Spearman Max
  ---
@@ -231,16 +321,67 @@ You can finetune this model on your own dataset.

  | Metric | Value |
  |:--------------------|:-----------|
- | pearson_cosine | 0.7644 |
- | **spearman_cosine** | **0.7592** |
- | pearson_manhattan | 0.7691 |
- | spearman_manhattan | 0.7591 |
- | pearson_euclidean | 0.7609 |
- | spearman_euclidean | 0.7488 |
- | pearson_dot | 0.6257 |
- | spearman_dot | 0.6045 |
- | pearson_max | 0.7691 |
- | spearman_max | 0.7592 |

  <!--
  ## Bias, Risks and Limitations
@@ -316,16 +457,16 @@ You can finetune this model on your own dataset.
  * Size: 3,194 training samples
  * Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
  * Approximate statistics based on the first 1000 samples:
- | | label | sentence1 | sentence2 |
- |:--------|:------------|:------------|:------------|
- | type | int | string | string |
- | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.76 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 37.3 tokens</li><li>max: 502 tokens</li></ul> |
  * Samples:
- | label | sentence1 | sentence2 |
- |:---------------|:------------|:------------|
- | <code>1</code> | <code>The film will be screened in 2200 theaters .</code> | <code>In the United States and Canada , pre-release tracking suggest the film will gross $ 7–8 million from 2,200 theaters in its opening weekend , trailing fellow newcomer 10 Cloverfield Lane ( $ 25–30 million projection ) , but similar t</code> |
- | <code>1</code> | <code>Neighbors 2 : Sorority Rising ( film ) scored over 65 % on Rotten Tomatoes .</code> | <code>On Rotten Tomatoes , the film has a rating of 67 % , based on 105 reviews , with an average rating of 5.9/10 .</code> |
- | <code>1</code> | <code>Averaged on more than 65 reviews , The Handmaiden scored 94 % .</code> | <code>On Rotten Tomatoes , the film has a rating of 94 % , based on 67 reviews , with an average rating of 8/10 .</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
@@ -344,16 +485,16 @@ You can finetune this model on your own dataset.
  * Size: 4,000 training samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
- | | sentence1 | sentence2 | label |
- |:--------|:------------|:------------|:------------|
- | type | string | string | int |
- | details | <ul><li>min: 6 tokens</li><li>mean: 13.64 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 34.57 tokens</li><li>max: 149 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
  * Samples:
- | sentence1 | sentence2 | label |
- |:------------|:------------|:---------------|
- | <code>What professors established the importance of Whitehead's work?</code> | <code>Professors such as Wieman, Charles Hartshorne, Bernard Loomer, Bernard Meland, and Daniel Day Williams made Whitehead's philosophy arguably the most important intellectual thread running through the Divinity School.</code> | <code>0</code> |
- | <code>When did people start living on the edge of the desert?</code> | <code>It was long believed that the region had been this way since about 1600 BCE, after shifts in the Earth's axis increased temperatures and decreased precipitation.</code> | <code>0</code> |
- | <code>What was the title of Gertrude Stein's 1906-1908 book?</code> | <code>Picasso in turn was an important influence on Stein's writing.</code> | <code>0</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
@@ -375,13 +516,13 @@ You can finetune this model on your own dataset.
  | | sentence2 | sentence1 |
  |:--------|:------------|:------------|
  | type | string | string |
- | details | <ul><li>min: 7 tokens</li><li>mean: 16.2 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.65 tokens</li><li>max: 33 tokens</li></ul> |
  * Samples:
- | sentence2 | sentence1 |
- |:------------|:------------|
- | <code>Ash that enters the air naturally as a result of a volcano eruption is classified as a primary pollutant.</code> | <code>Ash that enters the air naturally as a result of a volcano eruption is classified as what kind of pollutant?</code> |
- | <code>Exposure to ultraviolet radiation can increase the amount of pigment in the skin and make it appear darker.</code> | <code>Exposure to what can increase the amount of pigment in the skin and make it appear darker?</code> |
- | <code>A lysozyme destroys bacteria by digesting their cell walls.</code> | <code>How does lysozyme destroy bacteria?</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
@@ -400,16 +541,16 @@ You can finetune this model on your own dataset.
  * Size: 2,200 training samples
  * Columns: <code>sentence1</code> and <code>sentence2</code>
  * Approximate statistics based on the first 1000 samples:
- | | sentence1 | sentence2 |
- |:--------|:------------|:------------|
- | type | string | string |
- | details | <ul><li>min: 7 tokens</li><li>mean: 23.6 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.23 tokens</li><li>max: 41 tokens</li></ul> |
  * Samples:
- | sentence1 | sentence2 |
- |:------------|:------------|
- | <code>An atom that gains electrons would be a negative ion.</code> | <code>Atoms that have gained electrons and become negatively charged are called negative ions.</code> |
- | <code>Scientists will use data collected during the collisions to explore the particles known as quarks and gluons that make up protons and neutrons.</code> | <code>Protons and neutrons are made of quarks, which are fundamental particles of matter.</code> |
- | <code>Watersheds and divides All of the land area whose water drains into a stream system is called the system's watershed.</code> | <code>All of the land drained by a river system is called its basin, or the "wet" term watershed</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
@@ -428,16 +569,16 @@ You can finetune this model on your own dataset.
  * Size: 2,500 training samples
  * Columns: <code>sentence1</code> and <code>sentence2</code>
  * Approximate statistics based on the first 1000 samples:
- | | sentence1 | sentence2 |
- |:--------|:------------|:------------|
- | type | string | string |
- | details | <ul><li>min: 2 tokens</li><li>mean: 350.46 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 27.13 tokens</li><li>max: 70 tokens</li></ul> |
  * Samples:
- | sentence1 | sentence2 |
- |:------------|:------------|
- | <code>An eyewitness told BBC Persian that the crowds were sharply divided between hardliners and moderates, but it was clear many people had responded to a call from former President Mohammad Khatami to attend the funeral as a show of support for the opposition reform movement.<br>Some were chanting opposition slogans, and others carried placards emphasising Mr Rafsanjani's links to the moderate and reformist camps.<br>"Long live Khatami, Long Live Rouhani. Hashemi, your soul is at peace!" said one banner.<br>"The circle became too closed for the centre," said another, using a quotation from Persian poetry to underline the growing distance in recent years between Mr Rafsanjani and Iran's hardline political establishment.<br>At one stage state television played loud music over its live broadcast of the event in order to drown out opposition slogans being chanted by the crowd.<br>As the official funeral eulogies were relayed to the crowds on the streets, they responded with calls of support for former President Khatami, and opposition leader Mir Hossein Mousavi, and shouts of: "You have the loudspeakers, we have the voice! Shame on you, Shame on State TV!"<br>On Iranian social media the funeral has been the number one topic with many opposition supporters using the hashtag #weallgathered to indicate their support and sympathy.<br>People have been posting photos and videos emphasising the number of opposition supporters out on the streets and showing the opposition slogans which state TV has been trying to obscure.<br>But government supporters have also taken to Twitter to play down the opposition showing at the funeral, accusing them of political opportunism.<br>"A huge army came out of love of the Supreme Leader," wrote a cleric called Sheikh Reza. "While a few foot soldiers came with their cameras to show off."<br>Another conversation engaging many on Twitter involved the wording of the prayers used at the funeral.<br>Did the Supreme Leader Ayatollah Ali Khamenei deliberately leave out a section praising the goodness of the deceased, some opposition supporters asked. And was this a comment on the political tensions between the two?<br>"No," responded another Twitter user, cleric Abbas Zolghadri. "The words of the prayer can be changed. There are no strict rules."<br>He followed this with a poignant photo of an empty grave - "Hashemi's final resting place" was the caption, summing up the sense of loss felt by Iranians of many different political persuasions despite the deep and bitter divisions.</code> | <code>Tehran has seen some of the biggest crowds on the streets since the 2009 "Green Movement" opposition demonstrations, as an estimated 2.5 million people gathered to bid farewell to Akbar Hashemi Rafsanjani, the man universally known as "Hashemi".</code> |
- | <code>Mark Evans is retracing the same route across the Rub Al Khali, also known as the "Empty Quarter", taken by Bristol pioneer Bertram Thomas in 1930.<br>The 54-year-old Shropshire-born explorer is leading a three-man team to walk the 800 mile (1,300 km) journey from Salalah, Oman to Doha, Qatar.<br>The trek is expected to take 60 days.<br>The Rub Al Khali desert is considered one of the hottest, driest and most inhospitable places on earth.<br>Nearly two decades after Thomas completed his trek, British explorer and writer Sir Wilfred Thesiger crossed the Empty Quarter - mapping it in detail along the way.<br>60 days<br>To cross the Rub' Al Khali desert<br>* From Salalah in Oman to Doha, Qatar<br>* Walking with camels for 1,300km<br>* Area nearly three times the size of the UK<br>Completed by explorer Bertram Thomas in 1930<br>Bertram Thomas, who hailed from Pill, near Bristol, received telegrams of congratulation from both King George V and Sultan Taimur, then ruler of Oman.<br>He went on to lecture all over the world about the journey and to write a book called Arabia Felix.<br>Unlike Mr Evans, Thomas did not obtain permission for his expedition.<br>He said: "The biggest challenges for Thomas were warring tribes, lack of water in the waterholes and his total dependence on his Omani companion Sheikh Saleh to negotiate their way through the desert.<br>"The biggest challenge for those who wanted to make the crossing in recent decades has been obtaining government permissions to walk through this desolate and unknown territory."</code> | <code>An explorer has embarked on a challenge to become only the third British person in history to cross the largest sand desert in the world.</code> |
- | <code>An Olympic gold medallist, he was also three-time world heavyweight champion and took part in some of the most memorable fights in boxing history.<br>He had a professional career spanning 21 years and BBC Sport takes a look at his 61 fights in more detail.</code> | <code>Boxing legend Muhammad Ali, who died at the age of 74, became a sporting icon during his career.</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
@@ -796,9 +937,9 @@ You can finetune this model on your own dataset.
  - `per_device_eval_batch_size`: 18
  - `learning_rate`: 2e-05
  - `weight_decay`: 5e-07
- - `num_train_epochs`: 2
  - `lr_scheduler_type`: cosine_with_restarts
- - `lr_scheduler_kwargs`: {'num_cycles': 3}
  - `warmup_ratio`: 0.25
  - `save_safetensors`: False
  - `fp16`: True
@@ -826,10 +967,10 @@ You can finetune this model on your own dataset.
  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
- - `num_train_epochs`: 2
  - `max_steps`: -1
  - `lr_scheduler_type`: cosine_with_restarts
- - `lr_scheduler_kwargs`: {'num_cycles': 3}
  - `warmup_ratio`: 0.25
  - `warmup_steps`: 0
  - `log_level`: passive
@@ -922,57 +1063,57 @@ You can finetune this model on your own dataset.
  </details>

  ### Training Logs
- | Epoch | Step | Training Loss | nli-pairs loss | qnli-contrastive loss | scitail-pairs-pos loss | sts-test_spearman_cosine |
- |:------:|:----:|:-------------:|:--------------:|:---------------------:|:----------------------:|:------------------------:|
- | 0.0503 | 141 | 8.9676 | - | - | - | - |
- | 0.1006 | 282 | 7.1505 | - | - | - | - |
- | 0.1510 | 423 | 6.5458 | - | - | - | - |
- | 0.2002 | 561 | - | 5.2347 | 5.0924 | 3.0123 | - |
- | 0.2013 | 564 | 5.2862 | - | - | - | - |
- | 0.2516 | 705 | 4.31 | - | - | - | - |
- | 0.3019 | 846 | 3.904 | - | - | - | - |
- | 0.3522 | 987 | 3.3312 | - | - | - | - |
- | 0.4004 | 1122 | - | 2.8461 | 3.3017 | 1.8818 | - |
- | 0.4026 | 1128 | 3.4991 | - | - | - | - |
- | 0.4529 | 1269 | 3.305 | - | - | - | - |
- | 0.5032 | 1410 | 3.0787 | - | - | - | - |
- | 0.5535 | 1551 | 3.0456 | - | - | - | - |
- | 0.6006 | 1683 | - | 2.2152 | 2.6721 | 1.5917 | - |
- | 0.6039 | 1692 | 2.886 | - | - | - | - |
- | 0.6542 | 1833 | 2.9191 | - | - | - | - |
- | 0.7045 | 1974 | 2.7596 | - | - | - | - |
- | 0.7548 | 2115 | 3.0015 | - | - | - | - |
- | 0.8009 | 2244 | - | 1.9579 | 2.1218 | 1.4780 | - |
- | 0.8051 | 2256 | 2.6781 | - | - | - | - |
- | 0.8555 | 2397 | 2.6899 | - | - | - | - |
- | 0.9058 | 2538 | 2.5374 | - | - | - | - |
- | 0.9561 | 2679 | 2.9215 | - | - | - | - |
- | 1.0011 | 2805 | - | 1.8911 | 2.2687 | 1.4384 | - |
- | 1.0064 | 2820 | 2.9894 | - | - | - | - |
- | 1.0567 | 2961 | 2.67 | - | - | - | - |
- | 1.1071 | 3102 | 2.7954 | - | - | - | - |
- | 1.1574 | 3243 | 2.6645 | - | - | - | - |
- | 1.2013 | 3366 | - | 1.8474 | 2.0989 | 1.3498 | - |
- | 1.2077 | 3384 | 2.5357 | - | - | - | - |
- | 1.2580 | 3525 | 2.099 | - | - | - | - |
- | 1.3084 | 3666 | 2.2678 | - | - | - | - |
- | 1.3587 | 3807 | 2.0013 | - | - | - | - |
- | 1.4015 | 3927 | - | 1.7168 | 1.8503 | 1.2985 | - |
- | 1.4090 | 3948 | 2.2268 | - | - | - | - |
- | 1.4593 | 4089 | 2.2645 | - | - | - | - |
- | 1.5096 | 4230 | 1.8598 | - | - | - | - |
- | 1.5600 | 4371 | 2.1624 | - | - | - | - |
- | 1.6017 | 4488 | - | 1.7209 | 1.6492 | 1.2805 | - |
- | 1.6103 | 4512 | 2.0678 | - | - | - | - |
- | 1.6606 | 4653 | 2.1483 | - | - | - | - |
- | 1.7109 | 4794 | 2.2059 | - | - | - | - |
- | 1.7612 | 4935 | 2.3824 | - | - | - | - |
- | 1.8019 | 5049 | - | 1.6013 | 1.6620 | 1.2365 | - |
- | 1.8116 | 5076 | 2.1792 | - | - | - | - |
- | 1.8619 | 5217 | 2.1 | - | - | - | - |
- | 1.9122 | 5358 | 2.1818 | - | - | - | - |
- | 1.9625 | 5499 | 2.6552 | - | - | - | - |
- | 2.0 | 5604 | - | - | - | - | 0.7592 |


  ### Framework Versions
 
  type: sts-test
  metrics:
  - type: pearson_cosine
+ value: 0.566653720937157
  name: Pearson Cosine
  - type: spearman_cosine
+ value: 0.5551442914704277
  name: Spearman Cosine
  - type: pearson_manhattan
+ value: 0.5771354814213894
  name: Pearson Manhattan
  - type: spearman_manhattan
+ value: 0.5723970841918167
  name: Spearman Manhattan
  - type: pearson_euclidean
+ value: 0.5619024776733639
  name: Pearson Euclidean
  - type: spearman_euclidean
+ value: 0.5593253322063549
  name: Spearman Euclidean
  - type: pearson_dot
+ value: 0.23527108587659004
  name: Pearson Dot
  - type: spearman_dot
+ value: 0.24219982461742934
  name: Spearman Dot
  - type: pearson_max
+ value: 0.5771354814213894
  name: Pearson Max
  - type: spearman_max
+ value: 0.5723970841918167
+ name: Spearman Max
+ - type: pearson_cosine
+ value: 0.566653720937157
+ name: Pearson Cosine
+ - type: spearman_cosine
+ value: 0.5551442914704277
+ name: Spearman Cosine
+ - type: pearson_manhattan
+ value: 0.5771354814213894
+ name: Pearson Manhattan
+ - type: spearman_manhattan
+ value: 0.5723970841918167
+ name: Spearman Manhattan
+ - type: pearson_euclidean
+ value: 0.5619024776733639
+ name: Pearson Euclidean
+ - type: spearman_euclidean
+ value: 0.5593253322063549
+ name: Spearman Euclidean
+ - type: pearson_dot
+ value: 0.23527108587659004
+ name: Pearson Dot
+ - type: spearman_dot
+ value: 0.24219982461742934
+ name: Spearman Dot
+ - type: pearson_max
+ value: 0.5771354814213894
+ name: Pearson Max
+ - type: spearman_max
+ value: 0.5723970841918167
+ name: Spearman Max
+ - type: pearson_cosine
+ value: 0.566653720937157
+ name: Pearson Cosine
+ - type: spearman_cosine
+ value: 0.5551442914704277
+ name: Spearman Cosine
+ - type: pearson_manhattan
+ value: 0.5771354814213894
+ name: Pearson Manhattan
+ - type: spearman_manhattan
+ value: 0.5723970841918167
+ name: Spearman Manhattan
+ - type: pearson_euclidean
+ value: 0.5619024776733639
+ name: Pearson Euclidean
+ - type: spearman_euclidean
+ value: 0.5593253322063549
+ name: Spearman Euclidean
+ - type: pearson_dot
+ value: 0.23527108587659004
+ name: Pearson Dot
+ - type: spearman_dot
+ value: 0.24219982461742934
+ name: Spearman Dot
+ - type: pearson_max
+ value: 0.5771354814213894
+ name: Pearson Max
+ - type: spearman_max
+ value: 0.5723970841918167
+ name: Spearman Max
+ - type: pearson_cosine
+ value: 0.566653720937157
+ name: Pearson Cosine
+ - type: spearman_cosine
+ value: 0.5551442914704277
+ name: Spearman Cosine
+ - type: pearson_manhattan
+ value: 0.5771354814213894
+ name: Pearson Manhattan
+ - type: spearman_manhattan
+ value: 0.5723970841918167
+ name: Spearman Manhattan
+ - type: pearson_euclidean
+ value: 0.5619024776733639
+ name: Pearson Euclidean
+ - type: spearman_euclidean
+ value: 0.5593253322063549
+ name: Spearman Euclidean
+ - type: pearson_dot
+ value: 0.23527108587659004
+ name: Pearson Dot
+ - type: spearman_dot
+ value: 0.24219982461742934
+ name: Spearman Dot
+ - type: pearson_max
+ value: 0.5771354814213894
+ name: Pearson Max
+ - type: spearman_max
+ value: 0.5723970841918167
  name: Spearman Max
  ---
 
 

  | Metric | Value |
  |:--------------------|:-----------|
+ | pearson_cosine | 0.5667 |
+ | **spearman_cosine** | **0.5551** |
+ | pearson_manhattan | 0.5771 |
+ | spearman_manhattan | 0.5724 |
+ | pearson_euclidean | 0.5619 |
+ | spearman_euclidean | 0.5593 |
+ | pearson_dot | 0.2353 |
+ | spearman_dot | 0.2422 |
+ | pearson_max | 0.5771 |
+ | spearman_max | 0.5724 |
+
+ #### Semantic Similarity
+ * Dataset: `sts-test`
+ * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+ | Metric | Value |
+ |:--------------------|:-----------|
+ | pearson_cosine | 0.5667 |
+ | **spearman_cosine** | **0.5551** |
+ | pearson_manhattan | 0.5771 |
+ | spearman_manhattan | 0.5724 |
+ | pearson_euclidean | 0.5619 |
+ | spearman_euclidean | 0.5593 |
+ | pearson_dot | 0.2353 |
+ | spearman_dot | 0.2422 |
+ | pearson_max | 0.5771 |
+ | spearman_max | 0.5724 |
+
+ #### Semantic Similarity
+ * Dataset: `sts-test`
+ * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+ | Metric | Value |
+ |:--------------------|:-----------|
+ | pearson_cosine | 0.5667 |
+ | **spearman_cosine** | **0.5551** |
+ | pearson_manhattan | 0.5771 |
+ | spearman_manhattan | 0.5724 |
+ | pearson_euclidean | 0.5619 |
+ | spearman_euclidean | 0.5593 |
+ | pearson_dot | 0.2353 |
+ | spearman_dot | 0.2422 |
+ | pearson_max | 0.5771 |
+ | spearman_max | 0.5724 |
+
+ #### Semantic Similarity
+ * Dataset: `sts-test`
+ * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+ | Metric | Value |
+ |:--------------------|:-----------|
+ | pearson_cosine | 0.5667 |
+ | **spearman_cosine** | **0.5551** |
+ | pearson_manhattan | 0.5771 |
+ | spearman_manhattan | 0.5724 |
+ | pearson_euclidean | 0.5619 |
+ | spearman_euclidean | 0.5593 |
+ | pearson_dot | 0.2353 |
+ | spearman_dot | 0.2422 |
+ | pearson_max | 0.5771 |
+ | spearman_max | 0.5724 |

  <!--
  ## Bias, Risks and Limitations
 
457
  * Size: 3,194 training samples
458
  * Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
459
  * Approximate statistics based on the first 1000 samples:
460
+ | | label | sentence1 | sentence2 |
461
+ |:--------|:-----------------------------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
462
+ | type | int | string | string |
463
+ | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.8 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 38.29 tokens</li><li>max: 512 tokens</li></ul> |
464
  * Samples:
465
+ | label | sentence1 | sentence2 |
466
+ |:---------------|:---------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
467
+ | <code>1</code> | <code>Kyle Kendricks was otherwise called the Professor .</code> | <code>`` Chicago Cubs ( �present ) } } Kyle Christian Hendricks ( born December 7 , 1989 ) , nicknamed `` '' The Proffessor , '' '' is an American professional baseball pitcher for the Chicago Cubs of Major League Baseball ( MLB ) . ''</code> |
468
+ | <code>1</code> | <code>Since 1982 , 533 people have been executed in Texas .</code> | <code>Since the death penalty was re-instituted in the United States with the 1976 Gregg v. Georgia decision , Texas has executed more inmates than any other state , beginning in 1982 with the execution of Charles Brooks , Jr.. Since 1982 , 533 people have been executed in Texas. 1923 , the Texas Department of Criminal Justice ( TDCJ ) has been in charge of executions in the state .</code> |
469
+ | <code>1</code> | <code>Hilltop Hoods have released two `` restrung '' albums .</code> | <code>`` The group released its first extended play , Back Once Again , in 1997 and have subsequently released seven studio albums , two `` '' restrung '' '' albums and three DVDs . ''</code> |
470
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
471
  ```json
472
  {
 
485
  * Size: 4,000 training samples
486
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
487
  * Approximate statistics based on the first 1000 samples:
488
+ | | sentence1 | sentence2 | label |
489
+ |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------|
490
+ | type | string | string | int |
491
+ | details | <ul><li>min: 6 tokens</li><li>mean: 13.79 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 35.8 tokens</li><li>max: 499 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
492
  * Samples:
493
+ | sentence1 | sentence2 | label |
494
+ |:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
+ | <code>Vinters have adopted solar technology to do what?</code> | <code>More recently the technology has been embraced by vinters, who use the energy generated by solar panels to power grape presses.</code> | <code>0</code> |
+ | <code>Who did Madonna's look and style of dressing influence?</code> | <code>It attracted the attention of organizations who complained that the song and its accompanying video promoted premarital sex and undermined family values, and moralists sought to have the song and video banned.</code> | <code>0</code> |
+ | <code>In addition to hearing him play, what else did people seek from Chopin in London?</code> | <code>The Prince, who was himself a talented musician, moved close to the keyboard to view Chopin's technique.</code> | <code>0</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
 
  | | sentence2 | sentence1 |
  |:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type | string | string |
+ | details | <ul><li>min: 7 tokens</li><li>mean: 16.0 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.71 tokens</li><li>max: 34 tokens</li></ul> |
  * Samples:
+ | sentence2 | sentence1 |
+ |:--------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
+ | <code>The fetal period lasts approximately 30 weeks weeks.</code> | <code>Approximately how many weeks does the fetal period last?</code> |
+ | <code>Corals build hard exoskeletons that grow to become coral reefs.</code> | <code>Corals build hard exoskeletons that grow to become what?</code> |
+ | <code>A voltaic cell generates an electric current through a reaction known as a(n) spontaneous redox.</code> | <code>A voltaic cell uses what type of reaction to generate an electric current</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
 
  * Size: 2,200 training samples
  * Columns: <code>sentence1</code> and <code>sentence2</code>
  * Approximate statistics based on the first 1000 samples:
+ | | sentence1 | sentence2 |
+ |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
+ | type | string | string |
+ | details | <ul><li>min: 7 tokens</li><li>mean: 23.76 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.27 tokens</li><li>max: 41 tokens</li></ul> |
  * Samples:
+ | sentence1 | sentence2 |
+ |:-----------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------|
+ | <code>As the water vapor cools, it condenses , forming tiny droplets in clouds.</code> | <code>Clouds are formed from water droplets.</code> |
+ | <code>Poison ivy is green, with three leaflets on each leaf, grows as a shrub or vine, and may be in your yard.</code> | <code>Poison ivy typically has three groups of leaves.</code> |
+ | <code>(Formic acid is the poison found in the > sting of fire ants.)</code> | <code>Formic acid is found in the secretions of stinging ants.</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
 
  * Size: 2,500 training samples
  * Columns: <code>sentence1</code> and <code>sentence2</code>
  * Approximate statistics based on the first 1000 samples:
+ | | sentence1 | sentence2 |
+ |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
+ | type | string | string |
+ | details | <ul><li>min: 14 tokens</li><li>mean: 345.33 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 27.11 tokens</li><li>max: 60 tokens</li></ul> |
  * Samples:
+ | sentence1 | sentence2 |
+ |:------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------|
+ | <code>Rahim Kalantar told the BBC his son Ali, 18, travelled to Syria with two friends from Coventry in March and believed he was now fighting with Isis.<br>He said he was sent "down this road" by an imam - who denied the allegations.<br>Up to 500 Britons are thought to have travelled to the Middle East to fight in the conflict, officials say.<br>Mr Kalantar - speaking to BBC Two's Newsnight, in collaboration with the BBC's Afghan Service and Newsday - said he worries about his son Ali "every minute" and that his grief is "limitless".<br>He said he believed Ali - who was planning to study computer science at university - had been radicalised during classes at a mosque after evening prayer.<br>"He [the imam] encouraged them and sent them down this road," he said.<br>The BBC contacted the mosque to speak to the imam, who refused to give an interview but said he completely denied the allegations.<br>Ali is believed to have travelled to Syria with Rashed Amani, also 18, who had been studying business at Coventry University.<br>Rashed's father, Khabir, said family members had travelled to the Turkish-Syrian border in the hope of finding the boys, but came back "empty-handed" after searching for more than two weeks.<br>He said he did not know what had happened to his son, who he fears has joined Isis - the militant-led group that has made rapid advances through Iraq in recent weeks.<br>"Maybe somebody worked with him, I don't know. Maybe somebody brainwashed him because he was not like that," he said.<br>The third teenager, Moh Ismael, is also believed to be in Syria with his friends. He is understood to have posted a message on Twitter saying he was with Isis.<br>It comes after Britons - including Reyaad Khan and Nasser Muthana from Cardiff - featured in an apparent recruitment video for jihadists in Iraq and Syria.<br>The video was posted online on Friday by accounts with links to Isis.<br>The BBC has learned a third Briton in the video is from Aberdeen. The man, named locally as Raqib, grew up in Scotland but was originally from Bangladesh.<br>Lord Carlile, a former independent reviewer of terrorism laws, told the BBC that the Muslim community was best placed to stop jihadists recruiting in the UK.<br>The Liberal Democrat peer also said the UK needed to reintroduce tougher measures to stop terrorism.<br>It comes after former MI6 director, Richard Barrett, said security services would not be able to track all Britons who return to the UK after fighting in Syria.<br>He said the number of those posing a threat would be small but unpredictable.<br>The Metropolitan Police has insisted it has the tools to monitor British jihadists returning from that country.<br>Shiraz Maher, a radicalisation expert, told Newsnight that social media was now acting as a recruitment ground for potential jihadists in the UK.<br>"You have hundreds of foreign fighters on the ground who in real time are giving you a live feed of what is happening and they are engaged in a conversation.<br>"It is these individual people who have been empowered to become recruiters in their own right," he said.<br>Lord Carlile said the "most important partners" in preventing young Muslims from being radicalised were the "Muslim communities themselves".<br>"Mothers, wives, sisters do not want their husbands, brothers, sons to become valid jihadists and run the risk of being killed in a civil war," he told the programme.<br>He also told BBC Radio 4's World at One programme that the government should look at reintroducing "something like control orders", which were scrapped in 2011 and replaced with the less restrictive Terrorism Prevention and Investigation Measures (TPims).<br>He said: "We need to look at preventing violent extremism before people leave the country and also we need to look for further measures."</code> | <code>The father of a British teenager who travelled to Syria to join jihadists believes his son was radicalised by an imam at a UK mosque.</code> |
+ | <code>Jawad Fairooz and Matar Matar were detained in May after resigning from parliament in protest at the handling of the protests.<br>Mr Matar told the BBC they had been tortured in prison.<br>They were prosecuted in a security court on charges of taking part in illegal protests and defaming the country.<br>It is not clear if they still face trial in a civilian court.<br>Civilian courts took over jurisdiction after King Hamad Bin Issa Al Khalifa lifted a state of emergency in June.<br>Mr Matar told the BBC he believed his arrest had been intended to put a pressure on his al-Wifaq party.<br>"At some stages we were tortured," he said. "In one of the cases we were beaten."<br>Human rights lawyer Mohamed al-Tajir was also released.<br>He was detained in April having defended people arrested during the Saudi-backed suppression of protests in March.<br>Correspondents say their release appears to be an attempt at defusing tensions in the country, a key US ally in the region that hosts the US Navy's 5th Fleet.<br>Bahrain's King Hamad Bin Issa Al Khalifa recently accepted a series of reforms drawn up by a government-backed committee created to address grievances that emerged during the protests.<br>The kingdom's Shia community makes up about 70% of the population but many say they are discriminated against by the minority Sunni monarchy.</code> | <code>Bahrain has freed two former Shia opposition MPs arrested in the wake of widespread anti-government protests.</code> |
+ | <code>Liverpool City Region, in case you were wondering, includes Merseyside's five councils (Knowsley, Liverpool, Sefton, St Helens, and Wirral) as well as Halton in Cheshire.<br>Who are the eight candidates desperate for your support on 4 May, though, and what are their priorities?<br>BBC Radio Merseyside's political reporter Claire Hamilton has produced a potted biography for each of them.<br>We're also asking all of them for a "minute manifesto" video.<br>Candidates are listed below in alphabetical order<br>Roger Bannister, Trade Union & Socialist Coalition<br>Veteran trade unionist Roger Bannister believes the Liverpool City Region Combined Authority should never have approved the contract for a fleet of new driver-only Merseyrail trains. He says he would seek to reverse this decision. He also believes local authorities have passed harmful austerity budgets on people struggling to make ends meet. He stood for Liverpool city mayor in 2016, coming fourth with 5% of the vote.<br>Paul Breen, Get the Coppers off the Jury<br>Paul Breen is a resident of Norris Green, Liverpool and became the last candidate to be nominated. He is listed as treasurer of the party on the Electoral Commission's website, with Patricia Breen listed as deputy treasurer. He has not yet released any material detailing his manifesto but told the BBC the title of his campaign speaks for itself. He simply does not believe that police officers should be allowed to serve on juries.<br>Mr Breen declined to provide a "minute manifesto"<br>Tony Caldeira, Conservative<br>Born in Liverpool and educated in St Helens, Tony Caldeira started out working on a stall selling cushions made by his mother at Liverpool's Great Homer Street market. His business expanded and now operates in Kirkby, distributing world-wide. Mr Caldeira has stood for Liverpool mayor twice, coming sixth in 2016 with just under 4% of the vote. He has pledged to improve the area's transport network, speed up the planning process and build homes and workplaces on brownfield sites rather than green spaces.<br>Carl Cashman, Liberal Democrats<br>Born in Whiston, Knowsley, Carl Cashman is leader of the Liberal Democrat group on Knowsley Council. He and his two Lib Dem council colleagues were elected in 2016, breaking a four-year period when Labour was the only party represented. Aged 25, he's the youngest of the candidates. Mr Cashman believes maintaining strong ties with Europe and the region will be key, and has pledged to open a Liverpool City Region embassy in Brussels. He also wants to better integrate ticketing across public transport and make the current Walrus card more similar to the Oyster card used by Londoners.<br>Tom Crone, Green Party<br>Tom Crone is leader of the Green group on Liverpool City Council. He won 10% of the vote in the mayoral elections in Liverpool in 2016 and came third. Originally from Norwich, he has lived in Liverpool since 2000 after arriving as a student. Mr Crone is keen to see a shift away from traditional heavy industry in the city region towards greener "tech" industries. He's also passionate about making public transport more affordable and environmentally friendly. He says he'll look to prioritise new routes for cyclists and pedestrians.<br>Tabitha Morton, Women's Equality Party<br>Tabitha Morton was born in Netherton, Sefton. She left school with no formal qualifications, and started work at 16 at a local market, and later in cleaning. She was taken on for NVQ training by a company in Liverpool, and stayed on to train others. She now works for a global manufacturer, in what she describes as "a male-dominated industry". She says she would prioritise grants for employers offering equal apprenticeships for young women and men and ring-fence funds for training women in sectors in which they're underrepresented.<br>Steve Rotheram, Labour<br>Born in Kirkby, former bricklayer Steve Rotheram was a city councillor in Liverpool and also Lord Mayor during the city's European Capital of Culture year in 2008. He was also elected MP for Liverpool Walton in 2010, and re-elected to the seat in 2015. Mr Rotheram is pledging to cut the cost of the fast tag for motorists driving through the Mersey tunnels. He wants to improve education and offer better careers advice for young people, and also wants to make brownfield sites more attractive to developers.<br>Paula Walters, UKIP<br>Wallasey-born Paula Walters is chairman of UKIP in Wirral and lives in New Brighton with her family. She has campaigned to scrap tunnel tolls for several years. She says her local UKIP branch is one of the most thriving in the North West. A civil servant, she studied English and biomolecular science at degree-level. She has also lived in South Africa where she attended the University of Pretoria. She believes Liverpool city centre has attracted money at the expense of outlying areas, one of the things she wants to tackle.</code> | <code>Those hoping to become the first mayor of the Liverpool City Region have less than a month remaining in which to secure your vote.</code> |
  * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
 
  - `per_device_eval_batch_size`: 18
  - `learning_rate`: 2e-05
  - `weight_decay`: 5e-07
+ - `num_train_epochs`: 4
  - `lr_scheduler_type`: cosine_with_restarts
+ - `lr_scheduler_kwargs`: {'num_cycles': 5}
  - `warmup_ratio`: 0.25
  - `save_safetensors`: False
  - `fp16`: True
 
  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 4
  - `max_steps`: -1
  - `lr_scheduler_type`: cosine_with_restarts
+ - `lr_scheduler_kwargs`: {'num_cycles': 5}
  - `warmup_ratio`: 0.25
  - `warmup_steps`: 0
  - `log_level`: passive
 
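The scheduler arguments above (`lr_scheduler_type: cosine_with_restarts` with `lr_scheduler_kwargs: {'num_cycles': 5}` and `warmup_ratio: 0.25`) produce a learning rate that warms up linearly, then decays along a cosine curve that restarts from the full rate five times. The trainer computes this internally; the helper below is only an illustrative sketch of the multiplier shape (it assumes the standard warmup-plus-hard-restarts formula, and the `total` step count is taken from the training log):

```python
import math

def lr_multiplier(step, total_steps, warmup_ratio=0.25, num_cycles=5):
    """Illustrative multiplier applied to the base LR (2e-05 here) at a step."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return step / max(1, warmup_steps)  # linear warmup from 0 to 1
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # num_cycles cosine segments, each restarting from the full learning rate
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

total = 11208  # total optimizer steps over 4 epochs, per the training log
print(round(lr_multiplier(0, total), 3))                 # start of warmup -> 0.0
print(round(lr_multiplier(int(total * 0.25), total), 3)) # end of warmup -> 1.0
```

With a 0.25 warmup ratio, the first ~2802 of the 11208 steps are spent warming up, leaving roughly 1681 steps per cosine cycle.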
  </details>
 
  ### Training Logs
+ | Epoch | Step | Training Loss | scitail-pairs-pos loss | qnli-contrastive loss | nli-pairs loss | sts-test_spearman_cosine |
+ |:------:|:-----:|:-------------:|:----------------------:|:---------------------:|:--------------:|:------------------------:|
+ | 0.1003 | 281 | 8.4339 | - | - | - | - |
+ | 0.2006 | 562 | 6.8644 | - | - | - | - |
+ | 0.3009 | 843 | 5.1225 | - | - | - | - |
+ | 0.4001 | 1121 | - | 2.4070 | 4.2827 | 3.6032 | - |
+ | 0.4011 | 1124 | 3.9997 | - | - | - | - |
+ | 0.5014 | 1405 | 3.6186 | - | - | - | - |
+ | 0.6017 | 1686 | 3.259 | - | - | - | - |
+ | 0.7020 | 1967 | 3.1712 | - | - | - | - |
+ | 0.8001 | 2242 | - | 1.6090 | 2.5195 | 2.2851 | - |
+ | 0.8023 | 2248 | 3.104 | - | - | - | - |
+ | 0.9026 | 2529 | 2.8549 | - | - | - | - |
+ | 1.0029 | 2810 | 2.8668 | - | - | - | - |
+ | 1.1031 | 3091 | 2.7466 | - | - | - | - |
+ | 1.2002 | 3363 | - | 1.3474 | 2.2222 | 1.8491 | - |
+ | 1.2034 | 3372 | 2.6502 | - | - | - | - |
+ | 1.3037 | 3653 | 2.2191 | - | - | - | - |
+ | 1.4040 | 3934 | 2.2311 | - | - | - | - |
+ | 1.5043 | 4215 | 2.22 | - | - | - | - |
+ | 1.6003 | 4484 | - | 1.2671 | 1.7964 | 1.6444 | - |
+ | 1.6046 | 4496 | 2.1372 | - | - | - | - |
+ | 1.7049 | 4777 | 2.2219 | - | - | - | - |
+ | 1.8051 | 5058 | 2.2618 | - | - | - | - |
+ | 1.9054 | 5339 | 1.9995 | - | - | - | - |
+ | 2.0004 | 5605 | - | 1.2434 | 1.8182 | 1.5385 | - |
+ | 2.0057 | 5620 | 1.9757 | - | - | - | - |
+ | 2.1060 | 5901 | 2.0401 | - | - | - | - |
+ | 2.2063 | 6182 | 1.9818 | - | - | - | - |
+ | 2.3066 | 6463 | 1.7816 | - | - | - | - |
+ | 2.4004 | 6726 | - | 1.0396 | 1.5587 | 1.5077 | - |
+ | 2.4069 | 6744 | 1.9239 | - | - | - | - |
+ | 2.5071 | 7025 | 2.0148 | - | - | - | - |
+ | 2.6074 | 7306 | 1.9629 | - | - | - | - |
+ | 2.7077 | 7587 | 1.7316 | - | - | - | - |
+ | 2.8005 | 7847 | - | 1.0507 | 1.3294 | 1.4039 | - |
+ | 2.8080 | 7868 | 1.7794 | - | - | - | - |
+ | 2.9083 | 8149 | 1.7029 | - | - | - | - |
+ | 3.0086 | 8430 | 1.7996 | - | - | - | - |
+ | 3.1089 | 8711 | 1.9379 | - | - | - | - |
+ | 3.2006 | 8968 | - | 0.9949 | 1.3678 | 1.3436 | - |
+ | 3.2091 | 8992 | 1.844 | - | - | - | - |
+ | 3.3094 | 9273 | 1.358 | - | - | - | - |
+ | 3.4097 | 9554 | 1.5104 | - | - | - | - |
+ | 3.5100 | 9835 | 1.6964 | - | - | - | - |
+ | 3.6006 | 10089 | - | 0.9538 | 1.1866 | 1.3098 | - |
+ | 3.6103 | 10116 | 1.7661 | - | - | - | - |
+ | 3.7106 | 10397 | 1.6529 | - | - | - | - |
+ | 3.8108 | 10678 | 1.6835 | - | - | - | - |
+ | 3.9111 | 10959 | 1.35 | - | - | - | - |
+ | 4.0 | 11208 | - | - | - | - | 0.5551 |
 
 
  ### Framework Versions
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:1b66a67027ae4f996acef0b39dd13825ed90459f841c056fbfc69d972c2597f3
- size 565251810
+ oid sha256:eb7927d6446f814065535e12fc14d9042e73ca86dc4ebdd235b5668414cc9613
+ size 451824288