---
language:
- en
library_name: sentence-transformers
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:187790
- loss:AdaptiveLayerLoss
- loss:CoSENTLoss
- loss:GISTEmbedLoss
- loss:OnlineContrastiveLoss
- loss:MultipleNegativesSymmetricRankingLoss
base_model: microsoft/deberta-v3-small
datasets:
- sentence-transformers/all-nli
- sentence-transformers/stsb
- tals/vitaminc
- nyu-mll/glue
- allenai/scitail
- sentence-transformers/xsum
- sentence-transformers/sentence-compression
- allenai/sciq
- allenai/qasc
- allenai/openbookqa
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/quora-duplicates
- sentence-transformers/gooaq
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
widget:
- source_sentence: What are predators?
  sentences:
  - a windmill does not create pollution
  - fire causes burning
  - carnivores are predators
- source_sentence: A man in a black shirt is playing a guitar.
  sentences:
  - The man is wearing black.
  - There are people on the subway.
  - Boy dressed in blue holds a toy.
- source_sentence: The Infrared Detector Laboratory built the Near Infrared Camera
    and Multi-Object Spectrometer (NICMOS) instrument for the Hubble Space Telescope
    and the Multiband Imaging Photometer (MIPS) instrument for the Spitzer Space Telescope.
  sentences:
  - Mollusks can be divided into seven classes.
  - A telescope is used to make objects in space appear closer.
  - The energy content of foods is often expressed in calories.
- source_sentence: Glucocorticoids and mineralocorticoids are the two main types of
    corticosteroids in humans.
  sentences:
  - Glucocorticoids and mineralocorticoids are the two main types of what in humans?
  - The thick skin, found only on the palms of the hands and the soles of the feet,
    has an extra what?
  - Human evolution shows that evolutionary changes typically occur at what pace?
- source_sentence: do yellow finches change color in the winter
  sentences:
  - Hello, Dolly! (musical) The role of Dolly Levi in the musical was originally written
    for Ethel Merman, but Merman turned it down, as did Mary Martin (although each
    eventually played it).[3] Merrick then auditioned Nancy Walker. Eventually, he
    hired Carol Channing, who then created in Dolly her signature role.[5] Director
    Gower Champion was not the producer's first choice, as Hal Prince and others (among
    them Jerome Robbins and Joe Layton) all turned down the job of directing the musical.[6]
  - American goldfinch Once the spring molt is complete, the body of the male is a
    brilliant lemon yellow, a color produced by carotenoid pigments from plant materials
    in its diet,[18] with a striking jet black cap and white rump that is visible
    during flight.[19] The female is mostly brown, lighter on the underside with a
    yellow bib.[17] After the autumn molt, the bright summer feathers are replaced
    by duller plumage, becoming buff below and olive-brown above, with a pale yellow
    face and bib. The autumn plumage is almost identical in both sexes, but the male
    has yellow shoulder patches.[20] In some winter ranges, the goldfinches lose all
    traces of yellow, becoming a predominantly medium tan-gray color with an olive
    tinge evident only on close viewing.
  - Wide boy An early use of the term was in the 1933 film "Friday the Thirteenth",
    where the character played by Max Miller, a loud, quick-witted, Cockney market
    trader, is heard to say "I'm the widest boy ever put on a pair of shoes!"
pipeline_tag: sentence-similarity
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.7699051255542444
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.7830173030447911
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.76091917538426
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.7581350404978112
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.757452845704679
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.7540269111333524
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.46895647260006346
      name: Pearson Dot
    - type: spearman_dot
      value: 0.55297440791417
      name: Spearman Dot
    - type: pearson_max
      value: 0.7699051255542444
      name: Pearson Max
    - type: spearman_max
      value: 0.7830173030447911
      name: Spearman Max
---

# SentenceTransformer based on microsoft/deberta-v3-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli), [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum), [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), [openbookqa_pairs](https://huggingface.co/datasets/allenai/openbookqa), [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) and [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) <!-- at revision a36c739020e01763fe789b4b85e2df55d6180012 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli)
    - [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb)
    - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
    - [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue)
    - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
    - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
    - [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum)
    - [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
    - [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
    - [openbookqa_pairs](https://huggingface.co/datasets/allenai/openbookqa)
    - [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
    - [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates)
    - [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
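
For reference, the pooling stage above (mean pooling over token embeddings) can be reproduced with `transformers` directly. The sketch below is a minimal illustration of that equivalence, using the base checkpoint rather than this fine-tuned model:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Base checkpoint shown for illustration; this fine-tuned model pools the same way.
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")
encoder = AutoModel.from_pretrained("microsoft/deberta-v3-small")

def mean_pool(last_hidden_state, attention_mask):
    # Zero out padding positions, then average over the real tokens only.
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["A plane is taking off."], padding=True, truncation=True,
                  max_length=512, return_tensors="pt")
with torch.no_grad():
    out = encoder(**batch)
embedding = mean_pool(out.last_hidden_state, batch["attention_mask"])  # shape: (1, 768)
```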

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp")
# Run inference
sentences = [
    'do yellow finches change color in the winter',
    'American goldfinch Once the spring molt is complete, the body of the male is a brilliant lemon yellow, a color produced by carotenoid pigments from plant materials in its diet,[18] with a striking jet black cap and white rump that is visible during flight.[19] The female is mostly brown, lighter on the underside with a yellow bib.[17] After the autumn molt, the bright summer feathers are replaced by duller plumage, becoming buff below and olive-brown above, with a pale yellow face and bib. The autumn plumage is almost identical in both sexes, but the male has yellow shoulder patches.[20] In some winter ranges, the goldfinches lose all traces of yellow, becoming a predominantly medium tan-gray color with an olive tinge evident only on close viewing.',
    'Wide boy An early use of the term was in the 1933 film "Friday the Thirteenth", where the character played by Max Miller, a loud, quick-witted, Cockney market trader, is heard to say "I\'m the widest boy ever put on a pair of shoes!"',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
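
Because the model scores sentence pairs by cosine similarity, semantic search falls out directly: embed a corpus once, then rank it against a query embedding. A small sketch using the widget examples above as a toy corpus:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp")

corpus = [
    "carnivores are predators",
    "a windmill does not create pollution",
    "fire causes burning",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode("What are predators?", convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    # Each hit is a dict with the corpus index and its cosine score.
    print(corpus[hit["corpus_id"]], hit["score"])
```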

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| pearson_cosine      | 0.7699    |
| **spearman_cosine** | **0.783** |
| pearson_manhattan   | 0.7609    |
| spearman_manhattan  | 0.7581    |
| pearson_euclidean   | 0.7575    |
| spearman_euclidean  | 0.754     |
| pearson_dot         | 0.469     |
| spearman_dot        | 0.553     |
| pearson_max         | 0.7699    |
| spearman_max        | 0.783     |
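
The table above can be reproduced by running the same evaluator against the STS-benchmark test split. A minimal sketch, assuming the column names of the `sentence-transformers/stsb` dataset:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp")

stsb = load_dataset("sentence-transformers/stsb", split="test")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-test",
)
# Prints the similarity metrics (a dict in recent sentence-transformers releases).
print(evaluator(model))
```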

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets

#### nli-pairs

* Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 15,000 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                        |
  |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                           |
  | details | <ul><li>min: 5 tokens</li><li>mean: 16.62 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.46 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
  | sentence1                                                                  | sentence2                                        |
  |:---------------------------------------------------------------------------|:-------------------------------------------------|
  | <code>A person on a horse jumps over a broken down airplane.</code>        | <code>A person is outdoors, on a horse.</code>   |
  | <code>Children smiling and waving at camera</code>                         | <code>There are children present</code>          |
  | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```
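
In code, this configuration corresponds to wrapping a `GISTEmbedLoss` in an `AdaptiveLayerLoss`. The sketch below maps the parameters above onto the library API; note that the guide model used for `GISTEmbedLoss` during training is not recorded in this card, so the `all-MiniLM-L6-v2` guide here is only a placeholder:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("microsoft/deberta-v3-small")
# Placeholder guide model: the actual guide used for training is not listed in this card.
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

inner_loss = losses.GISTEmbedLoss(model, guide)
train_loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=-1,      # -1: apply the loss to every layer at each step
    last_layer_weight=1.5,
    prior_layers_weight=0.75,
    kl_div_weight=1.25,
    kl_temperature=1.1,
)
```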

#### sts-label

* Dataset: [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308)
* Size: 5,749 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                        | score                                                          |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           | float                                                          |
  | details | <ul><li>min: 6 tokens</li><li>mean: 9.81 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.74 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence1                                                  | sentence2                                                             | score             |
  |:-----------------------------------------------------------|:----------------------------------------------------------------------|:------------------|
  | <code>A plane is taking off.</code>                        | <code>An air plane is taking off.</code>                              | <code>1.0</code>  |
  | <code>A man is playing a large flute.</code>               | <code>A man is playing a flute.</code>                                | <code>0.76</code> |
  | <code>A man is spreading shreded cheese on a pizza.</code> | <code>A man is spreading shredded cheese on an uncooked pizza.</code> | <code>0.76</code> |
* Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "pairwise_cos_sim"
  }
  ```
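
This is the CoSENT loss at its default settings, which ranks pairwise cosine similarities against the gold similarity scores. A one-line construction sketch:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("microsoft/deberta-v3-small")
# scale=20.0 and pairwise cosine similarity are the defaults shown above.
train_loss = losses.CoSENTLoss(model, scale=20.0)
```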

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 15,000 training samples
* Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | label                        | sentence1                                                                         | sentence2                                                                           |
  |:--------|:-----------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | int                          | string                                                                            | string                                                                              |
  | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 16.94 tokens</li><li>max: 88 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 39.24 tokens</li><li>max: 502 tokens</li></ul> |
* Samples:
  | label          | sentence1                                                                                          | sentence2                                                                                                                                                                                                                                                                                                                        |
  |:---------------|:---------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>1</code> | <code>Fantastic Four has a rating below 5 % .</code>                                               | <code>On Rotten Tomatoes , the film holds an approval rating of 4 % based on , with a weighted average rating of 3.5/10 .</code>                                                                                                                                                                                                 |
  | <code>1</code> | <code>The Proclaimers were guests on the Jeremy Kyle show .</code>                                 | <code>They are best known for their appearance on Jeremy Kyle where it was proved the duo were not actually brothers , they are also known for songs `` I 'm Gon na Be ( 500 Miles ) '' , `` Sunshine on Leith '' , `` I 'm On My Way '' and `` Letter from America '' , and their singing style with a Scottish accent .</code> |
  | <code>1</code> | <code>Maurice Harkless was interested in working with Puerto Rico former coach Paco Olmos .</code> | <code>On January 29 , 2014 , Harkless declared his interest in playing for Puerto Rico at the 2014 FIBA Basketball World Cup following a series of reunions with Puerto Rico former coach Paco Olmos .</code>                                                                                                                    |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### qnli-contrastive

* Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 15,000 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          | label                        |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------|
  | type    | string                                                                            | string                                                                             | int                          |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.75 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 35.51 tokens</li><li>max: 180 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
* Samples:
  | sentence1                                                                                                                         | sentence2                                                                                                                                                                                                                                     | label          |
  |:----------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
  | <code>Can real stacking be accomplished?</code>                                                                                   | <code>A mark handled this way will appear over whatever character precedes it, but will not adjust its position relative to the width or height of the base glyph; it may be visually awkward and it may overlap some glyphs.</code>          | <code>0</code> |
  | <code>Drug-resistant TB is one of the barriers to success of the Stop TB Partnership's initiative; what's the other other?</code> | <code>The World Health Organization declared TB a "global health emergency" in 1993, and in 2006, the Stop TB Partnership developed a Global Plan to Stop Tuberculosis that aims to save 14 million lives between its launch and 2015.</code> | <code>0</code> |
  | <code>In what year did the king demand ale-sellers post signage on pain of forfeiture?</code>                                     | <code>In 1393 King Richard II compelled landlords to erect signs outside their premises.</code>                                                                                                                                               | <code>0</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "OnlineContrastiveLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.25,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 5,
      "kl_temperature": 0.25
  }
  ```

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 14,537 training samples
* Columns: <code>sentence2</code> and <code>sentence1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence2                                                                         | sentence1                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 15.76 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.02 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
  | sentence2                                                            | sentence1                                                                       |
  |:---------------------------------------------------------------------|:--------------------------------------------------------------------------------|
  | <code>We call the solid form of hydrocarbons coal.</code>            | <code>What do we call the solid form of hydrocarbons?</code>                    |
  | <code>Blood flow decrease when blood vessels constrict.</code>       | <code>Does blood flow increase or decrease when blood vessels constrict?</code> |
  | <code>Exact wavelength determines the color of visible light.</code> | <code>What determines the color of visible light?</code>                        |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 8,600 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 8 tokens</li><li>mean: 23.09 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.55 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                 | sentence2                                                                                          |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------|
  | <code>Light waves bend when passing diagonally from one material to another because the speed of light changes slightly according to the density of the material it is traversing.</code> | <code>When light passes from one medium to another, it changes speed.</code>                       |
  | <code>Weight is the force exerted on the object by gravity.</code>                                                                                                                        | <code>Weight is the term for the measure of the force of gravity pulling down on an object.</code> |
  | <code>Then, if carbon dioxide is formed of one atom of carbon and two atoms of oxygen, the proportion must naturally consist of 3 parts of carbon to 8 of oxygen.</code>                  | <code>Carbon dioxide molecules consist of a central carbon atom bonded to two oxygen atoms.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### xsum-pairs

* Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206)
* Size: 4,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                            | sentence2                                                                         |
  |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                            |
  | details | <ul><li>min: 14 tokens</li><li>mean: 349.39 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 27.01 tokens</li><li>max: 67 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            | sentence2                                                                                                                    |
  |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------|
  | <code>Kerry Smith was selected for the South Basildon and Thurrock seat this week, after former Tory MP Neil Hamilton pulled out of the contest.<br>In a recording obtained by the Mail On Sunday, he is heard making offensive remarks about gay people, other UKIP members and Chigwell in Essex.<br>He later issued a "wholehearted and unreserved apology".<br>UKIP said the phone call was made some time ago, when Mr Smith had been prescribed sedatives after an injury.<br>During the recorded conversation, Mr Smith talks about UKIP's lesbian, gay, bisexual and transgender (LGBT) group and he can be heard jokingly referring to it as BLT UKIP, and adds "what the old poofter groups call themselves".<br>He jokes about "shooting peasants" from the Essex town of Chigwell and supporting "a peasant's hunt through Chigwell village".<br>Last week Mr Smith was chosen as UKIP's candidate for South Basildon and East Thurrock, a seat in which the party hopes to make a serious challenge.<br>A UKIP spokesman confirmed to the BBC that Mr Smith had apologised to the party's leader Nigel Farage for allegations made against him during the phone call - which Mr Smith has since retracted.<br>Patrick O'Flynn, UKIP MEP for the East of England, told BBC One's Sunday Politics that the phone call had been made "some time ago while he was on sedatives" and he had not been "speaking and thinking rationally".<br>He said the party's candidates had to "watch how to express themselves" adding: "What many people call political correctness is often just politeness and using derogatory terminology, pejorative slang is not right at this level of politics and you shouldn't do it."<br>He said Mr Smith was not homophobic but needed to "learn to express himself more respectfully about minorities of all kinds".<br>He pointed out that this week other parties had suffered gaffes by members - with Labour MP Frank Doran apologising for suggesting the post of fisheries minister was not a "job for a woman", while Conservative peer Baroness Jenkin of Kennington apologised for saying "poor people don't know how to cook".<br>"The hand grenades are rolling down the corridor.  We're still way up in the polls, we've had a fantastic year, we've won two by-elections."<br>"He's a young man he's learning politics - we also have to have a balance, we don't want to become so anodyne speaking in such non-colloquial language that we lose touch, and I think some other parties risk doing that.<br>"But clearly what he has said there is unacceptable - he's apologised unreservedly there.  There are big mitigating circumstances here, it was from some time ago, and so we are willing to judge him on his performance from now on."<br>In a statement made by a UKIP spokesman on his behalf, Mr Smith said: "I wish to issue a wholehearted and unreserved apology to those who I have offended within the party and anyone else.<br>"With regards to the leadership and management of the party I was completely wrong and my comments were fuelled by frustrations."<br>Former Conservative MP Neil Hamilton pulled out of the selection contest for South Basildon and East Thurrock after a letter from UKIP's finance committee was leaked to Channel 4 News querying Mr Hamilton's expenses claims for the party.<br>The former MP has suggested there is a "dirty tricks" campaign against him.<br>A report in the Financial Times said party officials were accusing one of its biggest funders of trying to pressure them into accepting Mr Hamilton's candidacy. 
He previously pulled out of another selection process, in Boston and Skegness.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   | <code>A UKIP candidate for one of the party's target seats has apologised for offensive remarks made in a phone call.</code> |
  | <code>Thousands of gaming fans voted for their favourite games, and the top 15 finalists have been revealed.<br>The award was created to celebrate games from all platforms including arcade, console, computer, handheld, and mobile.<br>Games had to pass four categories:<br>Icon-status, longevity, geographical reach and influence.<br>This year's winners will be announced at a ceremony on the 4th June.<br>To celebrate the World Video Game Hall of Fame we've picked out some of the finalists for a closer look.<br>So put down your controllers and enjoy our guide to some of the best games of all time!<br>If you've never heard of Minecraft you must have been living in outer space! The games popularity has grown and grown since it was first launched in 2009.<br>Players use pixelated blocks to create detailed buildings and worlds. Minecraft became so popular techno giant Microsoft bought it from its original creators last year for £1.5 billion.<br>As of 2014, more than 54 million copies of the game have been sold.<br>And you've even told us why you think it is so popular.<br>Sonic and Mario are two of the heavyweights in the gaming world, and have a long history of rivalry.<br>Super Mario Bros was created by Nintendo and came out in 1985.<br>The game featured an Italian plumber with a big moustache called Mario and became so popular it launched a number of spin off games that lasted generations.<br>More than 509 million copies of the various Mario games have been sold worldwide.<br>Sonic the Hedgehog was launched in 1991 by Sega, and featured a speedy blue hedgehog, who liked to collect gold rings.<br>At one point Sonic was so popular, children in America were able to identify him over other characters like Mickey Mouse or the President.<br>The rivalry between Mario and Sonic came to an end in 2007, when they both appeared in Mario and Sonic at the Olympic Games.<br>Find out more about the history of Nintendo<br>Another huge hit from the Nintendo universe is Pokémon.<br>The game first came out on the Game Boy in 1996, called Pocket Monsters, and was created by Japanese developers Game Freak.<br>Gamers play as a Pokémon trainer who can catch and battle a large number of different Pokémon.<br>Since Pokémon was released it has become the third most-popular franchise worldwide after Mario and Super Mario.<br>As of 2014, the Pokémon series had sold more than 260 million copies of its games, around 21.5 billion trading cards, and has created more than 800 television episodes and 17 movies.<br>Find out how Pokemon became a global hit<br>The FIFA series has become one of the most popular sports game franchises in the world since it was first released by Electronic Arts in 1993.<br>It allows gamers to play as their favourite football teams from different leagues all over the world.<br>FIFA 12 holds the record for the "fastest-selling sports game ever" selling over 3.2 million copies in the first week of its release.<br>First released in 2009 by Finnish developers Rovio Entertainment, Angry Birds became the first ever mobile game to achieve worldwide fame and popularity.<br>Players use a giant slingshot to catapult various bird characters at a number of different structures, in an attempt to knock everything over.<br>Angry Birds has expanded into a number of different consoles and has even joined up with Sony Pictures to create a film.<br>Games in the Angry Birds series have been downloaded more than two billion times.<br>The granddaddy of video games, Pong was created in 1972 and is widely viewed as being 
the first ever "video game".<br>Pong is essentially a simple tennis style game, where players have to keep a rally going, or score as many points against their opponent as possible.<br>The game was created using simple 2D, black and white graphics, and was made by the company Atari.<br>As Pong became popular it encouraged Atari to design more games, and encouraged many other developers to create new games too.<br>The famous pixel block game was originally designed by a Russian programmer called Alexey Pajitnov in 1984.<br>Tetris was one of the original games for the Game Boy.<br>Players have to fit different coloured and shaped blocks together.<br>Tetris is available on almost every gaming platform, and as of 2010 has sold more than 170 million copies worldwide.</code> | <code>Top gaming experts are voting on the best video games to go into the World Video Game Hall of Fame, in America.</code> |
  | <code>India's western state of Gujarat certainly believes so. Earlier this week, the state's legislators passed a bill which makes it mandatory for candidates to have toilets in their homes to qualify for contesting elections to local municipalities and village councils. Existing elected members will also have to declare within six months that they have toilets at home, failing which they will face disqualification.<br>Prime Minister Narendra Modi, who ruled Gujarat for over a decade before he swept to power in Delhi in May, has made abolishing open defecation a top priority of his government. It is a laudable aim, though critics believe it does not appear to link what is largely an individual-driven campaign to the appalling practice of manual scavenging. Clearly legislators belonging to Mr Modi's ruling BJP in Gujarat have enthusiastically backed their leader's call.<br>Surely, there is nothing wrong in that. Open defecation blights the lives of millions of Indians and is an enduring health hazard. Nearly half of Indians continue to defecate in the open. Gujarat, one of India's most prosperous states, is in a hurry to build more toilets; the state has a spotty record here. Its new Chief Minister Anandi Patel says she wants the state to be "open defecation free" in two years. A recent report said more than 70,000 people defecate in the open in the main city of Ahmedabad alone. Good economics does not always lead to good sanitation.<br>But is the latest move linking a democratic right to building a private utility such a good idea?<br>Some 40% of people in Gujarat live in its 159 municipalities and eight municipal corporation areas in what is one of India's most urbanised states. There are some 13,500 village councils in its more than 18,500 villages. Elections to these bodies are critical to the health of Gujarat's democracy and development. The freedom to contest the polls is also an inalienable right of every citizen living in their cities and villages.<br>That is why critics like economist Hemant Shah feel that the bill is essentially "undemocratic and discriminatory", and should be challenged in the courts.<br>Tens of thousands of people in Gujarat's teeming cities live in sprawling chawls - densely packed buildings with more than a dozen tenements - where many families share a single toilet. Will a chawl resident be barred from contesting because he does not have his private toilet? What happens to the political aspirations of a resident of a grubby shantytown home so small that his living space is sometimes equal to the non-existent toilet?<br>"The government should first provide space and money to build toilets for the poor. The poor are most affected by urban planning because it has always excluded them. Now they can't dream from standing for public office just because they don't have the space or money to build their own toilets?" asks Professor Shah. It's a valid question.</code> | <code>Is banning a person from contesting for public office if he or she does not have a toilet at home a good idea?</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 2,
      "kl_div_weight": 1,
      "kl_temperature": 0.5
  }
  ```
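
For reference, this wrapped-loss setup corresponds roughly to the following construction in sentence-transformers. This is a minimal sketch, not the card's actual training script; the base model name is a placeholder:

```python
from sentence_transformers import SentenceTransformer, losses

# Placeholder base model, for illustration only.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# The inner ranking loss is applied to every transformer layer
# (n_layers_per_step=-1), with the per-layer weights listed above.
inner_loss = losses.MultipleNegativesSymmetricRankingLoss(model)
train_loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=2,
    kl_div_weight=1,
    kl_temperature=0.5,
)
```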

#### compression-pairs

* Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                           | sentence2                                                                         |
  |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                            |
  | details | <ul><li>min: 10 tokens</li><li>mean: 31.04 tokens</li><li>max: 372 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.01 tokens</li><li>max: 25 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                         | sentence2                                                         |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------|
  | <code>The Canadian dollar continued its meteoric rise Wednesday to as high as US98.91¢ before the market open, edging ever closer to parity.</code>                                                                                                               | <code>Canadian dollar edges closer to parity</code>               |
  | <code>NFL Network insider Jason La Canfora reports Anderson has agreed to terms with the Cardinals, according to a league source.</code>                                                                                                                          | <code>Anderson agrees to terms with Cardinals</code>              |
  | <code>Churchill Downs said Monday it will increase its annual dividend 20 percent to 60 cents per share.The Louisville, Ky., racetrack operator and gambling company said it raised the dividend to reflect its strong financial results so far this year.</code> | <code>Churchill Downs increases annual dividend 20 percent</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 2,
      "kl_div_weight": 1,
      "kl_temperature": 0.5
  }
  ```

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 11,328 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 5 tokens</li><li>mean: 16.97 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 85.2 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                            | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                       |
  |:---------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>How do the vast majority of fish reproduce?</code>             | <code>Nearly all fish reproduce sexually, and most species have separate sexes. Those without separate sexes avoid self-fertilization by producing sperm and eggs at different times. Each fish typically produces a large number of gametes. In most fish species, fertilization takes place externally. These fish are oviparous . Eggs are laid and embryos develop outside the mother’s body. In a minority of fish, including sharks, eggs develop inside the mother’s body but without nourishment from the mother. These fish are ovoviviparous .</code>                                                                                                 |
  | <code>What is the loss of energy available to do work called?</code> | <code>15.6 Entropy and the Second Law of Thermodynamics: Disorder and the Unavailability of Energy • Entropy is the loss of energy available to do work. • Another form of the second law of thermodynamics states that the total entropy of a system either increases or remains constant; it never decreases. • Entropy is zero in a reversible process; it increases in an irreversible process. • The ultimate fate of the universe is likely to be thermodynamic equilibrium, where the universal temperature is constant and no energy is available to do work. • Entropy is also associated with the tendency toward disorder in a closed system.</code> |
  | <code>How many vertebrae make up the human vertebral column?</code>  | <code>Human Vertebral Column and Vertebrae. The human vertebral column consists of 33 vertebrae. Two vertebrae are shown here enlarged.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                  |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```
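
Unlike the ranking loss above, `GISTEmbedLoss` also needs a guide model whose similarity scores are used to filter likely false negatives from each batch. The guide used for this run is not named in this section, so the one below is a stand-in:

```python
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder trainee
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # stand-in guide model

# GISTEmbedLoss scores in-batch pairs with the guide to mask false negatives;
# AdaptiveLayerLoss again spreads the objective across all layers.
train_loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.GISTEmbedLoss(model, guide),
    n_layers_per_step=-1,
    last_layer_weight=1.5,
    prior_layers_weight=0.75,
    kl_div_weight=1.25,
    kl_temperature=1.1,
)
```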

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 7,889 training samples
* Columns: <code>id</code>, <code>sentence1</code>, and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | id                                                                                 | sentence1                                                                         | sentence2                                                                          |
  |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                            | string                                                                             |
  | details | <ul><li>min: 17 tokens</li><li>mean: 21.27 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.28 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 34.86 tokens</li><li>max: 68 tokens</li></ul> |
* Samples:
  | id                                          | sentence1                                                                  | sentence2                                                                                                                                                                                                                    |
  |:--------------------------------------------|:---------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>3KMS4QQVK2P724SORHWYGW4AU78KFR</code> | <code>What is in fruit? </code>                                            | <code>sugar causes food to taste sweet. Fruit is delicious and very sweet.. sugar is in fruit</code>                                                                                                                         |
  | <code>3ATTHHXXWANXWVTLR8H89NP4TSOXIK</code> | <code>What can lead to cancer in genes that control the cell cycle?</code> | <code>Mutations that lead to cancer usually occur in genes that control the cell cycle.. Many carcinogens are capable of causing gene mutations.. Carcinogens can lead to cancer in genes that control the cell cycle</code> |
  | <code>3EQHHY4HQSRAYL3GVEYAWSL4O5BG57</code> | <code>what do plants lose to the atmosphere?</code>                        | <code>transpiration is when water vapor moves from plants into the atmosphere. Plants lose water continually by transpiration.. plants lose water to the atmosphere</code>                                                   |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### openbookqa_pairs

* Dataset: [openbookqa_pairs](https://huggingface.co/datasets/allenai/openbookqa) at [388097e](https://huggingface.co/datasets/allenai/openbookqa/tree/388097ea7776314e93a529163e0fea805b8a6454)
* Size: 2,637 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                            |
  | details | <ul><li>min: 3 tokens</li><li>mean: 13.8 tokens</li><li>max: 65 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.24 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                | sentence2                                                                            |
  |:---------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | <code>a toaster converts electricity into radiant waves for</code>                                       | <code>a toaster converts electrical energy into heat energy for toasting</code>      |
  | <code>A person loves spring, and it has just passed by. They will enjoy it again the next time</code>    | <code>each season occurs once per year</code>                                        |
  | <code>a student leaves a nail line on a mineral sample, so that mineral can be described as what?</code> | <code>if a mineral can be scratched by a fingernail then that mineral is soft</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.58 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 76.09 tokens</li><li>max: 205 tokens</li></ul> |
* Samples:
  | sentence1                             | sentence2                                                                                                                                                                                                                                                                                                                                             |
  |:--------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>when was remy ma born</code>    | <code>Overview (4). Remy Ma was born on May 30, 1980 in South Bronx, The Bronx, New York City, New York, USA as Remy K. Smith. She has been married to Papoose since May 12, 2009.</code>                                                                                                                                                             |
  | <code>what causes bladder pain</code> | <code>Bladder pain may be caused by a number of conditions, including a urinary tract infection. A urine sample can be used to detect a bladder infection. Several forms of cancer may result in pain in the bladder. Pain is a common symptom of bladder tumor growths. The human urinary tract, including the bladder in pink at the bottom.</code> |
  | <code>what is .shp</code>             | <code>SHP for Agencies. SHP for Home Health Agencies (or simply SHP for Agencies) is a web-based analytics and benchmarking solution that gives home health organizations the power to effectively manage performance, stay compliant, and follow best practices. The SHP for Agencies solution helps your organization:</code>                       |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 9 tokens</li><li>mean: 11.83 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 134.32 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                          | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  |:-------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what is the panthers name in disney's the jungle book</code> | <code>Bagheera In Disney's 1967 animated adaptation, Bagheera the panther is, as in the book, male, and voiced by Sebastian Cabot. The panther is portrayed as an intelligent, mature, and logical character, quite similar to the Bagheera in the books. In the film, it is Bagheera and not the wolves who first finds Mowgli, a young village child. It is Bagheera who brings Mowgli to the care of the wolves and ensures that the baby survives. He is also the one who takes him back to the village, for his own safety, as he knew for years that Mowgli would eventually need to leave his adoptive animal family to return to his place in the human world. During the film, Bagheera often lectures Baloo, for he knows that as long as Shere Khan is in the jungle, the jungle is not safe for Mowgli despite all of Baloo's attempts to protect him. Bagheera is also the narrator of the film's story.</code> |
  | <code>one unit is equal to how many ml</code>                      | <code>Unit of alcohol One unit of alcohol (UK) is defined as 10 millilitres (8 grams) of pure alcohol.[2][3] Typical drinks (i.e., typical quantities or servings of common alcoholic beverages) may contain 1–3 units of alcohol.[3]</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                               |
  | <code>who sings can't get you outta my head</code>                 | <code>Can't Get You Out of My Head "Can't Get You Out of My Head" is a song recorded by Australian singer Kylie Minogue for her eighth studio album, titled Fever, which she released in 2001. The song was released in Australia by Parlophone as the lead single from the album on 8 September 2001. It was released on 17 September 2001 in the United Kingdom. In the United States, the single was released on 18 February 2002. Jointly written, composed, and produced by Cathy Dennis and Rob Davis, "Can't Get You Out of My Head" is a midtempo dance-pop song which lyrically details its narrator's obsession towards her lover. The song is famous for its "la la la" hook.</code>                                                                                                                                                                                                                              |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 8 tokens</li><li>mean: 17.05 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 446.31 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                   | sentence2 |
  |:-----------------------------------------------------------------------------|:-----------|
  | <code>Main is French for which part of the body?</code>                     | <code>Les parties du corps - Des os, il en faut - alain le lait (French body parts) - YouTube Les parties du corps - Des os, il en faut - alain le lait (French body parts) Want to watch this again later? Sign in to add this video to a playlist. Need to report the video? Sign in to report inappropriate content. Rating is available when the video has been rented. This feature is not available right now. Please try again later. Uploaded on Oct 30, 2011 Des os, il en faut - alain le lait du CD 'Parapluie' ©2006 Une chanson sur les parties du corps Words and english translation Tu as deux mains et deux pieds Tu as deux jambes et un nez Tu as un ventre et un dos Et des muscles sous la peau Tu as une tête et un cou Deux oreilles et deux genoux Tu as deux yeux et deux joues Et une bouche qui mange tout, et Sous ta peau il y a des os Des petits et des gros Des os, des os, il en faut C'est parce que tu as des os que ... Bones, you must have them You have two hands and two feet You have two legs and a nose You have a belly (stomach) and a back And muscles underneath your skin You have a head and a neck Two ears and two knees You have two eyes and two cheeks And a mouth that eats everything and Under your skin you have bones Small bones and big ones Bones, bones, you must have them It's because you have bones that ... (repeat from top of the song) Category</code> |
  | <code>Ainsley Harriot was once head chef at which UK cricket ground?</code> | <code>Ainsley Harriott - Awards Hosts & Presenter - Speakers Corner Ainsley Harriott Biography Celebrity chef and Ready Steady Cook presenter Ainsley Harriott is quick-witted, charismatic and a consummate professional. Although singing and performing was his first love (he co-founded the Calypso Twins in the early 90’s) Ainsley’s cooking career began when he was offered an apprenticeship at an East End restaurant at the age of 16. After years of hard work in the kitchen, Ainsley rose to Head Chef position at Lord's Cricket Ground's Long Room. He worked as a chef in many restaurants in London including the Dorchester, Brown's, The Hilton, The Westbury, Café Pelican and Quaglino's. Simultaneously his foray into TV and radio began. While at Lords he was asked to present More Nosh, Less Dosh on BBC Radio 5, and he then secured a small role in sci-fi comedy Red Dwarf in 1993, and eventually became resident chef on Good Morning with Anne and Nick. Once Ainsley... Celebrity chef and Ready Steady Cook presenter Ainsley Harriott is quick-witted, charismatic and a consummate professional. Although singing and performing was his first love (he co-founded the Calypso Twins in the early 90’s) Ainsley’s cooking career began when he was offered an apprenticeship at an East End restaurant at the age of 16. After years of hard work in the kitchen, Ainsley rose to Head Chef position at Lord's Cricket Ground's Long Room. He worked as a chef in many restaurants in London including the Dorchester, Brown's, The Hilton, The Westbury, Café Pelican and Quaglino's. Simultaneously his foray into TV and radio began. While at Lords he was asked to present More Nosh, Less Dosh on BBC Radio 5, and he then secured a small role in sci-fi comedy Red Dwarf in 1993, and eventually became resident chef on Good Morning with Anne and Nick. Once Ainsley became the main presenter of Can't Cook, Won't Cook and later Ready, Steady, Cook, he was a household name. Ainsley's Barbecue Bible; Ainsley's Meals in Minutes; Ainsley's Big Cook Out and Ainsley's Gourmet Express all followed, and he became one of the most famous TV cooking faces in the country. In 2000, Ainsley made his US TV debut with The Ainsley Harriott Show and then Ready.. Set... Cook!, the US version of Ready Steady Cook. For further information or to book Ainsley Harriott, call us at Speakers Corner on +44 (0)20 7607 7070 or email info@speakerscorner.co.uk</code> |
  | <code>Who wrote the 1973 novel ‘The Dressmaker’?</code>                     | <code>The Dressmaker: Beryl Bainbridge: 9780786703227: Amazon.com: Books Beryl Bainbridge Next Special Offers and Product Promotions From AudioFile English actor and audiobook reader Jacqueline King performs this thickly British story with the skill necessary to enliven five distinct characters and stitch them all together through the lucid prose of the novel's guiding narrator. In that the story is beautifully constructed to begin with, the listener is in for a fine artistic experience. The setting is Liverpool, 1944. The war pressures naïve teenaged Rita to dream beyond the fortified shores of her own country. The town is full of Yanks who come from the land of Hollywood. Rita claims one for herself, but her two aunts, who have raised her, see more and less in him than Rita suspects. The ending is inspired and in itself gives reason why this book was runner-up for the Booker Prize. The recording quality is hissy (muffled with Dolby), but it strangely adds to the atmosphere if one knows how radios used to sound during those dark, uncertain days. P.W. Winner of AudioFile Earphones Award © AudioFile 2002, Portland, Maine-- Copyright © AudioFile, Portland, Maine --This text refers to the Hardcover edition. Don't have a Kindle? Get your Kindle here , or download a FREE Kindle Reading App . New York Times best sellers Browse the New York Times best sellers in popular categories like Fiction, Nonfiction, Picture Books and more. See more Product Details Publisher: Carroll & Graf Publishers (July 1996) Language: English Product Dimensions: 8.2 x 5.4 x 0.4 inches Shipping Weight: 6.4 ounces By Cariola VINE VOICE on December 12, 2016 Format: Kindle Edition\|Verified Purchase After reading several novels by Bainbridge, I've come to the conclusion that I will never be a huge fan. I didn't outright dislike them; they just didn't do much for me. Aside from a few moments, this one was pretty milquetoast until the very end, when something unexpected occurs--and then it just stops with no satisfying conclusion. I guess the point she is making is simply to show how insular this particular family is. Nellie, the dressmaker of the title though perhaps not the main character, is in a love/hate relationship with his sister, Margo. The two share their home with "our Rita," the seventeen-year old daughter of their brother, Jack. When his wife died, Jack sold his house, gave Rita into the care of his sisters, and moved into a flat above a butcher shop. He is still extremely involved in all of their lives, but Rita, who knows he is her father, calls him "Uncle Jack," probably just to go along with Auntie Nellie" and "Auntie Margo." Nellie is the reserved, responsible one; Margo is the daring and sometimes wild one. She had been married to a soldier who came back from the trenches suffering from the effects of gas attacks. It was Nellie, however, who nursed him until his death. Set in the aftermath of World War II, the story revolves around young Rita falling in love with an American GI named Ira. As teenage girls still do, Rita initially hides her beau from her family, using the ploy that she is visiting her friend Cissie--whom the aunts have never met. If you like reading about the sappiness of teenagers in love, this part of the story should appeal to you, because Rita is one of the sappiest. There are the usual dreams of marrying Ira and flying off to live in the US. And a lot of worrying about whether or not Ira will call, show up for a scheduled rendezvous, write her a letter, doesn't talk enough, wants too much, wants too little. It becomes clear early on that this is an ill-suited pair and a one-sided romance. The remainder of the novel, as one would expect, focuses on what happens when a neighbor tells Nellie that Rita has been stepping out with an American soldier and when Ira decides that she is way too young (i.e., immature) for him. And as I said above, there is an unexpected and rather unresolved conclusion. Good writing, fleshed out if stereotypical characters, but nothing to get excited about. By TonyMess on March 1, 20</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### quora_pairs

* Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.56 tokens</li><li>max: 43 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 13.58 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                  | sentence2                                                 |
  |:-----------------------------------------------------------|:----------------------------------------------------------|
  | <code>Do babies dream?</code>                              | <code>Do babies dream while they are sleeping?</code>     |
  | <code>What is the point of being married?</code>           | <code>What are some arguments for getting married?</code> |
  | <code>How do you boost your mobile signal strength?</code> | <code>How can you increase mobile signal strength?</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 14,550 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.38 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 58.0 tokens</li><li>max: 143 tokens</li></ul> |
* Samples:
  | sentence1                                                     | sentence2                                                                                                                                                                                                                                                                                                         |
  |:--------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>when can baby sit in jumperoo?</code>                   | <code>The best age for babies to use jumperoos depends on your own baby, how well they're able to hold their head up, how much upper body support they need, and the product you're using. However, we'd say don't put any baby in a jumperoo before they're 4 months old – just to be on the safe side.</code>   |
  | <code>what are the three main categories of mutations?</code> | <code>There are three types of DNA Mutations: base substitutions, deletions and insertions. Single base substitutions are called point mutations, recall the point mutation Glu -----> Val which causes sickle-cell disease. Point mutations are the most common type of mutation and there are two types.</code> |
  | <code>is vpn not working in uae?</code>                       | <code>The UAE actually has laws related to the use of VPNs. Indeed, UAE law says that a VPN is only illegal if it's used to commit a crime. The Telecom Regulatory Authority (TRA) is responsible for internet censorship in the UAE.</code>                                                                      |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```
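
Because each dataset above is paired with its own loss, training of this kind is typically wired up through the multi-dataset support in `SentenceTransformerTrainer`, which accepts a dictionary of datasets and a matching dictionary of losses keyed by dataset name. A minimal sketch, assuming two of the datasets listed above and reusing the loss construction shown earlier (`build_loss` is a hypothetical helper, and any column renaming to `sentence1`/`sentence2` is elided):

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    losses,
)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder

def build_loss(model: SentenceTransformer) -> losses.AdaptiveLayerLoss:
    # Hypothetical helper: wraps GISTEmbedLoss exactly as in the blocks above.
    guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # stand-in guide
    return losses.AdaptiveLayerLoss(
        model=model,
        loss=losses.GISTEmbedLoss(model, guide),
        n_layers_per_step=-1,
        last_layer_weight=1.5,
        prior_layers_weight=0.75,
        kl_div_weight=1.25,
        kl_temperature=1.1,
    )

# The keys of the two dictionaries must match: the trainer samples batches
# across datasets and applies the loss registered under the same key.
train_datasets = {
    "gooaq_pairs": load_dataset("sentence-transformers/gooaq", split="train[:14550]"),
    "quora_pairs": load_dataset("sentence-transformers/quora-duplicates", "pair", split="train[:14550]"),
}
train_losses = {name: build_loss(model) for name in train_datasets}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_datasets,
    loss=train_losses,
)
trainer.train()
```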

### Evaluation Datasets

#### nli-pairs

* Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 150 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                            | positive                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                           |
  | details | <ul><li>min: 5 tokens</li><li>mean: 17.17 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.51 tokens</li><li>max: 21 tokens</li></ul> |
* Samples:
  | anchor                                                                                                                                                                         | positive                                                    |
  |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|
  | <code>Two women are embracing while holding to go packages.</code>                                                                                                             | <code>Two woman are holding packages.</code>                |
  | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> |
  | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code>                                                                    | <code>A man selling donuts to a customer.</code>            |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 150 evaluation samples
* Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | label                        | sentence1                                                                        | sentence2                                                                         |
  |:--------|:-----------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | int                          | string                                                                           | string                                                                            |
  | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 16.8 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 36.4 tokens</li><li>max: 145 tokens</li></ul> |
* Samples:
  | label          | sentence1                                                                                                | sentence2                                                                                                                                                                                                         |
  |:---------------|:---------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>1</code> | <code>Oughterard is one of the Catholic parishes which form Connemara .</code>                           | <code>Connemara is composed of the Catholic parishes of Carna , Clifden ( Omey and Ballindoon ) , Ballynakill , Roundstone , Oughterard and Inishbofin .</code>                                                   |
  | <code>1</code> | <code>Miroslav Klose retired in August 2014 .</code>                                                     | <code>He was the highest male international scorer among active players following Miroslav Klose 's retirement in August 2014 .</code>                                                                            |
  | <code>1</code> | <code>The film Office Space made under $ 12.9 million against a budget of more than $ 9 million .</code> | <code>Although not a big success at the box office , making $ 12.8 million against a $ 10 million budget , the film was well received by critics and sold well on home video , and has become a cult film.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### qnli-contrastive

* Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          | label                        |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------|
  | type    | string                                                                            | string                                                                             | int                          |
  | details | <ul><li>min: 7 tokens</li><li>mean: 14.44 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 37.64 tokens</li><li>max: 115 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
* Samples:
  | sentence1                                                                 | sentence2                                                                                                                                        | label          |
  |:--------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
  | <code>What came into force after the new constitution was herald?</code>  | <code>As of that day, the new constitution heralding the Second Republic came into force.</code>                                                 | <code>0</code> |
  | <code>What is the first major city in the stream of the Rhine?</code>     | <code>The most important tributaries in this area are the Ill below of Strasbourg, the Neckar in Mannheim and the Main across from Mainz.</code> | <code>0</code> |
  | <code>What is the minimum required if you want to teach in Canada?</code> | <code>In most provinces a second Bachelor's Degree such as a Bachelor of Education is required to become a qualified teacher.</code>             | <code>0</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "OnlineContrastiveLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.25,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 5,
      "kl_temperature": 0.25
  }
  ```
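
  For these labeled pairs the inner loss switches to `OnlineContrastiveLoss`, and the layer weighting is inverted relative to the `GISTEmbedLoss` datasets: earlier layers dominate and the KL term is far stronger. A sketch of this variant, again with a placeholder base model:

  ```python
  # Sketch with a placeholder base model, mirroring the JSON above.
  from sentence_transformers import SentenceTransformer
  from sentence_transformers.losses import AdaptiveLayerLoss, OnlineContrastiveLoss

  model = SentenceTransformer("base-model-placeholder")

  # OnlineContrastiveLoss consumes (sentence1, sentence2, label) rows and
  # computes the loss only for hard positive and hard negative pairs.
  inner_loss = OnlineContrastiveLoss(model=model)

  loss = AdaptiveLayerLoss(
      model=model,
      loss=inner_loss,
      n_layers_per_step=-1,
      last_layer_weight=0.25,
      prior_layers_weight=2.5,
      kl_div_weight=5,
      kl_temperature=0.25,
  )
  ```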

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 150 evaluation samples
* Columns: <code>sentence2</code> and <code>sentence1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence2                                                                         | sentence1                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.18 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.43 tokens</li><li>max: 32 tokens</li></ul> |
* Samples:
  | sentence2                                                         | sentence1                                                                              |
  |:------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
  | <code>Antarctica is the only continent without amphibians.</code> | <code>What is the only continent without amphibians?</code>                            |
  | <code>Air can be separated into several elements.</code>          | <code>Which of the following substances can be separated into several elements?</code> |
  | <code>Ice is the common term for water in its solid state.</code> | <code>What is the common term for water in its solid state?</code>                     |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                         | label                                           |
  |:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
  | type    | string                                                                           | string                                                                            | int                                             |
  | details | <ul><li>min: 7 tokens</li><li>mean: 23.1 tokens</li><li>max: 61 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.48 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>0: ~53.33%</li><li>1: ~46.67%</li></ul> |
* Samples:
  | sentence1                                                                                                                         | sentence2                                                                                          | label          |
  |:----------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------|:---------------|
  | <code>An introduction to atoms and elements, compounds, atomic structure and bonding, the molecule and chemical reactions.</code> | <code>Replace another in a molecule happens to atoms during a substitution reaction.</code>        | <code>0</code> |
  | <code>Wavelength The distance between two consecutive points on a sinusoidal wave that are in phase;</code>                       | <code>Wavelength is the distance between two corresponding points of adjacent waves called.</code> | <code>1</code> |
  | <code>humans normally have 23 pairs of chromosomes.</code>                                                                        | <code>Humans typically have 23 pairs pairs of chromosomes.</code>                                  | <code>1</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### xsum-pairs

* Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                            | sentence2                                                                         |
  |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                            |
  | details | <ul><li>min: 62 tokens</li><li>mean: 324.05 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 27.27 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     | sentence2                                                                                                                                                           |
  |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Media playback is not supported on this device<br>The Lions started the tour of New Zealand with a scratchy victory over the Provincial Barbarians before a loss to the Blues, but recovered to record a significant 12-3 win in Christchurch.<br>"It's been tough this week, there's been a lot of criticism," Gatland said.<br>"People have written the tour off already after two games.<br>"That's been challenging for all of us. We've had to stay strong in the group and keep the faith.<br>"I hope we didn't disappoint any people tonight with the result."<br>Fly-half Owen Farrell kicked four penalties for the Lions, while a heroic defensive effort managed to keep the Crusaders - who have averaged 37 points across 14 straight victories in Super Rugby - to merely a penalty.<br>The Lions now face the Highlanders, New Zealand Maori and the Chiefs before the first Test against New Zealand on June 24.<br>"This is great preparation for us preparing to play the best team in the world, which is the All Blacks," Gatland added.<br>"It's a like a club side coming together in pre-season, getting a couple of games under its belt and you know the more time together the better you'll get.<br>"This team was outstanding in training on Friday, looked sharp and I knew there would be a performance because they have had time to gel.<br>"The result was pretty important for us. Tonight was another step up, but there is still a lot to work on."<br>One of those areas is their finishing, after the Lions spurned a handful of opportunities to score tries against the Super Rugby side.<br>"We are creating [chances], and we need to get better at [finishing]. The more time we have together, hopefully we will finish those chances."<br>Gatland also confirmed tour captain Sam Warburton would be involved against the Highlanders in Dunedin next week, having recovered from a minor ankle injury.<br>Full-back Stuart Hogg and centre Jonathan Davies will both undergo concussion return-to-play protocols after failing Head Injury Assessments during the game.<br>"We've laid a marker down a little bit tonight, now it's a big challenge for the team that takes the field on Tuesday," Gatland said.<br>Meanwhile, second-row George Kruis was part of the outstanding forward effort and feels the Lions pack has made a statement with the Test series a fortnight away.<br>"We had a good contest today, and probably got the upper hand," the Saracens and England lock said.<br>"There were six internationals in their pack, and we knew it was going to be a tasty game. It got a bit heated at times, but we held our own and did a good job.<br>"We relish the opportunity to go toe-to-toe with a pack like that. We talk about how we want to be a brutal pack and a set-piece dominant pack, and today we showed good signs of that.<br>"It's every boy's dream to play for the Lions, and to get a win like that today, hopefully we can really start to build this culture and build towards the Tests."</code> | <code>British and Irish Lions head coach Warren Gatland says his players had to "keep the faith" as they prepared for Saturday's key win over the Crusaders.</code> |
  | <code>Mr Umunna, who pulled out the race himself earlier this month, said Ms Kendall was best placed to drag the party out of its "comfort zone".<br>He told the New Statesman Ms Kendall had "challenged conventional wisdom" and asked tough questions about Labour's future after its defeat.<br>Andy Burnham, Yvette Cooper and Mary Creagh are also standing.<br>Candidates must get the support of 35 MPs by 15 June, when nominations close, in order to get on the ballot paper. The winner will be announced on 12 September.<br>Ms Kendall was the first candidate to publicly declare her interest in the job after Ed Miliband's resignation.<br>The shadow care services minister, who was elected to Parliament in 2010, had already won the support of shadow education secretary Tristram Hunt and shadow Europe minister Pat McFadden.<br>Now Mr Umunna has said he is throwing his weight behind her and that three other MPs who were part of his short-lived leadership team - Emma Reynolds, Jonathan Reynolds and Stephen Twigg - were also doing the same.<br>"In this time of change our party must move beyond its comfort zone and find new ways of realising its age-old goals of equality and freedom," he wrote in the New Statesman.<br>Labour's next leader, he suggested, must embrace a "vision of a Britain in which all can get on, whose citizens are financially secure and in control of their lives and happiness - and are, collectively, secure and effective in the wider world".<br>"For us, our next leader must get this vision right," he wrote.<br>"On all these big subjects, Liz Kendall has asked the tough questions and started to chart a course to the answers. She has been courageous in challenging conventional wisdom. She has no compunction in moving Labour beyond our comfort zone and is determined to build a team ready to chart a route forward."<br>Ms Kendall has promised a new approach to business, education and defence, claiming Labour lost the election because its policies were wrong and mistakenly believed the county had moved to the left.<br>Mr Burnham has won the backing of frontbenchers Rachel Reeves, Dan Jarvis and Michael Dugher, as well as former deputy prime minister Lord Prescott. Yvette Cooper has been endorsed by Vernon Coaker and John Healey among others.<br>Ms Creagh told the BBC that she was confident that she would get sufficient nominations to get on the ballot paper. "A lot of people have already made a decision but a lot of people are rightly consulting with their parties," she told Radio 4's Woman's Hour.<br>While Labour could win the next election, Ms Creagh warned that the party would "cease to exist" if it took its voters for granted and did not address the separate challenges facing it in Scotland, the north of England and southern England.<br>Mr Umunna pulled out of the race only days after entering, saying he was uncomfortable with media scrutiny of his family.</code>                                                           | <code>Labour leadership candidate Liz Kendall has won the backing of shadow business secretary Chuka Umunna.</code>                                                 |
  | <code>The woman sustained leg and head injuries in the incident on the A12 just south of Chelmsford, at 01:30 GMT.<br>A 41-year-old man from Sevenoaks, Kent, has been arrested on suspicion of drinking and driving and causing grievous bodily harm.<br>The southbound carriageway was closed between junctions 16 and 15 until 07:00 GMT.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           | <code>A woman is in a critical condition after she was hit by a car on a dual carriageway in Essex.</code>                                                          |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 2,
      "kl_div_weight": 1,
      "kl_temperature": 0.5
  }
  ```
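
  The unlabeled document–summary pairs instead wrap `MultipleNegativesSymmetricRankingLoss`, which scores in-batch negatives in both the sentence1→sentence2 and sentence2→sentence1 directions. A sketch under the same placeholder-model assumption:

  ```python
  # Sketch with a placeholder base model, mirroring the JSON above.
  from sentence_transformers import SentenceTransformer
  from sentence_transformers.losses import (
      AdaptiveLayerLoss,
      MultipleNegativesSymmetricRankingLoss,
  )

  model = SentenceTransformer("base-model-placeholder")

  # The symmetric variant adds the reverse ranking direction on top of
  # the usual in-batch-negatives ranking loss.
  inner_loss = MultipleNegativesSymmetricRankingLoss(model=model)

  loss = AdaptiveLayerLoss(
      model=model,
      loss=inner_loss,
      n_layers_per_step=-1,
      last_layer_weight=0.75,
      prior_layers_weight=2,
      kl_div_weight=1,
      kl_temperature=0.5,
  )
  ```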

#### compression-pairs

* Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                           | sentence2                                                                         |
  |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                            |
  | details | <ul><li>min: 11 tokens</li><li>mean: 32.65 tokens</li><li>max: 157 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.31 tokens</li><li>max: 27 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                                                                  | sentence2                                                       |
  |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------|
  | <code>Oil jumped $4 to record highs over $120 a barrel on Monday on the weaker US dollar and supply concerns from OPEC members Nigeria and Iran.</code>                                                                                    | <code>Oil jumps $4 to record over $120 on weak dollar</code>    |
  | <code>MIAMI - Hurricane Celia has weakened a bit in the Pacific but is still a Category 2 storm.</code>                                                                                                                                    | <code>Hurricane Celia weakens but still Category 2 storm</code> |
  | <code>The Wisconsin recall election is officially underway, with voters heading to the polls to decide whether or not to recall highly publicized Republican governor Scott Walker in favor Democratic Milwaukee Mayor Tom Barrett.</code> | <code>Wisconsin recall election underway</code>                 |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 2,
      "kl_div_weight": 1,
      "kl_temperature": 0.5
  }
  ```

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 7 tokens</li><li>mean: 17.42 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 84.46 tokens</li><li>max: 464 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                         | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           |
  |:------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Salt in seawater causes it to have greater what, which is also affected by temperature and pressure?</code> | <code>Seawater has lots of salts in it. This increases its density (mass per volume) over fresh water. Temperature and pressure also affect density.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         |
  | <code>Halogens tend to form salts with what type of element?</code>                                               | <code>Halogens have filled valence electron configurations. Halogens tend to form salts with metals. As the free elements, halogens are monatomic. Halogens have appreciable nonmetallic character. Halogens tend to have an oxidation state of −1. Halogens are good reductants.</code>                                                                                                                                                                                                                                                                                                                                                                                            |
  | <code>What is type of substance is formed when water vapor condenses or when ice melts?</code>                    | <code>Liquid water is formed when water vapor condenses (i. e. , H 2 O(g) → H 2 O(l) or when ice melts (i. e. , H 2 O(s) → H 2 O(l)). Because water is a molecular substance, it is a poor conductor of electricity in its pure form. However, as we will see later, its conductivity can be improved by the addition of certain substances. Water molecules are polar, and this overall polarity gives rise to many of the properties of water. For example, an interesting effect is seen when water is placed in a static electric field, as shown in the Figure below and the video below. This phenomenon can be explained in terms of the polarity of water molecules.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 150 evaluation samples
* Columns: <code>id</code>, <code>sentence1</code>, and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | id                                                                                 | sentence1                                                                         | sentence2                                                                          |
  |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                            | string                                                                             |
  | details | <ul><li>min: 17 tokens</li><li>mean: 21.05 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 11.85 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 35.73 tokens</li><li>max: 56 tokens</li></ul> |
* Samples:
  | id                                          | sentence1                                                                                    | sentence2                                                                                                                                                                                                  |
  |:--------------------------------------------|:---------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>3E7TUJ2EGCLQNOV1WEAJ2NN97VY9DJ</code> | <code>Something that comes from polluted what has a negative impact on water quality?</code> | <code>acid rain has a negative impact on water quality. Acid rain comes from polluted clouds.. Something that comes from polluted clouds has a negative impact on water quality.</code>                    |
  | <code>345LHZDEDXRQPOH710ZYLAOBITP3UH</code> | <code>Plastic and other mulches offer a barrier to what?</code>                              | <code>Spores may be dispersed by moving water, wind, or other organisms.. Plastic and other mulches offer a barrier to spore dispersal.. Plastic and other mulches offer a barrier to spores moving</code> |
  | <code>31JLPPHS2UTVCJXA5ENPM4WMXAI3OO</code> | <code>What happens if Mars becomes too hot?</code>                                           | <code>if a planet becomes too hot then that planet cannot sustain life. Another name for Mars is the Red Planet.. If Mars becomes too hot then Mars cannot sustain life</code>                             |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### openbookqa_pairs

* Dataset: [openbookqa_pairs](https://huggingface.co/datasets/allenai/openbookqa) at [388097e](https://huggingface.co/datasets/allenai/openbookqa/tree/388097ea7776314e93a529163e0fea805b8a6454)
* Size: 103 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 4 tokens</li><li>mean: 12.84 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 11.15 tokens</li><li>max: 24 tokens</li></ul> |
* Samples:
  | sentence1                                                                                | sentence2                                                    |
  |:-----------------------------------------------------------------------------------------|:-------------------------------------------------------------|
  | <code>Humans sometimes eat what?</code>                                                  | <code>humans sometimes eat seeds</code>                      |
  | <code>if something moves faster than before, it might have been affected by what?</code> | <code>force causes the speed of an object to increase</code> |
  | <code>A person wants to turn on an MP3 player, so they complete a circuit by</code>      | <code>pushing a button sometimes completes a circuit</code>  |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.55 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 76.77 tokens</li><li>max: 180 tokens</li></ul> |
* Samples:
  | sentence1                                          | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                    |
  |:---------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>is there a victoria secret in london</code>  | <code>As Victoria's Secret opens the doors of its London Bond Street store, a fashion blogger goes undie-cover to give us their verdict. By Emily Johnston. Published: 12:43 EST, 29 August 2012 | Updated: 12:43 EST, 29 August 2012.</code>                                                                                                                                                                                                                                                                                                                |
  | <code>when was cafe terrace at night create</code> | <code>Café Terrace at Night. Café Terrace at Night, also known as The Cafe Terrace on the Place du Forum, is an oil painting executed by the Dutch artist Vincent van Gogh while at Arles, France, in mid-September 1888. The painting is not signed, but described and mentioned by the artist in three letters.</code>                                                                                                                                                                                                                                   |
  | <code>what does chohan</code>                      | <code>Chauhan, Chouhan or Chohan is a community sometimes described as a tribe and sometimes as a caste. In the medieval period some those associated with it ruled parts of Northern India and one, Prithviraj Chauhan, was the king of Delhi.ajput bardic accounts, which are based on mythology, describe the Chauhans as one of the four Agnikula Rajput clans who claim to have originated from a sacrificial fire-pit (agnikunda) at Mount Abu. These claims of supernatural origin are clearly improbable and unacceptable to the modern mind.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                            |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 10 tokens</li><li>mean: 11.69 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 128.35 tokens</li><li>max: 332 tokens</li></ul> |
* Samples:
  | sentence1                                                    | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         |
  |:-------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>how many super bowl rings does the raiders have</code> | <code>Oakland Raiders The Raiders are known for their extensive fan base and distinctive team culture. Since 1963, the team has won 15 division titles (three AFL and 12 NFL), four AFC Championships (1976, 1980, 1983, and 2002), one AFL Championship (1967), and three Super Bowl Championships (XI, XV, and XVIII). The Raiders have 14 former members who have been enshrined in the Pro Football Hall of Fame.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                      |
  | <code>when did libya give up its nuclear weapons</code>      | <code>Disarmament of Libya In 1968, Libya became signatory of Nuclear Non-Proliferation Treaty (NPT), ratified the treaty in 1975, and concluded a safeguards agreement in 1980. Despite its commitment to NPT, there are reports indicating that Muammar Gaddafi of Libya either made unsuccessful attempts to build or entered in an agreement to purchase a nuclear weapon from nuclear-armed nations. In the 1970s–80s, Gaddafi made numerous attempts to accelerate and push forward his ambitions for an active nuclear weapons program, using the nuclear black market sources. However, after the end of the Cold War in 1991, Gaddafi sought to resolve its nuclear crises with the United States aiming to uplift the sanctions against Libya, finally agreeing to authorize rolling back Libya's weapons of mass destruction program on December 2003.</code>                                                        |
  | <code>how did thomas fire get it's name</code>               | <code>Thomas Fire On December 4, 2017, the Thomas Fire was reported at 6:26 p.m. PST,[36] to the north of Santa Paula, near Steckel Park and Thomas Aquinas College,[3][24] after which the fire is named.[37] That night, the small brush fire exploded in size and raced through the rugged mountain terrain that lies west of Santa Paula, between Ventura and Ojai.[19][38] Officials blamed strong Santa Ana winds that gusted up to 60 miles per hour (97 km/h) for the sudden expansion.[28][39] Soon after the fire had started, a second blaze was ignited nearly 30 minutes later, about 4 miles (6.4 km) to the north in Upper Ojai at the top of Koenigstein Road.[40] According to eyewitnesses, this second fire was sparked by an explosion in the power line over the area. The second fire was rapidly expanded by the strong Santa Ana winds, and soon merged into the Thomas Fire later that night.[40]</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 9 tokens</li><li>mean: 15.91 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 58 tokens</li><li>mean: 447.24 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                            | sentence2 |
  |:-----------------------------------------------------------------------------------------------------|:----------|
  | <code>In the 2009 animated film ‘Up’ who is the voice of explorer Charles F Muntz?</code>            | <code>Charles F. Muntz \| Disney Wiki \| Fandom powered by Wikia [Source] Charles F. Muntz is a famous explorer admired by Carl Fredricksen and his wife Ellie as children, and the main antagonist of Disney/ Pixar 's 2009 film Up . In the movie, he found the bones of a tropical bird in South America but the scientific community claimed they were fake. Insulted, Muntz searches the South American wilderness for a live member of the same bird species, traveling in a zeppelin with his many pet dogs, whom he equips with special collars he invented that enable them to speak. They are lead by Alpha and his assistants Beta and Gamma . Contents Background Physical Description Charles F. Muntz has white hair. He wears a white dirty short-sleeved shirt with a brown winter jacket. He wears tan pants and brown shoes and carries a cane. His eyes are aqua. He also has a mustache. Personality At the peak of his career, Muntz was a charismatic, smart and daring young man whose spirit inspired countless fans to look for adventure. However, Muntz's quest for the bird that caused his disgrace destroyed him, as his search turned to obsession. Now a ghost of his former self, he became a heartless man, consumed by bitterness, paranoid to the extreme, and convinced that anyone who comes across him is after the bird. It is implied he killed two explorers already, and was intending to do the same with Russell and Carl . It is shown that Muntz's obsession is such that he doesn't care anymore for the rest of his collection, showing no hesitation to destroy some of his unique specimens when trying to take down Carl. Muntz largely serves as parallel to Carl. He was a man broken because he lived obsessed with his former existence and desperately trying to relive it at any cost, showing what Carl could have become, had he not learned to accept loss and move on with his life. Appearances Up Charles F. Muntz was a renowned explorer and entrepreneur while Carl and Ellie were children. He often traveled in his zeppelin, "The Spirit of Adventure", with his many canine companions. Thanks to Muntz's own ingenuity, he crafted many devices in his dirigible to make his life and his dogs as comfortable as possible. He also created the communicators in their collars later so they could be able to talk to each other. During one eventful return from Paradise Falls in South America, Muntz reveals an astonishing discovery—the skeleton of "The Monster of Paradise Falls". Scientists, however, believed the skeleton was a fabrication and Muntz was publicly disgraced. He vowed to capture the creature alive and not return to the United States until he did. Almost seventy years later, he is all but forgotten on the mainland, but his sole focus is to finally capture the rare bird. He apparently discovers where it hides, a monstrous rocky labyrinth, but can't go in himself and claims to have lost many of his dogs when he sent them in to capture the bird. The time he has spent alone and concentrating only on his mission has made him extremely paranoid and dangerous. It is hinted that he has murdered other visitors to Paradise Falls whom he thinks were after the bird. Later in the film he meets up with Carl and Russell and invites them over to his zeppelin for dinner, telling them of his search for the rare valuable bird, whom Kevin is a perfect match for his description. After Russel blurts out that Kevin is his pet and the bird he's looking for, Muntz becomes convinced that they are out to take credit for the bird's existence, so he sends his dogs after them. Carl, Russell, and Dug manage to escape by getting Kevin to fly over a cliff, but her leg is injured by Alpha. That night, their location is given away by Dug 's collar and Muntz captures Kevin in a net just before she can make it back to her babies. He gives Carl the ultimatum of either rescuing Kevin or saving his house, which he has set on fire. Carl rushes to put out the blaze and Muntz easily incapacitates Russell as he gets away with Kevin. Russell, thinking Carl only cares about his home, goes off to rescue by himself, but</code> |
  | <code>As at the start of 2003, what is the make and model of the bestselling car of all time?</code> | <code>Top 50: Best Selling Cars Of All Time Top 50: Best Selling Cars Of All Time Updated on February 20, 2009 Introduction With all the millions of cars made and sold over the last 100 years, what are the best selling? This Top 50 has all the biggest sellers from around the world. The info on sales has been found all over the net to compile a current list of the big sellers. Any car with only one date and a + after the number is currently in production. The Chevrolet Camaro is not as "in production" because it not due out till Spring 09. 50. Peugeot 405 (1988-1997) - 3,461,800 50. Peugeot 405 (1988-1997) - 3,461,800 49. Peugeot 504: (1968-2005) - 3,713,400 49. Peugeot 504: (1968-2005) - 3,713,400 48. Fiat 127: (1971-1983) - 3,750,000 48. Fiat 127: (1971-1983) - 3,750,000 47. Citroen 2CV: (1948-1990) - 3,872,583 47. Citroen 2CV: (1948-1990) - 3,872,583 46. Fiat 500: (1957- ) - 3,900,000+ 46. Fiat 500: (1957- ) - 3,900,000+ 45. Pontiac Grand Am: (1973-2005) - 4,000,000 45. Pontiac Grand Am: (1973-2005) - 4,000,000 44. Ford Cortina: (1962-1982) - 4,279,079 44. Ford Cortina: (1962-1982) - 4,279,079 43. Ford Model A: (1927-31) - 4,320,446 43. Ford Model A: (1927-31) - 4,320,446 42. Opel Ascona: (1970-1988) - 4,400,000 42. Opel Ascona: (1970-1988) - 4,400,000 41. Fiat 126: (1973-2000) - 4,671,586 41. Fiat 126: (1973-2000) - 4,671,586 40. Chevrolet Camaro: (1967-2002) - 4,800,000 40. Chevrolet Camaro: (1967-2002) - 4,800,000 39. Ford Ranger: (1983- ) - 5,150,000+ 39. Ford Ranger: (1983-) - 5,150,000+ 38. Ford E-Series: (1961- ) - 5,200,000+ 38. Ford E-Series: (1961- ) - 5,200,000+ 37. Peugeot 205: (1983-1998) - 5,278,000 37. Peugeot 205: (1983-1998) - 5,278,000 36. Toyota Land Cruiser: (1953- ) - 5,300,000+ 36. Toyota Land Cruiser: (1953- ) - 5,300,000+ 35. Ford Crown Victoria: (1980- ) - 5,500,000+ 35. Ford Crown Victoria: (1980- ) - 5,500,000+ 34. Ford Focus: (1998- ) - 5,500,000+ 34. Ford Focus: (1998- ) - 5,500,000+ 33. Mitsubishi Galant: (1969- ) - 5,550,000+ 33. Mitsubishi Galant: (1969- ) - 5,550,000+ 32. Ford Explorer: (1991- ) - 5,700,00+ 32. Ford Explorer: (1991- ) - 5,700,00+ 31. Nissan Sunny: (1966- ) - 5,900,000+ 31. Nissan Sunny: (1966- ) - 5,900,000+ 30. Buick Le Sabre: (1959-2005) - 6,000,000 30. Buick Le Sabre: (1959-2005) - 6,000,000 29. Peugeot 206: (1998- 2007 ) - 6,100,000 29. Peugeot 206: (1998-2007) - 6,100,000 28. Chevrolet Cavalier: (1982-2005) - 6,200,000 28. Chevrolet Cavalier: (1982-2005) - 6,200,000 27. Vauxhall/Opel Vectra: (1988-2008) - 6,500,000 27. Vauxhall/Opel Vectra: (1988-2008) - 6,500,000 26. BMC/BL/BMW Mini: (1959- ) - 6,700,000+ 26. BMC/BL/BMW Mini: (1959- ) - 6,700,000+ 25. Ford Taurus: (1986- ) - 6,750,000+ 25. Ford Taurus: (1986- ) - 6,750,000+ 24. Fiat Punto: (1993- ) - 6,800,000+ 24. Fiat Punto: (1993- ) - 6,800,000+ 23. Renault 4: (1961-1992) - 8,150,000 23. Renault 4: (1961-1992) - 8,150,000 22. Ford Mustang: (1964- ) - 8,300,000+ 22. Ford Mustang: (1964- ) - 8,300,000+ 21. Renault 5: (1972-1996) - 8,800,000 21. Renault 5: (1972-1996) - 8,800,000 20. Renault Clio: (1991- ) - 8,900,000+ 20. Renault Clio: (1991- ) - 8,900,000+ 19. Fiat Uno: (1983- ) - 9,150,000+ 19. Fiat Uno: (1983- ) - 9,150,000+ 18. BMW 3-Series: (1977- ) - 9,800,000+ 18. BMW 3-Series: (1977- ) - 9,800,000+ 17. Vauxhall/Opel Astra: (1991- ) - 10,000,000+ 17. Vauxhall/Opel Astra: (1991- ) - 10,000,000+ 16. Mazda 323: (1963-2003) - 10,480,000 16. Mazda 323: (1963-2003) - 10,480,000 15. Toyota Camry: (1983- ) - 10,500,000+ 15. Toyota Camry: (1983- ) - 10,500,000+ 14. Chrysler Voyager: (1984- ) - 11,700,000+ 14. Chrysler Voyager: (1984- ) - 11,700,000+ 13. Oldsmobile Cutlass: (1961-99) - 11,900,000 13. Oldsmobile Cutlass: (1961-99) - 11,900,000 12. Vauxhall/Opel Corsa: (1982- ) - 12,000,000+ 12. Vauxhall/Opel Corsa: (1982- ) - 12,000,000+ 11. Ford Fiesta: (1976- ) - 12,500,000+ 11. Ford Fiesta: (1976- ) - 12,500,000+ 10. Chevrolet Impala: (1958- ) - 14,000,000+ 10. Chevrolet Impala: (1958- ) - 14,000,000+ 9. Volkswagen Passat: (1973- ) - 14,100,000+ 9. Volkswagen Passat: (1973- ) - 14,100,000+ 8. Honda Accord: (1976- ) - 15</code> |
  | <code>The 1963 film ‘The Birds’ is based on a story by which novelist?</code>                        | <code>The Birds The Birds There are no active dates for this event. Not Available Thursday Jan 21, 2016 7:30 PM - Saturday Feb 20, 2016 7:30 PM \| $18.00 - $36.00 Get Tickets The Birds a play by Conor McPherson After weeks of aerial attacks, four strangers find sanctuary from an environmental catastrophe in an isolated and abandoned lakeside cabin. But that sanctuary is disturbed by questions of what constitutes civilized behavior in the absence of civilization. Adapted for a post-9/11 world by Conor McPherson, "The Birds" based on the short story by Daphne du Maurier that inspired Alfred Hitchcock's 1963 film. "A powerful piece of theatre and a reminder of just how important that story has become... truly frightening... a night in the theatre that should not be missed." Tippi Hedren "The Birds" features Sarah Harlett, Sean Nelson, Shawn Belyea, and Meme Garcia-Cosgrove, and opens January 21 at 12th Ave Arts. The play is directed by Greg Carter and stage managed by Gabrielle Strong, with designs by Reed Nakayama, Tommer Peterson, Brendan Patrick Hogan, and Jenny Ampersand. More information at strawshop.org SPECIAL PRICES ON MONDAY PERFORMANCES Discussion</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```
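
The token statistics reported for each dataset (min/mean/max over the first 1000 samples) can be reproduced along these lines. This is a rough sketch: the tokenizer checkpoint is an assumption (the card does not restate the base model in this section), and the 512-token cap mirrors the maximum shown in the tables above.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Tokenizer checkpoint is an assumption for illustration.
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")

# Default subset and "train" split are assumptions; slice the first 1000 rows.
ds = load_dataset("sentence-transformers/trivia-qa", split="train").select(range(1000))

for column in ds.column_names:  # e.g. the question and passage sides of each pair
    lengths = [
        len(tokenizer(text, truncation=True, max_length=512)["input_ids"])
        for text in ds[column]
    ]
    print(column, min(lengths), round(sum(lengths) / len(lengths), 2), max(lengths))
```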

#### quora_pairs

* Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 14.21 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 13.43 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | sentence1                                                            | sentence2                                                                                                |
  |:---------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------|
  | <code>How long does it take for alcohol to leave your system?</code> | <code>I'm getting a drug test. How long does it take for alcohol to completely leave your system?</code> |
  | <code>Whom should one follow on Quora? And why?</code>               | <code>Which are some of the most viewed writers I should follow from every topic on Quora?</code>        |
  | <code>What happens when best friends fall in love?</code>            | <code>Love: What is it like to fall in love with your best friend?</code>                                |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 150 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.57 tokens</li><li>max: 16 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 56.68 tokens</li><li>max: 114 tokens</li></ul> |
* Samples:
  | sentence1                                                  | sentence2                                                                                                                                                                                                                                                                                                                    |
  |:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>are buses running from wellington to taunton?</code> | <code>Is there a direct bus between Wellington and Taunton? Yes, there is a direct bus departing from Wellington, Post Office and arriving at Taunton, County Hall. Services depart every 30 minutes, and operate every day.</code>                                                                                          |
  | <code>1 kwh is equal to ampere?</code>                     | <code>The electrical charge in amp-hours is equal to the energy in kilowatt-hours times 1,000, then divided by the voltage. For example, let's convert 5 kWh at 120 V to Ah. You might also want to convert watt-hours to milliamp-hours.</code>                                                                             |
  | <code>are headaches associated with ptsd?</code>           | <code>When it comes to headaches, patients with migraine or tension headaches report high rates of exposure to traumatic events. In addition, about 17% have symptoms consistent with a PTSD diagnosis. Another study found that 32 percent of OEF/OIF veterans with PTSD say that they have problems with headaches.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.75,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```
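
Each dataset above is linked at a pinned revision, so the samples shown can be matched exactly when reloading. A minimal sketch for gooaq_pairs (the column layout of the published dataset is an assumption here):

```python
from datasets import load_dataset

# Pin the same revision this card links to, so rows line up with the samples above.
gooaq = load_dataset(
    "sentence-transformers/gooaq",
    split="train",
    revision="b089f728748a068b7bc5234e5bcf5b25e3c8279c",
)
print(gooaq[0])  # expected to be a question/answer-style pair
```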

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `learning_rate`: 3e-05
- `weight_decay`: 5e-05
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 3}
- `warmup_ratio`: 0.3
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
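
As a sketch, these non-default values map onto `SentenceTransformerTrainingArguments` roughly as follows, assuming the sentence-transformers v3 Trainer API; `output_dir` is a placeholder not taken from this card:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=3e-05,
    weight_decay=5e-05,
    num_train_epochs=5,
    # cosine_with_restarts with num_cycles=3 gives three warm restarts of the
    # learning rate over the 5 epochs, after a 30% warmup phase.
    lr_scheduler_type="cosine_with_restarts",
    lr_scheduler_kwargs={"num_cycles": 3},
    warmup_ratio=0.3,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    # no_duplicates avoids repeating a sentence within a batch, which matters
    # for in-batch-negative losses such as GISTEmbedLoss.
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```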

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 5e-05
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 3}
- `warmup_ratio`: 0.3
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-testing-v3-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
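
The per-dataset loss columns in the log table below come from passing the datasets and losses to the trainer as dicts keyed by dataset name; with `multi_dataset_batch_sampler: proportional`, training batches are drawn from each dataset in proportion to its size. A minimal, self-contained sketch of that wiring (tiny in-memory stand-ins replace the real datasets, and the model/guide names are the same placeholder assumptions as above):

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

# Tiny stand-in for a real pair dataset, just to show the dict-keyed wiring.
toy = Dataset.from_dict({
    "sentence1": ["who wrote the novel the birds?", "how many players on a cricket team?"],
    "sentence2": ["Daphne du Maurier wrote the short story The Birds.",
                  "A cricket team consists of eleven players."],
})

model = SentenceTransformer("microsoft/deberta-v3-small")  # base model: assumption
guide = SentenceTransformer("all-MiniLM-L6-v2")            # guide model: placeholder
loss = losses.AdaptiveLayerLoss(
    model=model,
    loss=losses.GISTEmbedLoss(model=model, guide=guide),
    n_layers_per_step=-1,
    last_layer_weight=1.5,
    prior_layers_weight=0.75,
    kl_div_weight=1.25,
    kl_temperature=1.1,
)

# One entry per dataset section above; the same loss object can be reused
# for every dataset, which is what this card's identical loss blocks imply.
trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"nq_pairs": toy, "gooaq_pairs": toy},
    eval_dataset={"nq_pairs": toy, "gooaq_pairs": toy},
    loss={"nq_pairs": loss, "gooaq_pairs": loss},
)
trainer.train()
```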

### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step  | Training Loss | msmarco pairs loss | openbookqa pairs loss | scitail-pairs-pos loss | trivia pairs loss | gooaq pairs loss | compression-pairs loss | nli-pairs loss | vitaminc-pairs loss | scitail-pairs-qa loss | xsum-pairs loss | quora pairs loss | qasc pairs loss | qnli-contrastive loss | sciq pairs loss | nq pairs loss | sts-test_spearman_cosine |
|:------:|:-----:|:-------------:|:------------------:|:---------------------:|:----------------------:|:-----------------:|:----------------:|:----------------------:|:--------------:|:-------------------:|:---------------------:|:---------------:|:----------------:|:---------------:|:---------------------:|:---------------:|:-------------:|:------------------------:|
| 0.0250 | 147   | 16.851        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.0501 | 294   | 11.2787       | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.0751 | 441   | 8.9166        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.1001 | 588   | 7.9463        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.1251 | 735   | 7.2108        | 8.7543             | 7.8627                | 3.9920                 | 9.0511            | 7.5791           | 3.3038                 | 6.9057         | 5.7209              | 3.7859                | 4.6004          | 4.5232           | 10.5803         | 8.1650                | 10.2145         | 8.4155        | 0.3721                   |
| 0.1502 | 882   | 6.7709        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.1752 | 1029  | 6.1746        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.2002 | 1176  | 5.7706        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.2252 | 1323  | 5.7283        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.2503 | 1470  | 5.1856        | 4.8449             | 5.6524                | 2.4573                 | 5.2907            | 3.9708           | 2.0630                 | 4.3521         | 3.4988              | 1.3250                | 3.0712          | 1.5572           | 9.2457          | 12.9156               | 9.0681          | 5.0240        | 0.6368                   |
| 0.2753 | 1617  | 4.185         | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.3003 | 1764  | 4.6367        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.3253 | 1911  | 4.3615        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.3504 | 2058  | 4.1791        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.3754 | 2205  | 4.1051        | 3.9567             | 5.0910                | 1.8895                 | 4.2864            | 3.2224           | 1.4911                 | 3.2717         | 2.7198              | 0.8107                | 2.2627          | 1.1233           | 8.1039          | 9.6387                | 8.5974          | 4.0091        | 0.6653                   |
| 0.4004 | 2352  | 3.7674        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.4254 | 2499  | 3.8729        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.4505 | 2646  | 3.4527        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.4755 | 2793  | 3.3545        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.5005 | 2940  | 3.3247        | 3.4786             | 4.6194                | 1.4237                 | 3.5245            | 2.6586           | 1.1591                 | 2.7122         | 2.2260              | 0.5898                | 1.8389          | 0.9096           | 7.8180          | 4.9263                | 8.2825          | 3.3450        | 0.6920                   |
| 0.5255 | 3087  | 3.116         | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.5506 | 3234  | 3.2418        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.5756 | 3381  | 3.0757        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.6006 | 3528  | 2.8524        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.6256 | 3675  | 2.6875        | 3.0210             | 4.2169                | 1.1910                 | 3.1736            | 2.3525           | 0.8454                 | 2.4791         | 1.9743              | 0.4400                | 1.4812          | 0.7636           | 6.9316          | 1.7706                | 8.0147          | 2.9561        | 0.7013                   |
| 0.6507 | 3822  | 2.7808        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.6757 | 3969  | 2.5687        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.7007 | 4116  | 2.3034        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.7257 | 4263  | 2.4412        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.7508 | 4410  | 2.3293        | 2.7029             | 3.8574                | 1.0498                 | 2.8798            | 2.0472           | 0.5027                 | 2.3226         | 1.7957              | 0.3697                | 1.1691          | 0.6825           | 6.4047          | 1.0079                | 7.8237          | 2.6794        | 0.7122                   |
| 0.7758 | 4557  | 2.3651        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.8008 | 4704  | 2.6296        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.8258 | 4851  | 2.2108        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.8509 | 4998  | 2.1852        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.8759 | 5145  | 2.2944        | 2.3863             | 3.7141                | 0.9187                 | 2.4948            | 1.8280           | 0.4108                 | 2.0635         | 1.6387              | 0.3160                | 1.0602          | 0.6137           | 6.3538          | 0.9640                | 7.5778          | 2.3543        | 0.7283                   |
| 0.9009 | 5292  | 2.2133        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.9259 | 5439  | 2.2255        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.9510 | 5586  | 2.3502        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 0.9760 | 5733  | 1.8964        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.0010 | 5880  | 1.913         | 2.1638             | 3.4724                | 0.8628                 | 2.3711            | 1.7041           | 0.3047                 | 1.9677         | 1.4394              | 0.2668                | 0.9014          | 0.5216           | 5.9478          | 0.4572                | 1.0916          | 2.1109        | 0.7388                   |
| 1.0260 | 6027  | 1.7772        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.0511 | 6174  | 1.9079        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.0761 | 6321  | 1.8657        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.1011 | 6468  | 1.7144        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.1261 | 6615  | 1.7661        | 2.0444             | 3.3518                | 0.7724                 | 2.3691            | 1.5796           | 0.2659                 | 1.7908         | 1.3404              | 0.2244                | 0.8371          | 0.4785           | 5.7539          | 0.2737                | 0.9384          | 1.9409        | 0.7446                   |
| 1.1512 | 6762  | 1.8066        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.1762 | 6909  | 1.7438        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.2012 | 7056  | 2.0231        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.2263 | 7203  | 1.8966        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.2513 | 7350  | 1.7958        | 1.8952             | 3.1631                | 0.7215                 | 1.9967            | 1.3951           | 0.2498                 | 1.5906         | 1.2226              | 0.1778                | 0.7920          | 0.4054           | 5.4840          | 0.3951                | 0.8344          | 1.6935        | 0.7535                   |
| 1.2763 | 7497  | 1.5109        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.3013 | 7644  | 1.8119        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.3264 | 7791  | 1.6833        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.3514 | 7938  | 1.5917        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.3764 | 8085  | 1.809         | 1.7568             | 2.9011                | 0.6572                 | 1.8419            | 1.1746           | 0.2301                 | 1.5968         | 1.1435              | 0.1577                | 0.7029          | 0.3561           | 5.4334          | 0.3819                | 0.7997          | 1.5408        | 0.7594                   |
| 1.4014 | 8232  | 1.5561        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.4265 | 8379  | 1.5325        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.4515 | 8526  | 1.5085        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.4765 | 8673  | 1.5634        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.5015 | 8820  | 1.3857        | 1.6741             | 2.9235                | 0.6858                 | 1.7547            | 1.1329           | 0.2090                 | 1.4549         | 1.1384              | 0.1451                | 0.6837          | 0.3179           | 5.3092          | 0.3209                | 0.7911          | 1.4419        | 0.7597                   |
| 1.5266 | 8967  | 1.6167        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.5516 | 9114  | 1.6664        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.5766 | 9261  | 1.4785        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.6016 | 9408  | 1.5881        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.6267 | 9555  | 1.3379        | 1.5184             | 2.6945                | 0.5965                 | 1.5940            | 1.0807           | 0.1786                 | 1.3880         | 0.9942              | 0.1363                | 0.6747          | 0.3131           | 5.1157          | 0.2178                | 0.7349          | 1.2934        | 0.7797                   |
| 1.6517 | 9702  | 1.4469        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.6767 | 9849  | 1.3878        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.7017 | 9996  | 1.2764        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.7268 | 10143 | 1.3884        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.7518 | 10290 | 1.2977        | 1.3729             | 2.6562                | 0.5472                 | 1.5840            | 0.9541           | 0.1666                 | 1.3082         | 0.9676              | 0.1264                | 0.5870          | 0.2804           | 5.0991          | 0.2444                | 0.6884          | 1.2175        | 0.7709                   |
| 1.7768 | 10437 | 1.4422        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.8018 | 10584 | 1.4997        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.8269 | 10731 | 1.2797        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.8519 | 10878 | 1.2362        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.8769 | 11025 | 1.2799        | 1.2753             | 2.5103                | 0.5055                 | 1.4894            | 0.8971           | 0.1656                 | 1.2669         | 0.8756              | 0.1176                | 0.5942          | 0.2778           | 4.9792          | 0.2310                | 0.6860          | 1.0572        | 0.7817                   |
| 1.9019 | 11172 | 1.2292        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.9270 | 11319 | 1.0362        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.9520 | 11466 | 1.1851        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 1.9770 | 11613 | 1.0248        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.0020 | 11760 | 1.1305        | 1.2222             | 2.3958                | 0.5056                 | 1.5130            | 0.8715           | 0.1331                 | 1.1982         | 0.8725              | 0.1112                | 0.5418          | 0.2697           | 4.8621          | 0.1064                | 0.5749          | 1.0569        | 0.7788                   |
| 2.0271 | 11907 | 0.9284        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.0521 | 12054 | 1.0998        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.0771 | 12201 | 1.1181        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.1021 | 12348 | 0.9978        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.1272 | 12495 | 1.0565        | 1.1798             | 2.2916                | 0.4980                 | 1.4556            | 0.8206           | 0.1216                 | 1.2102         | 0.7919              | 0.1032                | 0.5051          | 0.2517           | 4.8297          | 0.1032                | 0.5631          | 0.9967        | 0.7839                   |
| 2.1522 | 12642 | 1.1317        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.1772 | 12789 | 1.0682        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.2022 | 12936 | 1.2708        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.2273 | 13083 | 1.2129        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.2523 | 13230 | 1.0530        | 1.1227             | 2.2545                | 0.4804                 | 1.2854            | 0.8153           | 0.1264                 | 1.1418         | 0.7908              | 0.0954                | 0.4886          | 0.2444           | 4.7043          | 0.1707                | 0.5329          | 0.9164        | 0.7870                   |
| 2.2773 | 13377 | 0.8897        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.3023 | 13524 | 1.1810        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.3274 | 13671 | 1.0895        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.3524 | 13818 | 1.0347        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.3774 | 13965 | 1.1473        | 1.1150             | 2.2462                | 0.4702                 | 1.2553            | 0.7755           | 0.1256                 | 1.1473         | 0.7656              | 0.0905                | 0.4639          | 0.2363           | 4.6709          | 0.1810                | 0.5232          | 0.8807        | 0.7873                   |
| 2.4025 | 14112 | 1.0026        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.4275 | 14259 | 1.0728        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.4525 | 14406 | 0.8232        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.4775 | 14553 | 1.0261        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.5026 | 14700 | 0.7961        | 1.1003             | 2.2296                | 0.4664                 | 1.2345            | 0.7562           | 0.1180                 | 1.1170         | 0.7670              | 0.0917                | 0.4637          | 0.2341           | 4.6920          | 0.1765                | 0.5210          | 0.8704        | 0.7876                   |
| 2.5276 | 14847 | 1.1167        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.5526 | 14994 | 1.1546        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.5776 | 15141 | 0.9669        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.6027 | 15288 | 1.1057        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.6277 | 15435 | 0.8680        | 1.1020             | 2.2289                | 0.4661                 | 1.2312            | 0.7601           | 0.1174                 | 1.1111         | 0.7633              | 0.0906                | 0.4617          | 0.2327           | 4.6633          | 0.1726                | 0.5167          | 0.8702        | 0.7875                   |
| 2.6527 | 15582 | 0.9528        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.6777 | 15729 | 0.9067        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.7028 | 15876 | 0.9652        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.7278 | 16023 | 0.9666        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.7528 | 16170 | 0.9773        | 1.0872             | 2.2682                | 0.5058                 | 1.3410            | 0.8008           | 0.1370                 | 1.1627         | 0.7334              | 0.0944                | 0.4740          | 0.2322           | 4.8347          | 0.2755                | 0.5541          | 0.9481        | 0.7814                   |
| 2.7778 | 16317 | 1.0145        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.8029 | 16464 | 1.1732        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.8279 | 16611 | 0.8840        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.8529 | 16758 | 0.9076        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.8779 | 16905 | 0.9472        | 1.0365             | 2.2319                | 0.4679                 | 1.3139            | 0.7752           | 0.1182                 | 1.1197         | 0.7122              | 0.0913                | 0.4742          | 0.2367           | 4.7843          | 0.2265                | 0.5349          | 0.8607        | 0.7830                   |
| 2.9030 | 17052 | 0.8681        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.9280 | 17199 | 0.7491        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.9530 | 17346 | 0.8847        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |
| 2.9780 | 17493 | 0.8441        | -                  | -                     | -                      | -                 | -                | -                      | -              | -                   | -                     | -               | -                | -               | -                     | -               | -             | -                        |

</details>
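
To eyeball the trend in these logs, here is a minimal plotting sketch (assuming `matplotlib` is available) using a handful of (epoch, training loss) points copied from the evaluation rows of the table above:

```python
import matplotlib.pyplot as plt

# A few (epoch, training loss) points copied from the evaluation
# rows of the training log above; this is not the full log.
epochs = [1.5015, 1.7518, 2.0020, 2.2523, 2.5026, 2.7528]
train_loss = [1.3857, 1.2977, 1.1305, 1.0530, 0.7961, 0.9773]

plt.plot(epochs, train_loss, marker="o")
plt.xlabel("Epoch")
plt.ylabel("Training loss")
plt.title("Training loss at evaluation checkpoints")
plt.tight_layout()
plt.show()
```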

### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.2
- Accelerate: 0.30.1
- Datasets: 2.19.2
- Tokenizers: 0.19.1
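
To verify that a local environment matches these versions, a quick sanity-check sketch (it only prints the installed versions; no particular install method is implied):

```python
import platform

import accelerate
import datasets
import sentence_transformers
import tokenizers
import torch
import transformers

# Compare the local environment against the versions listed above.
print("Python:", platform.python_version())
for module in (sentence_transformers, transformers, torch,
               accelerate, datasets, tokenizers):
    print(f"{module.__name__}: {module.__version__}")
```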

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### AdaptiveLayerLoss
```bibtex
@misc{li20242d,
    title={2D Matryoshka Sentence Embeddings}, 
    author={Xianming Li and Zongxi Li and Jing Li and Haoran Xie and Qing Li},
    year={2024},
    eprint={2402.14776},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

#### CoSENTLoss
```bibtex
@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning}, 
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
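
For context on how the three losses cited above fit together in code, here is a minimal sketch using the Sentence Transformers `losses` API; the base and guide model names are illustrative placeholders, not this model's actual training configuration:

```python
from sentence_transformers import SentenceTransformer, losses

# Illustrative checkpoints only; substitute your own models.
model = SentenceTransformer("distilroberta-base")
guide = SentenceTransformer("all-MiniLM-L6-v2")

# CoSENTLoss trains on (sentence pair, similarity score) examples.
cosent_loss = losses.CoSENTLoss(model)

# GISTEmbedLoss uses the guide model to select in-batch negatives.
gist_loss = losses.GISTEmbedLoss(model, guide)

# AdaptiveLayerLoss wraps another loss so that shallower (fewer-layer)
# forward passes of the model also learn to produce useful embeddings.
adaptive_loss = losses.AdaptiveLayerLoss(model=model, loss=gist_loss)
```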
