luismsgomes committed

Commit 9ae0ca1
1 Parent(s): 53cd65b

added trained model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
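In this configuration only `pooling_mode_mean_tokens` is enabled, i.e. the module performs plain mean pooling over token embeddings. As a rough illustration (not part of the commit), an equivalent module could be built directly with sentence-transformers; the constructor arguments mirror the keys above:

```python
# Illustrative sketch: a Pooling module equivalent to 1_Pooling/config.json.
# Only mean pooling is enabled; the other modes stay at their defaults (False).
from sentence_transformers import models

pooling = models.Pooling(
    word_embedding_dimension=768,
    pooling_mode_cls_token=False,
    pooling_mode_mean_tokens=True,
    pooling_mode_max_tokens=False,
)
print(pooling.get_pooling_mode_str())  # expected to report mean pooling
```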
README.md CHANGED
@@ -1,3 +1,131 @@
  ---
+ language: pt
  license: mit
+ library_name: sentence-transformers
+ pipeline_tag: sentence-similarity
+ tags:
+ - sentence-transformers
+ - feature-extraction
+ - sentence-similarity
+ - transformers
  ---
+
+ # {MODEL_NAME}
+
+ This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
+
+ <!--- Describe your model here -->
+
+ ## Usage (Sentence-Transformers)
+
+ Using this model is easy once you have [sentence-transformers](https://www.SBERT.net) installed:
+
+ ```
+ pip install -U sentence-transformers
+ ```
+
+ Then you can use the model like this:
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+ sentences = ["This is an example sentence", "Each sentence is converted"]
+
+ model = SentenceTransformer('{MODEL_NAME}')
+ embeddings = model.encode(sentences)
+ print(embeddings)
+ ```
+
+
+
+ ## Usage (HuggingFace Transformers)
+ Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
+
+ ```python
+ from transformers import AutoTokenizer, AutoModel
+ import torch
+
+
+ # Mean Pooling - take the attention mask into account for correct averaging
+ def mean_pooling(model_output, attention_mask):
+     token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
+     input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
+     return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
+
+
+ # Sentences we want sentence embeddings for
+ sentences = ['This is an example sentence', 'Each sentence is converted']
+
+ # Load model from HuggingFace Hub
+ tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
+ model = AutoModel.from_pretrained('{MODEL_NAME}')
+
+ # Tokenize sentences
+ encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
+
+ # Compute token embeddings
+ with torch.no_grad():
+     model_output = model(**encoded_input)
+
+ # Perform pooling. In this case, mean pooling.
+ sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
+
+ print("Sentence embeddings:")
+ print(sentence_embeddings)
+ ```
+
+
+
+ ## Evaluation Results
+
+ <!--- Describe how your model was evaluated -->
+
+ For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
+
+
+ ## Training
+ The model was trained with the parameters:
+
+ **DataLoader**:
+
+ `torch.utils.data.dataloader.DataLoader` of length 296 with parameters:
+ ```
+ {'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
+ ```
+
+ **Loss**:
+
+ `sentence_transformers.losses.CoSENTLoss.CoSENTLoss` with parameters:
+ ```
+ {'scale': 20.0, 'similarity_fct': 'pairwise_cos_sim'}
+ ```
+
+ Parameters of the fit() method:
+ ```
+ {
+     "epochs": 20,
+     "evaluation_steps": 30,
+     "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
+     "max_grad_norm": 1,
+     "optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
+     "optimizer_params": {
+         "lr": 1e-05
+     },
+     "scheduler": "WarmupLinear",
+     "steps_per_epoch": 296,
+     "warmup_steps": 592,
+     "weight_decay": 0.01
+ }
+ ```
+
+
+ ## Full Model Architecture
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
+   (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
+
+ ## Citing & Authors
+
+ <!--- Describe where people can find more information -->
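The Training section above lists the DataLoader, loss, and fit() parameters but not the surrounding code. A minimal sketch of how those pieces are typically wired together in sentence-transformers 2.x is shown below; the base-model path is taken from `_name_or_path` in config.json, and the training pairs and evaluator data are placeholders, since the STS datasets themselves are not part of this commit.

```python
# Sketch only: wiring the parameters from the README's Training section into a
# sentence-transformers 2.x training run. The data below is illustrative.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses, evaluation

model = SentenceTransformer("models/bertimbau-100m-europarl-eubookshop-ted2020-tatoeba-ct1-nli-gist10-v1")

# Placeholder STS-style pairs with similarity labels in [0, 1].
train_examples = [
    InputExample(texts=["Um homem toca guitarra.", "Uma pessoa toca um instrumento."], label=0.8),
    InputExample(texts=["Um homem toca guitarra.", "Um gato dorme no sofá."], label=0.1),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=64)
train_loss = losses.CoSENTLoss(model, scale=20.0)

dev_evaluator = evaluation.EmbeddingSimilarityEvaluator(
    ["Um homem toca guitarra.", "Está a chover."],
    ["Uma pessoa toca um instrumento.", "Faz muito sol."],
    [0.8, 0.1],
    name="validation",
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=dev_evaluator,
    epochs=20,
    evaluation_steps=30,
    warmup_steps=592,
    optimizer_params={"lr": 1e-5},
    weight_decay=0.01,
    max_grad_norm=1,
    scheduler="WarmupLinear",
    use_amp=True,
)
```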
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "models/bertimbau-100m-europarl-eubookshop-ted2020-tatoeba-ct1-nli-gist10-v1",
+   "architectures": [
+     "BertModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "directionality": "bidi",
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "output_past": true,
+   "pad_token_id": 0,
+   "pooler_fc_size": 768,
+   "pooler_num_attention_heads": 12,
+   "pooler_num_fc_layers": 3,
+   "pooler_size_per_head": 128,
+   "pooler_type": "first_token_transform",
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.39.3",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 29794
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "__version__": {
+     "sentence_transformers": "2.7.0",
+     "transformers": "4.39.3",
+     "pytorch": "2.2.2+cu121"
+   },
+   "prompts": {},
+   "default_prompt_name": null
+ }
eval/similarity_evaluation_assin-ptbr-test_results.csv ADDED
@@ -0,0 +1,2 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+ -1,-1,0.797505493096023,0.7870871975961616,0.8088892986032153,0.7871493513899368,0.8083246818005017,0.7869904315804185,0.6474502164726192,0.6318991550836346
eval/similarity_evaluation_assin-ptpt-test_results.csv ADDED
@@ -0,0 +1,2 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+ -1,-1,0.8123595567827149,0.8118871090104925,0.8116259726090579,0.8051506470877671,0.8107882802834672,0.803989012900083,0.6694048031583908,0.6646363899363764
eval/similarity_evaluation_assin2-test_results.csv ADDED
@@ -0,0 +1,2 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+ -1,-1,0.8512185310133835,0.821371592641844,0.8376831265075906,0.8198856969612986,0.8372190804305725,0.8198364991756667,0.7320554297872173,0.673074806557855
eval/similarity_evaluation_iris-sts-test_results.csv ADDED
@@ -0,0 +1,2 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+ -1,-1,0.8320663710716895,0.8290070034535971,0.8136610461417283,0.820404698869159,0.8142140755249484,0.8205877578208962,0.7540782569575478,0.7742764262431345
eval/similarity_evaluation_stsb-multi-mt-pt-test_results.csv ADDED
@@ -0,0 +1,2 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+ -1,-1,0.8283548710481264,0.8350147216142669,0.8183264555391654,0.8282574060643441,0.818740994296916,0.8286212541695588,0.7096061091114463,0.7009164046023304
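Each of these CSVs reports Pearson and Spearman correlations between the gold similarity scores and four embedding similarity functions (cosine, Euclidean, Manhattan, dot product), which is the output format of sentence-transformers' EmbeddingSimilarityEvaluator; `epoch,steps = -1,-1` denotes an evaluation of the final model. A hedged sketch of recomputing just the cosine columns, using placeholder sentence pairs rather than the actual ASSIN/IRIS/STSb test data:

```python
# Illustrative sketch: recomputing the cosine_pearson / cosine_spearman columns.
import numpy as np
from scipy.stats import pearsonr, spearmanr
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("{MODEL_NAME}")  # placeholder, as in the README

sent1 = ["O gato dorme.", "Está a chover.", "Ele comprou um carro."]          # placeholder pairs
sent2 = ["Um gato está a dormir.", "Faz muito sol.", "Ele adquiriu um automóvel."]
gold = np.array([4.8, 1.0, 4.5])                                              # placeholder gold scores

emb1 = model.encode(sent1, convert_to_numpy=True)
emb2 = model.encode(sent2, convert_to_numpy=True)
cosine = np.sum(emb1 * emb2, axis=1) / (
    np.linalg.norm(emb1, axis=1) * np.linalg.norm(emb2, axis=1)
)

print("cosine_pearson:", pearsonr(gold, cosine)[0])
print("cosine_spearman:", spearmanr(gold, cosine)[0])
```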
eval/similarity_evaluation_validation_results.csv ADDED
@@ -0,0 +1,201 @@
+ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
2
+ 0,30,0.8322851701740984,0.8363629102805482,0.8138593083664031,0.8280569421838662,0.8134167474110882,0.8272842960974248,0.6452088909892917,0.6374541532527893
3
+ 0,60,0.8359083019729926,0.8399552180476482,0.8179900104755258,0.8326956188182227,0.8175778185476443,0.8320809913140648,0.6387384061397987,0.6307540750689943
4
+ 0,90,0.8389796715875066,0.8434634244556917,0.8220160970785708,0.8375970161827517,0.8216643235812339,0.8370409150522887,0.6281243546744145,0.6203114562936157
5
+ 0,120,0.841823857318211,0.8465849905282007,0.8260174625106007,0.8421060656118751,0.8257551420219197,0.8415681905601372,0.614651026011903,0.6059915869294137
6
+ 0,150,0.8451535892384073,0.8501250885658712,0.8287131598071891,0.8453197520343929,0.8285045729951087,0.8449358194104674,0.6108484512594845,0.60312334700184
7
+ 0,180,0.8473031577264061,0.8524656520972039,0.8315847056737622,0.8484500616139319,0.8314528343181907,0.8481783084887634,0.6049034344028199,0.597357595390723
8
+ 0,210,0.8499077503441006,0.8547641270924192,0.8345812574810333,0.8511783841558106,0.8344379066093558,0.8508302673156465,0.6091703483470595,0.6022981078425962
9
+ 0,240,0.8526006998744345,0.8579076712198279,0.8357788211366195,0.8523424721288725,0.8356309995761605,0.8519062426259959,0.6252966866298822,0.6195863252025061
10
+ 0,270,0.8515359925346406,0.8588755488066585,0.8352580168753616,0.852934837608861,0.8350730057191043,0.8525788279522933,0.6244984379752275,0.6238578374256732
11
+ 0,-1,0.8504883332312555,0.8587687734220929,0.8329778014986754,0.8514867895196923,0.8328363925714148,0.8510889413834171,0.6337926476352447,0.634996957717517
12
+ 1,30,0.853733723761878,0.8616401890755507,0.8356641371207333,0.8540430360699834,0.8354830386905892,0.8536084608699929,0.6479751439414021,0.6485983887586186
13
+ 1,60,0.8567383018763806,0.8639745194118222,0.8398378676098514,0.8580061392404276,0.8395718210925442,0.8574380343900709,0.6455270857988665,0.6463169244819061
14
+ 1,90,0.859048170244718,0.8650091968789198,0.8430276658387651,0.8604469898003603,0.842656546556767,0.859704940094431,0.64863066418691,0.6462447503325711
15
+ 1,120,0.8603629643207656,0.8667888104562731,0.8443033173649698,0.861459344442591,0.8440197185324108,0.8608497618501468,0.6564303834672144,0.653212267976446
16
+ 1,150,0.8619157982922627,0.8678163848831991,0.8444187807251964,0.8627990135407582,0.844234377465743,0.8623893014898161,0.6535498684858537,0.6525363740496276
17
+ 1,180,0.8635875888567742,0.8689049566213443,0.845722619075195,0.8637271515788987,0.8455296627911301,0.8632699913219003,0.664689920220935,0.6618617178787027
18
+ 1,210,0.8616832472110949,0.8681393623680358,0.8447072703098245,0.863350504998522,0.844422155100577,0.8628052900510578,0.6551322829076461,0.658168489631066
19
+ 1,240,0.8654951810311227,0.8702304029345357,0.8486685044186173,0.8659004226855941,0.8484098482065111,0.8653191188682531,0.6746383336337832,0.6724006600859571
20
+ 1,270,0.8670550921291318,0.8709046021264455,0.8506950809147474,0.8677438021166892,0.8504025772647033,0.8671594652298709,0.6745734542879068,0.6729972185180928
21
+ 1,-1,0.864657057517508,0.8717377006323449,0.8474672162248419,0.8656004133100206,0.8472019254058731,0.8650368052364672,0.6836995856201917,0.6880705838097012
22
+ 2,30,0.8681706604650294,0.8706296456149331,0.8492633748640945,0.8667451914954464,0.8489541204520724,0.8660840868497326,0.6930356465686219,0.6940698150377265
23
+ 2,60,0.8648589945643922,0.8725108404960294,0.847561670818378,0.8665125703587773,0.8472581582967889,0.8659513800517498,0.698017228300417,0.7017344690492068
24
+ 2,90,0.865899832829438,0.8727402611490769,0.8490906225217113,0.8672270558365339,0.8487972379764834,0.8666230675858156,0.7154729316813506,0.7155118753802615
25
+ 2,120,0.8656718557957958,0.8725599437645171,0.8471113535652466,0.8665083365645385,0.8469406892898965,0.8661597940887859,0.7073090303348463,0.7133161943145285
26
+ 2,150,0.8677048790946739,0.8724672283475055,0.8472417035054405,0.8658841795284913,0.8470357406588478,0.8655639356279451,0.7132455722955827,0.7157497110146075
27
+ 2,180,0.8683276664634653,0.8727664468754576,0.8480025600380098,0.8662456174520831,0.8478317289146452,0.8659238347031408,0.7096601206739671,0.7115672929931217
28
+ 2,210,0.8696201917335699,0.87410187767415,0.8512816468260378,0.8696580082670462,0.8511187761083003,0.8692865502345468,0.6901196134743961,0.6982570872874373
29
+ 2,240,0.8672424615740634,0.8734738165702655,0.8480033826198282,0.8669532207394632,0.8476474075178239,0.8663801556375526,0.6964892644821843,0.704751683585664
30
+ 2,270,0.8690782754971567,0.8736616076674139,0.8473847070296028,0.8667550287259104,0.8470878166806717,0.8662722998967446,0.7119475871307871,0.7169674362424959
31
+ 2,-1,0.8706375043567408,0.8745615379421668,0.8500085404381283,0.8686690354102832,0.8497792968400658,0.8682385989781212,0.7130957807025344,0.7155999140725332
32
+ 3,30,0.8717464419066847,0.8731823896196205,0.8495294192747257,0.8677057213013472,0.8492944626402202,0.8672541569568815,0.7202529354413714,0.7242006988761645
33
+ 3,60,0.8696280040588112,0.8734496248747761,0.8487556632606321,0.867627299307259,0.848472976545704,0.8672064795527621,0.7212758207670664,0.7263197352901031
34
+ 3,90,0.8704626346278884,0.872702677853194,0.8492641049485785,0.8679385008899949,0.8489782962989305,0.8674533424021306,0.7285081645456023,0.7305441249454961
35
+ 3,120,0.8681741120766843,0.8738488442967919,0.8467582603683161,0.8663793489563475,0.8464625868162601,0.8660746615092006,0.7291998078784351,0.7367266071000265
36
+ 3,150,0.8699998938266407,0.8741320800518513,0.8478571985776455,0.8679097934282726,0.8475786993983103,0.8675314601060478,0.7169509004567984,0.7253718533474561
37
+ 3,180,0.8716424876092006,0.8748717090259934,0.8508734067421496,0.8702463807855423,0.8506391842178127,0.8698776176987031,0.7220296401348467,0.7272356041664194
38
+ 3,210,0.8703792534234882,0.8751137088982174,0.8472593472477613,0.8676780059541954,0.8468603647517878,0.8671173779361027,0.7275888039801884,0.7361833737884568
39
+ 3,240,0.8705951260278945,0.8755679597371027,0.8475717557169252,0.8683953865387198,0.8471956019305247,0.8678497181132738,0.7274141209103284,0.7338949829778798
40
+ 3,270,0.8716059347826446,0.8761113518091621,0.8492470947201857,0.8686668582241014,0.8489548262735624,0.8682209992295006,0.7359804386288141,0.7372176512140489
41
+ 3,-1,0.874034608894151,0.875983224166378,0.8517461912253025,0.870948739262503,0.8513840959449784,0.8704463807389766,0.7330130585398112,0.7326258343028589
42
+ 4,30,0.8721646853006318,0.8747650951858813,0.8477371455773026,0.867715413183444,0.8473657806248837,0.8671977172536652,0.748419270900773,0.7490264116730764
43
+ 4,60,0.8716640318138601,0.8749991022083802,0.8477451192567165,0.8683745582093735,0.8474425704738453,0.8680256180907809,0.7399995409301934,0.7464675085895071
44
+ 4,90,0.8731616697354024,0.8755753736667505,0.8484347636906028,0.8694396626553307,0.8482206505780999,0.8691629906089424,0.7384877562209647,0.7451238246074541
45
+ 4,120,0.871788896757648,0.8752607631850613,0.8444971120070595,0.8655996782420675,0.844012693475071,0.8649532905666425,0.7407383580545445,0.747357808279925
46
+ 4,150,0.8712137371461914,0.8760531634485154,0.8461140466750598,0.8677793708696516,0.8458071069969728,0.8674057936570105,0.7463211300926662,0.7534864958795606
47
+ 4,180,0.8715432244171926,0.8757479123063504,0.8478296947496472,0.8692769050268325,0.847637373593701,0.8690120397043916,0.7466079540077222,0.7522040418251817
48
+ 4,210,0.8715133254934789,0.8750828675749959,0.8449133180014928,0.8661115912502695,0.8444035240944078,0.8653530076686949,0.749334392848112,0.7549498946186497
49
+ 4,240,0.8729831398629461,0.8758676138412708,0.8502634161752007,0.870312364140021,0.8498847658543315,0.8697401294245993,0.7504859098626006,0.7514781859603654
50
+ 4,270,0.872515899389117,0.8759898275508545,0.8479737487144918,0.8686358659492326,0.8476139776156392,0.8681165496358743,0.7473000180280067,0.7513698211185128
51
+ 4,-1,0.8735929883050277,0.8761994084167914,0.8498903892157856,0.8706494369913735,0.8496350734108115,0.8702190572081901,0.7441958771712808,0.7499324477527695
52
+ 5,30,0.8715026661291027,0.8741743853191009,0.8440535023140523,0.8647589293074054,0.8437241039510758,0.8643222878038207,0.7629222330032406,0.7684143924671427
53
+ 5,60,0.8725680212827568,0.8750752044927753,0.8469547464050495,0.8673088684673244,0.8465809223005049,0.8667834982290562,0.7559964334395919,0.7625983264731161
54
+ 5,90,0.8726450734008026,0.87508310051798,0.8460573622159043,0.8660978691656673,0.845756724840125,0.8656490682163813,0.7605179609622799,0.7657132058047015
55
+ 5,120,0.8710795478882216,0.874179393625888,0.8422159698205157,0.8637893747763983,0.8418564117670029,0.8632899239948084,0.7586449921298652,0.7671835812355808
56
+ 5,150,0.8716827464961987,0.8755280682521285,0.8445691567487632,0.8658716842868758,0.8442567500209193,0.8654710806147627,0.754779886830394,0.7631774548186254
57
+ 5,180,0.874180238215969,0.8764760453675834,0.8481055228908381,0.8690221482197008,0.8479170910779165,0.8687348470657469,0.7536753743573027,0.7601479616581933
58
+ 5,210,0.8755975064050713,0.8772847820888424,0.8482193087873285,0.8682291587586889,0.8480148120823062,0.8679240090630638,0.7557579431565824,0.7625685897817704
59
+ 5,240,0.8736222939102216,0.8761855804548861,0.8464901282677513,0.8668393129201615,0.8462959568202051,0.8664743705754011,0.7597847346405142,0.7673765440811869
60
+ 5,270,0.8746452468171153,0.8774046520115609,0.8478050017628503,0.868574924829086,0.8475853129804923,0.8682678426638953,0.7546350133855333,0.7614365088667513
61
+ 5,-1,0.8744281504918611,0.8768243700996833,0.8477914061486698,0.8684759276543758,0.8475749142037852,0.8681550649620041,0.7499358727635553,0.7573913856609128
62
+ 6,30,0.8731131451553744,0.8755520983926323,0.8459737897412638,0.8666472869120906,0.845809182333859,0.8663780919732719,0.7562627683089949,0.7652042114334956
63
+ 6,60,0.8734245506778948,0.8761117350109837,0.8450708916855868,0.865559657459838,0.8448603258320637,0.8653718837334502,0.7667365700649705,0.7750886618284756
64
+ 6,90,0.8734708351037948,0.8765821525926444,0.8451774703846332,0.8657943176781214,0.8448587474884025,0.8653099859110505,0.766465765225242,0.7753310276168732
65
+ 6,120,0.8745148142537678,0.8771798885653669,0.8470852165907438,0.8673583146161554,0.8468025493009838,0.8668416617958009,0.7695595059897603,0.7768463731010513
66
+ 6,150,0.8729371681751737,0.8762794514595424,0.8447375187755392,0.8662319787449448,0.8443495614950037,0.8656404692236458,0.7584200163045876,0.7685936191629437
67
+ 6,180,0.8727785212190349,0.8763542916622564,0.8442930384994825,0.8658190022472828,0.8439241451693089,0.8652782471645618,0.7635181602047239,0.7745998662857293
68
+ 6,210,0.8735132681895587,0.8771625650250784,0.8470561361906387,0.868122847930934,0.8467047729636995,0.8677136386096974,0.7647195197690141,0.7747272323195196
69
+ 6,240,0.872701776939754,0.8760850661463602,0.8443874229772024,0.8652080749778408,0.8439716630379382,0.8647147076791133,0.7668171019428996,0.7767282975033326
70
+ 6,270,0.872352110904912,0.8768622595694499,0.8448598797769629,0.8661267634266435,0.844451796594441,0.8655743482599869,0.7635953115714778,0.7741259102066167
71
+ 6,-1,0.8726114880066699,0.8762969155484402,0.8450689189091484,0.86643819949768,0.8448721591342244,0.866108805465803,0.76180870738396,0.7722106374101769
72
+ 7,30,0.872752164790648,0.875881507208981,0.8448134046589963,0.8659864597577627,0.8445625139211742,0.8655943670290019,0.7701389333909774,0.780293343369377
73
+ 7,60,0.8732831443802198,0.8760768171278075,0.8446577699907647,0.8659680187292332,0.8444440479180516,0.8656073501964214,0.7703717353079215,0.7808353030321067
74
+ 7,90,0.8740793060489714,0.8766189847184201,0.8446437173175021,0.865433353651933,0.8443936714424348,0.8650564711148852,0.7759126936700046,0.7858138809090515
75
+ 7,120,0.8734798333562886,0.8767236187180025,0.8456245077701061,0.8671041912659234,0.8454082150460717,0.8667394371709409,0.7709791892797279,0.7817374906203052
76
+ 7,150,0.873105446522849,0.8768263783123756,0.8461277642473085,0.8670864789696097,0.8458869498093838,0.8666981989390828,0.773895812383022,0.7837922592884505
77
+ 7,180,0.8736347692238983,0.8765854594452052,0.8464227840399928,0.8672096058548375,0.8463202041770045,0.8669358240749601,0.7790186652388624,0.7881846026086858
78
+ 7,210,0.8741600266843996,0.8772400645414664,0.8467159269757447,0.8675353819525465,0.8466205011607382,0.8673399579839003,0.7802206213258956,0.7898706423210047
79
+ 7,240,0.8735719630441738,0.8758355419130263,0.845310557858048,0.8658096630963058,0.8451374812127498,0.8656021283008624,0.7721974365910271,0.7827133921264214
80
+ 7,270,0.8741880528851422,0.8765224230482299,0.8474405560693338,0.8676068787392678,0.8473154289282286,0.8674353077464245,0.7697025966610519,0.7803007480073347
81
+ 7,-1,0.8726398060652618,0.8756013732270297,0.8434704366512188,0.8635185229822414,0.8431948770190126,0.8631042922651697,0.7773630448755915,0.787284834531773
82
+ 8,30,0.8736752388999712,0.87630110584394,0.8443364612526596,0.8648954301737767,0.8442493307071213,0.8647801390432861,0.7777322847119971,0.7889487916042908
83
+ 8,60,0.8734653664757374,0.8760068871392601,0.8428416346972151,0.8638155829951014,0.8426415444595056,0.8635690329341777,0.7790164806709908,0.7912006699172555
84
+ 8,90,0.872959504433319,0.8763735611384939,0.8426143039133728,0.8639997425690003,0.8424162415379615,0.8637032873241351,0.7806644642295464,0.7925383586270383
85
+ 8,120,0.8738470498119315,0.8765625063218832,0.8438846256213084,0.8644299518370947,0.8437084603027086,0.8641929463076493,0.7776456885419429,0.7895141762039727
86
+ 8,150,0.872608703011717,0.8755955984910411,0.84270275583869,0.863823292463229,0.8426185467078263,0.8636286132042914,0.7730664675218146,0.7870222299966838
87
+ 8,180,0.8741966286927177,0.8766674839381356,0.8453017940563211,0.8660228850032107,0.8452160114548164,0.8658616722795462,0.7750621480781759,0.788188793774774
88
+ 8,210,0.8731481869230125,0.8763220060291039,0.8446714263259387,0.865272793753289,0.8445222319268776,0.8651201196254319,0.776675918166155,0.7883140475868688
89
+ 8,240,0.8717967024585809,0.8750934748637087,0.8422599626829861,0.8630793741143012,0.8420727753670084,0.8627989590525273,0.7818642033075879,0.7928318819510609
90
+ 8,270,0.8738876844426147,0.8760332686670768,0.846890382428735,0.8670455923651136,0.8467921272076463,0.8668881914451716,0.7824133741272792,0.7910069201015023
91
+ 8,-1,0.8733309237645002,0.8762260822782261,0.8454556503339182,0.8656191072871553,0.845223278977124,0.8653311113494556,0.7803878452244052,0.7898088419098229
92
+ 9,30,0.8742115160375411,0.8764695547419012,0.8456587875161872,0.8654343293520863,0.8455113082734613,0.8652692371046579,0.7876161627119033,0.7952928149532678
93
+ 9,60,0.8728483767942347,0.8761182525332857,0.8423760873094708,0.8631997943133052,0.8422269897450455,0.8629120023541135,0.7862986817467977,0.7970291389086677
94
+ 9,90,0.8745018360324662,0.8769115279805448,0.8451135225010238,0.8652283586474558,0.8449701547668445,0.8649770605128779,0.7862969703415731,0.796672500374531
95
+ 9,120,0.8731397382003885,0.8761196791564063,0.8435683066714063,0.8647375245394907,0.8434585250600499,0.8644537807586948,0.7814147232641175,0.7936941256592354
96
+ 9,150,0.8738796432569612,0.8765072470339245,0.8434538295572795,0.8641227766050487,0.8432845914117152,0.8637754872044372,0.7850185444266166,0.7966080761316443
97
+ 9,180,0.8744384414434294,0.8772596799447842,0.8452040018879057,0.865885339763893,0.8451913974291608,0.865712951927649,0.7891300051276464,0.7995533220965247
98
+ 9,210,0.8739406572031633,0.8773436201060557,0.8456884231669974,0.8665183704987404,0.8456151929763283,0.8664108809172258,0.7850196969411041,0.7963633615390501
99
+ 9,240,0.8747883825537482,0.8771148567609433,0.8462814861309158,0.8663811913698752,0.8461795717706164,0.8661439189647382,0.784806643687839,0.7965613835856572
100
+ 9,270,0.8738788719520503,0.87664843771897,0.8453539194541034,0.8662080283582607,0.8452875285832768,0.8660853864534248,0.7819794554184895,0.7948690161933978
101
+ 9,-1,0.8737812068694487,0.8767407593886831,0.8435122166820381,0.8643273510864229,0.8434742924069328,0.8642217731781988,0.7834730418634362,0.795945416823432
102
+ 10,30,0.8739268394326385,0.8765787746945736,0.8451523978200625,0.8659328344668958,0.8450518031941152,0.8657455041988548,0.7837411225567616,0.7964124229374494
103
+ 10,60,0.8734414332194642,0.8765260132661088,0.8442240291053449,0.8645746877969024,0.8441266037157799,0.864470403530405,0.7868692895935984,0.7994799440485015
104
+ 10,90,0.8739458837427029,0.8768022059635556,0.8448852144866038,0.8652601958459941,0.8448423371126742,0.8652273021672975,0.790056994607373,0.8018984300611165
105
+ 10,120,0.8742991815127288,0.8770369301018413,0.8448211384183807,0.8652111827225483,0.8447495594045958,0.8650783723819526,0.7899563055281489,0.802686323758534
106
+ 10,150,0.8739283252250019,0.8765911213555232,0.843860558775669,0.8640480489498813,0.8437014145200279,0.8638243288182681,0.7917384423110577,0.8043615529709318
107
+ 10,180,0.8742229702854495,0.8766953561858567,0.8430135357154739,0.8630525747766009,0.8428639538078093,0.8628313827502447,0.7895872907098195,0.8020829191717245
108
+ 10,210,0.874290157861953,0.8771106226337066,0.8447872186513088,0.8648589564077374,0.8445742085292284,0.8645987113139961,0.7868885928026186,0.7997872891375116
109
+ 10,240,0.8738716672370578,0.876667170038468,0.8434891984676942,0.8637824174313948,0.8433627666161352,0.8636220879984808,0.7898200973268285,0.8028483129119685
110
+ 10,270,0.8742413909475067,0.876692309690106,0.8448108301811401,0.864802239119454,0.8447424398748818,0.8646602761779022,0.790034682011746,0.8020323168007634
111
+ 10,-1,0.874729133023416,0.8773609588468395,0.845162627706018,0.8648306090049428,0.845043226016315,0.864741074683763,0.7892324635297859,0.8016540281902851
112
+ 11,30,0.8744395208945591,0.8764853865620299,0.8448800153232157,0.8644858973409666,0.8448363732946265,0.8643919392408439,0.7901433362960126,0.8030135906112598
113
+ 11,60,0.8736126190358874,0.8763788651330996,0.8430584608615206,0.8635861519378806,0.8430216695150149,0.8634979171408176,0.7914955967062214,0.8050238774251804
114
+ 11,90,0.874712803475192,0.8771908214415775,0.8459618378012361,0.8656176544319998,0.845980482355434,0.8656633327504097,0.7941777322294065,0.8051736015625439
115
+ 11,120,0.8738442181805751,0.8762484476886458,0.8437876819849661,0.8634284242228522,0.8436974367599696,0.863234541318056,0.7959327359431883,0.8072751354176504
116
+ 11,150,0.8732909584511138,0.8758484161781191,0.8433350027194445,0.8631597331035029,0.8432460878108657,0.8629886301298793,0.791156223623971,0.8037149231670166
117
+ 11,180,0.8735381981897224,0.8761851651754979,0.8446095476589229,0.8644240903891742,0.8445118238021148,0.8642674715980145,0.7903878567447731,0.8026033443429206
118
+ 11,210,0.8745229808732458,0.876775384779411,0.8460863356958649,0.8656429589587749,0.8459981524122181,0.8655216855303486,0.791687774171323,0.8037253083951305
119
+ 11,240,0.8731278530844621,0.8757256909651154,0.8443238481898749,0.8644369818054741,0.8442392203790182,0.8642747541994311,0.7905284094097971,0.8033999895549045
120
+ 11,270,0.873589650845775,0.8761584603540238,0.8452167046443072,0.8653783107758809,0.8450518427038479,0.8651670569073878,0.7886724460493845,0.8014390104553935
121
+ 11,-1,0.8733198440825437,0.8764238580786797,0.8449180575343127,0.8653725790814417,0.8447507165506497,0.8651955535738217,0.7903621913181147,0.8031150824725363
122
+ 12,30,0.8741840037021947,0.8765736202082085,0.8452507681974968,0.8653608375465836,0.8451666586002543,0.8652528490271618,0.7957069897107767,0.8076024761943525
123
+ 12,60,0.8734995951650874,0.8760638054927309,0.843635621019458,0.8640403891462052,0.8434770008590465,0.8638688854684767,0.7938560997840315,0.8068956362428534
124
+ 12,90,0.8739217653826081,0.8764197798000193,0.8437255336996093,0.8640082931472899,0.8435771263643532,0.863828711475533,0.7947129568014741,0.8075196890058095
125
+ 12,120,0.8739060691072909,0.8763033213296809,0.8430492315999871,0.8632556591182448,0.8429338223667889,0.8630835735947098,0.7941500183221754,0.807260324805643
126
+ 12,150,0.8739023044197453,0.876250985466929,0.842567863027284,0.8627919891510933,0.8424068308156527,0.8626076863901248,0.7902083862449383,0.8045863198769542
127
+ 12,180,0.8741015096489819,0.8769277847566461,0.8425965597276862,0.8626656711854364,0.842477751676237,0.8625302727960851,0.7930270848744876,0.8067447896954814
128
+ 12,210,0.8735996102125021,0.8762456864194244,0.8421780984247669,0.8624695370657475,0.8420245667575067,0.862354476515733,0.79324966784902,0.8071327371117846
129
+ 12,240,0.8743589932622585,0.8770415657432454,0.8429436452632522,0.8632519089202978,0.8428390372177559,0.8630639425154621,0.7936314806744128,0.8075608027588589
130
+ 12,270,0.8748432955552992,0.8771378056384304,0.842746685052799,0.8624695432634656,0.8426009328707449,0.8622538211033097,0.7955772749623662,0.8090423689566718
131
+ 12,-1,0.8737594055216662,0.8763635974502806,0.8415342868372827,0.8618517897161171,0.8413692585264498,0.861614585367416,0.7893427501450107,0.8045618916250635
132
+ 13,30,0.8742640663084832,0.8765540801052748,0.8416705508674898,0.8615758535703465,0.8415016079046799,0.8613639715864416,0.7940948319297804,0.8078879550845479
133
+ 13,60,0.8746861660923403,0.8770904964272747,0.8418976797191559,0.8619730212311394,0.8417437219148076,0.8617014693625036,0.7968917275676146,0.8106703035903657
134
+ 13,90,0.8745937826828812,0.876708132316512,0.8427590093322863,0.8628813975777948,0.8425718854874923,0.8625778463695238,0.7942922428991699,0.8084150528692122
135
+ 13,120,0.8743559879744027,0.877080701336031,0.842490615462284,0.8626631854817346,0.8423471795130216,0.8624849637165471,0.7931970654643229,0.808080125742213
136
+ 13,150,0.8740474559239566,0.8766690385959246,0.8424781135135865,0.8624945483321333,0.842322978963036,0.8622869277921996,0.7974463547884585,0.8115179644052358
137
+ 13,180,0.8743225624951955,0.8771318273087894,0.843094568861492,0.8633868602413942,0.8429680500700946,0.863243930066955,0.7956343868015376,0.810156014162781
138
+ 13,210,0.8739090413637819,0.8767441083376338,0.842096223452927,0.8621868581291442,0.8418943241624041,0.8618693287878778,0.7960514203420027,0.8107646052501317
139
+ 13,240,0.8743279245524971,0.8767651891494999,0.8428048719867615,0.8626126172741116,0.8425976408674083,0.8623332726260845,0.7954948313981788,0.8093807623158403
140
+ 13,270,0.8737611798975509,0.8766236092388645,0.8420420282590052,0.8620255992786855,0.8418121630227284,0.8616720551520279,0.795881788146394,0.8103803166318484
141
+ 13,-1,0.8738972901944388,0.8765465733223068,0.8421746664712154,0.8619960634454955,0.8419426229739901,0.8617777025834406,0.7971216945537317,0.8115609701523712
142
+ 14,30,0.8732930217941399,0.8760397921625057,0.8424060457289076,0.8625886770497231,0.842229312245652,0.8623921891955845,0.7977751400537706,0.8119950006846178
143
+ 14,60,0.8736591655134842,0.8760730920009235,0.8421822453659418,0.8621615727480609,0.842013645779908,0.8619277900590556,0.7983266183787643,0.8118019295859552
144
+ 14,90,0.8738901174342243,0.8763690898548733,0.8427593309495758,0.8624480596804724,0.8425843345428388,0.8622313210172917,0.7992701221211256,0.8122934178642016
145
+ 14,120,0.8737619472880575,0.8764945654333502,0.8427565624478454,0.8629967572151471,0.8426236116223941,0.8628139850935299,0.7967524851910167,0.8109085307788172
146
+ 14,150,0.8741087612118447,0.8765839635945457,0.8424710204662718,0.8621811661484786,0.8423344641783118,0.8619946657934532,0.7975217659624957,0.8117943152290947
147
+ 14,180,0.8740372827018623,0.8766618469387357,0.842822175826698,0.8627858916311151,0.8426828778850382,0.8625511542389589,0.7948716967591652,0.8096228136203736
148
+ 14,210,0.8735568199983754,0.8763308761067256,0.8412727727773364,0.8612684347160883,0.8411133806852761,0.8610858847380549,0.7967137796343879,0.8116366636565011
149
+ 14,240,0.8740066873142988,0.8766539829576769,0.8417618314230251,0.8617313073416448,0.8416160507024327,0.8615485920831853,0.7977443085047342,0.8122473514806677
150
+ 14,270,0.8740368763589274,0.876702373957126,0.8424860159387353,0.8625550917331589,0.8423217591071993,0.8623176688721722,0.7950954486571128,0.8101782250379198
151
+ 14,-1,0.8736814723100198,0.8764652902722404,0.8419198322040803,0.8619282272129393,0.8417507041963732,0.8617225537003844,0.7968242694008139,0.8116364952852061
152
+ 15,30,0.8739117716140442,0.8767731607501009,0.8427775120608849,0.8628470351860439,0.84262972157515,0.8626675000812688,0.7989471544646534,0.8134625595261366
153
+ 15,60,0.8741350867653501,0.8767923024089739,0.842654402249565,0.8628185634720604,0.8424770054568562,0.8625661110811184,0.7987200964177814,0.813191905513992
154
+ 15,90,0.8738451914591725,0.8764620451138537,0.8421907503263014,0.8624327103825568,0.8420084787802887,0.8621835315825981,0.7966239317188754,0.8114855883214187
155
+ 15,120,0.8740667249748921,0.8767911230196183,0.8424764243215828,0.8628324195057302,0.8422956912513274,0.8625715103745928,0.7951948481748329,0.8101983250524758
156
+ 15,150,0.8741262530821848,0.8766855705462218,0.8423158999058018,0.8625719670168633,0.842165493475616,0.8623416490008021,0.7976583367484525,0.8123788627484364
157
+ 15,180,0.873703067110387,0.8764443800924853,0.8412754777270272,0.8616082137883053,0.8411179396090636,0.8612813221089427,0.7987046636997496,0.8135947212854184
158
+ 15,210,0.8739885687675195,0.8766608513122435,0.8422394556708288,0.8626452441432295,0.8421193491919747,0.8625033276018527,0.7982012021415495,0.812858384785726
159
+ 15,240,0.8742723193914316,0.8766281858977177,0.8424815265193072,0.8626003937657313,0.842368379098127,0.8624414332308189,0.7989903727385239,0.8131533258245089
160
+ 15,270,0.8744452820269333,0.8766196130344898,0.8425808956784966,0.862555839308865,0.8424709152158923,0.8623654047989867,0.799210267952171,0.8130474847123123
161
+ 15,-1,0.8742525943828982,0.8766711462198254,0.8425067592519159,0.8626695583085394,0.8423981003329453,0.86252613931338,0.7967845269188123,0.8113549915397174
162
+ 16,30,0.8741083833685421,0.8764726951753133,0.841996930240608,0.8621175677404158,0.8418865270848974,0.8619406551123091,0.7970530328911843,0.8119033792827557
163
+ 16,60,0.8742973743303535,0.8765517546055354,0.8415509593375978,0.8613571597167183,0.8413954510719622,0.8610677326312046,0.7985793016815893,0.8131429833350187
164
+ 16,90,0.8747107047620232,0.876761007713913,0.8428615449595486,0.862626660868499,0.8427165784208666,0.8624065336287268,0.7975966680606474,0.8117590540379659
165
+ 16,120,0.8745961471556334,0.876884700689192,0.842020366548013,0.8617582501514934,0.8418390230318848,0.8615324017053367,0.7990830302847949,0.8134551240222687
166
+ 16,150,0.8743864540736546,0.8767015594155998,0.8414619525921508,0.861279882133652,0.8412703811644857,0.861070034253795,0.7988399464810163,0.8132784707385313
167
+ 16,180,0.874047732259652,0.876472513164202,0.8409830148761016,0.8610005308946617,0.8408149475992571,0.8607834916479965,0.7981950198595147,0.8130271690576117
168
+ 16,210,0.8741511777568592,0.8765144241580586,0.8423159798390336,0.8624476369242536,0.8421514608353549,0.8622115546815862,0.7972686577211566,0.812344234834102
169
+ 16,240,0.8745207700938457,0.8766915900131027,0.8418693097692649,0.861738902033802,0.8416920482093039,0.8615117918296252,0.798764870402664,0.813428704710733
170
+ 16,270,0.8744493809678795,0.8767430143768201,0.8418717100871036,0.8618509041013891,0.8416968275552491,0.8615909643356922,0.798880650737211,0.8134944564466158
171
+ 16,-1,0.8747223881269954,0.8769508336039393,0.8424342889413922,0.8624141863486493,0.8422951873910859,0.8622328523630596,0.7996267057434961,0.8138652882299497
172
+ 17,30,0.8744788407342248,0.8768624791557473,0.8422800661729207,0.8623995946560132,0.8421515125215333,0.8622189734280281,0.7986050698278196,0.8132963273067124
173
+ 17,60,0.8742656590866859,0.8767121339599466,0.8416961695686833,0.8617497381868218,0.8415627880699004,0.8615396466836893,0.7985324981816415,0.8132389939657925
174
+ 17,90,0.8742637638703962,0.8766119677810553,0.8418037738412858,0.8616655992994212,0.8416654549224547,0.8614967259902768,0.7984500227645578,0.8127991028189061
175
+ 17,120,0.8743930618666554,0.8766761004582758,0.8421642974633657,0.8620409089323939,0.8420388842129074,0.861869884549849,0.7991391227425224,0.8133071896672388
176
+ 17,150,0.8745214045282328,0.8769968850543574,0.8424521598581175,0.8625069797872479,0.842364800495326,0.8624106379939338,0.7978713494589222,0.8124759819481814
177
+ 17,180,0.8744147105474939,0.8769179732717574,0.8422036053450977,0.8623290268550134,0.8420914488305176,0.8621914207959154,0.7969872293183138,0.8117837375734129
178
+ 17,210,0.8743882781812855,0.8769124922461613,0.8413260494718606,0.8613549927404545,0.8412118704758897,0.861202459729976,0.7986859525474553,0.81336535715031
179
+ 17,240,0.8745173113205452,0.8769588328361367,0.8414152901492243,0.8614453521237859,0.8413449731098777,0.8613671154051008,0.7985603535190525,0.8135182889292936
180
+ 17,270,0.8746524390084588,0.8770259688771082,0.8414466815754285,0.8614007234508738,0.8413731749294553,0.8613333671234528,0.7990976821967783,0.8139277394534545
181
+ 17,-1,0.8747007208413216,0.8771763613840018,0.8418834698151731,0.8620590656350836,0.8417977664905453,0.8619359284535739,0.7986375726061945,0.8138364454128432
182
+ 18,30,0.8747381328990622,0.8771301401946482,0.8416653201982642,0.8617597992872744,0.8415730934408568,0.8616355043877004,0.7996293943549787,0.8146545637312066
183
+ 18,60,0.8746368870432896,0.8769936751846644,0.8415382784047006,0.8615265828478552,0.8414188146712683,0.861362748232251,0.799777909000059,0.8148051971066369
184
+ 18,90,0.874856046934299,0.8771336366083831,0.8418981351895871,0.8619224271280872,0.8417910915856859,0.8618078036075588,0.7992287447324743,0.8144072013988578
185
+ 18,120,0.8746424448611453,0.8771485462922045,0.8413867457438998,0.8613381288879391,0.8412671928507913,0.861178954113927,0.8005050610815163,0.8155644246444803
186
+ 18,150,0.8747382823940909,0.8772687318250916,0.8418632309652908,0.8618337956223575,0.841750618781849,0.8616715495773435,0.8003005597999109,0.8152047479847033
187
+ 18,180,0.8746307563127922,0.8772087441090584,0.8422950438919035,0.862600521547271,0.8421914795929799,0.8624202300753162,0.798212759474029,0.8135631196647609
188
+ 18,210,0.8746606816176522,0.8771892939894501,0.8423895875870494,0.8626855095807564,0.8422889619177133,0.8625287163464272,0.7977015663388834,0.8130758030194635
189
+ 18,240,0.8745905961849402,0.8770954996290394,0.8419755953447137,0.8622068104791553,0.8418701287642679,0.8620173080173339,0.7982129076589695,0.8134657287454844
190
+ 18,270,0.8745295708875699,0.8769887960719039,0.841589659165631,0.8616663031393237,0.8414785301637461,0.861510568714598,0.7993906775082099,0.8144703895442976
191
+ 18,-1,0.8744203034241856,0.8769681435711502,0.8415434114378075,0.8616353600063429,0.8414353474494594,0.8615041627083037,0.7994469258527959,0.8146446566896025
192
+ 19,30,0.8742641124063617,0.8768471351211692,0.8415414728878907,0.8617432296960235,0.8414376228877135,0.8615838846544892,0.7988930944369638,0.8142675053589978
193
+ 19,60,0.8743234165530136,0.8768612964458228,0.841393819204701,0.8615164279299885,0.8412878606093455,0.8614004028657928,0.7989620285753967,0.8142811692000329
194
+ 19,90,0.8742497962821625,0.876855142370517,0.8415055849140577,0.8617288371403652,0.8413996359082967,0.8615824227770268,0.7988165692546929,0.814235022222535
195
+ 19,120,0.8743906790623467,0.8769818442326011,0.8416675801030937,0.8618439811856554,0.8415618229754075,0.8617188501059843,0.7991910806420205,0.8144768686692467
196
+ 19,150,0.87448127538384,0.8770275200726148,0.8418183155415365,0.8619603599958214,0.8417137879682862,0.8618311719094505,0.7994338852607249,0.8146224224128396
197
+ 19,180,0.8744693293174003,0.8770205435555875,0.8417581946979592,0.861889928261294,0.8416542387431615,0.8617637531031211,0.7996793805396629,0.81479916935148
198
+ 19,210,0.8744666449471437,0.8770233041574498,0.8416831964583525,0.8618409872190761,0.8415815725150233,0.8616891340534879,0.799838671348307,0.814976162797122
199
+ 19,240,0.8744403926544074,0.8770084231302263,0.8417029790197643,0.861904835942882,0.8416008998971243,0.8617683348002817,0.7995806078670444,0.8147839324858114
200
+ 19,270,0.8744257447318369,0.876989756175739,0.8416821305524427,0.8618818783376963,0.8415777052590676,0.8617512617720354,0.7995747104165077,0.8147788859156038
201
+ 19,-1,0.8744230211359798,0.8769986079828186,0.8416898327508714,0.861892074204782,0.8415858290967034,0.8617689783749898,0.79956889682937,0.8147763128107215
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:81418fd194e88b8018d06f61598c71df59e8b8136f024dbaec0fbece012bef84
+ size 435714904
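The LFS pointer stores only the checksum and the file size. As a rough sanity check (approximate, ignoring the small safetensors header), 435,714,904 bytes of float32 weights correspond to about 109M parameters, in line with the "100m" BERT-base-sized encoder named in config.json:

```python
# Rough parameter-count check from the checkpoint size (float32 = 4 bytes per parameter).
size_bytes = 435_714_904
approx_params = size_bytes / 4
print(f"~{approx_params / 1e6:.1f}M parameters")  # ~108.9M
```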
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
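modules.json declares the two-stage composition of this model: a Transformer module (module "0", stored at the repository root) followed by the Pooling module in 1_Pooling. A sketch of assembling the same composition explicitly, using the `{MODEL_NAME}` placeholder from the README for the transformer weights:

```python
# Illustrative sketch: rebuilding the Transformer -> Pooling pipeline declared in modules.json.
from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer("{MODEL_NAME}", max_seq_length=128)  # module "0"
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 768
    pooling_mode_mean_tokens=True,                        # module "1" (1_Pooling)
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```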
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 128,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "cls_token": {
+     "content": "[CLS]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "[MASK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "[PAD]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "[SEP]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,64 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": false,
+   "mask_token": "[MASK]",
+   "max_length": 128,
+   "model_max_length": 1000000000000000019884624838656,
+   "never_split": null,
+   "pad_to_multiple_of": null,
+   "pad_token": "[PAD]",
+   "pad_token_type_id": 0,
+   "padding_side": "right",
+   "sep_token": "[SEP]",
+   "stride": 0,
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "truncation_side": "right",
+   "truncation_strategy": "longest_first",
+   "unk_token": "[UNK]"
+ }
train-config.yaml ADDED
@@ -0,0 +1,21 @@
+ trainer: "sts"
+ model_name: "bertimbau-100m-europarl-eubookshop-ted2020-tatoeba-ct1-nli-gist10-sts-cosent20-v1"
+ base_model_name: "bertimbau-100m-europarl-eubookshop-ted2020-tatoeba-ct1-nli-gist10-v1"
+ loss_function: "cosent"
+ seed: 1 # best seed for this model from first 30 seeds
+ learning_rate: 1e-5
+ warmup_ratio: 0.1
+ weight_decay: 0.01
+ batch_size: 64
+ use_amp: True
+ epochs: 20
+ validations_per_epoch: 10
+
+ # HPs used by JRodrigues to train albertina-100m-portuguese-ptpt-encoder:
+ # learning_rate 1e-5
+ # lr_scheduler_type linear
+ # weight_decay 0.01
+ # per_device_train_batch_size 192
+ # gradient_accumulation_steps 1
+ # num_train_epochs 150
+ # num_warmup_steps 10000
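The `warmup_ratio` of 0.1 in this config is consistent with the warmup steps reported in the README: with 296 optimizer steps per epoch and 20 epochs, 10% of the total step count is 592. A quick check, with values taken from this file and the README:

```python
# Consistency check between train-config.yaml and the README's fit() parameters.
steps_per_epoch = 296   # DataLoader length reported in the README
epochs = 20             # from train-config.yaml
warmup_ratio = 0.1      # from train-config.yaml

warmup_steps = int(steps_per_epoch * epochs * warmup_ratio)
print(warmup_steps)     # 592, matching "warmup_steps": 592 in the README
```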
vocab.txt ADDED
The diff for this file is too large to render. See raw diff