Add new SentenceTransformer model.
- .gitattributes +1 -0
- 1_Pooling/config.json +7 -0
- README.md +87 -0
- config.json +28 -0
- config_sentence_transformers.json +7 -0
- eval/similarity_evaluation_results.csv +121 -0
- merges.txt +0 -0
- modules.json +20 -0
- pytorch_model.bin +3 -0
- sentence_bert_config.json +4 -0
- similarity_evaluation_sts-test_results.csv +2 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- vocab.json +0 -0
.gitattributes
CHANGED
@@ -25,3 +25,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zstandard filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
1_Pooling/config.json
ADDED
@@ -0,0 +1,7 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
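The pooling config above enables mean pooling only (the CLS-token, max, and sqrt-length variants are all disabled), so a sentence embedding is the average of its token embeddings over non-padding positions. A minimal sketch of masked mean pooling in plain Python, using hypothetical toy vectors:

```python
def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, counting only non-padding positions."""
    dim = len(token_embeddings[0])
    pooled = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:  # skip padding tokens
            pooled = [p + v for p, v in zip(pooled, vec)]
            count += 1
    return [p / count for p in pooled]

# Toy example: two real tokens plus one padding token.
tokens = [[1.0, 3.0], [3.0, 5.0], [9.0, 9.0]]
mask = [1, 1, 0]
print(mean_pool(tokens, mask))  # [2.0, 4.0]
```

In the real model the token embeddings come from the Transformer module and have 768 dimensions, matching `word_embedding_dimension` above.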
README.md
ADDED
@@ -0,0 +1,87 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
---

# {MODEL_NAME}

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

<!--- Describe your model here -->

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```

## Evaluation Results

<!--- Describe how your model was evaluated -->

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})

## Training

The model was trained with the parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 11 with parameters:
```
{'batch_size': 15, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`

Parameters of the fit()-Method:
```
{
    "epochs": 10,
    "evaluation_steps": 1,
    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 11,
    "weight_decay": 0.01
}
```

## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
  (2): Normalize()
)
```

## Citing & Authors

<!--- Describe where people can find more information -->
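The README's architecture section ends with a `Normalize()` module, so the vectors this model emits are unit-length; cosine similarity between two such embeddings then reduces to a plain dot product. A small sketch of that property, using hypothetical 2-D vectors rather than real model output:

```python
import math

def normalize(vec):
    """Scale a vector to unit (L2) length."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

a = normalize([3.0, 4.0])
b = normalize([4.0, 3.0])

# For unit vectors, cosine similarity is just the dot product.
cosine = sum(x * y for x, y in zip(a, b))
print(round(cosine, 4))  # 0.96
```

This is why downstream code can score these embeddings with a fast dot product instead of a full cosine computation.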
config.json
ADDED
@@ -0,0 +1,28 @@
{
  "_name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_all-distilroberta-v1/",
  "architectures": [
    "RobertaModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 6,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.16.2",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 50265
}
config_sentence_transformers.json
ADDED
@@ -0,0 +1,7 @@
{
  "__version__": {
    "sentence_transformers": "2.0.0",
    "transformers": "4.6.1",
    "pytorch": "1.8.1"
  }
}
eval/similarity_evaluation_results.csv
ADDED
@@ -0,0 +1,121 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,1,0.45892171995983294,0.5136906760016178,0.4480500006536557,0.5136906760016178,0.4538208185638512,0.49970404868474205,0.45892148035278574,0.5136906760016178
0,2,0.46287208359683946,0.5136906760016178,0.45183414846616077,0.5136906760016178,0.458768442765568,0.49970404868474205,0.46287185157409466,0.5136906760016178
0,3,0.47098134462383795,0.5136906760016178,0.46088664261162293,0.5136906760016178,0.4680490539168848,0.49970404868474205,0.4709814862527341,0.5136906760016178
0,4,0.48241002489372564,0.5454784653581536,0.47347795881740895,0.5454784653581536,0.4810191575970378,0.5416639306353692,0.4824100449644243,0.5454784653581536
0,5,0.4834325854718865,0.517505210724402,0.4747435503373138,0.517505210724402,0.4813599729262406,0.5124191644273564,0.48343262214804655,0.517505210724402
0,6,0.4768037542818537,0.5124191644273564,0.46881201138725875,0.5124191644273564,0.47473622799385606,0.5289488148927549,0.47680432151062485,0.5124191644273564
0,7,0.45588708064453887,0.4857174213678663,0.4467765935431983,0.4857174213678663,0.4517271151492546,0.4691877709024677,0.45588705978216665,0.4857174213678663
0,8,0.4725019492735504,0.5136906760016178,0.4637000254145635,0.5136906760016178,0.46832849918258973,0.4971610255362192,0.47250205649248045,0.5136906760016178
0,9,0.46643885824183434,0.5302203264670163,0.45557390091280525,0.5302203264670163,0.458858203250543,0.455201143585592,0.46643872899764655,0.5302203264670163
0,10,0.4291228113591291,0.3916255648725205,0.41331566796529506,0.3916255648725205,0.41199890417222346,0.32804998615944897,0.4291232955539209,0.3916255648725205
0,11,0.3662004421234983,0.27718952318899176,0.3432002653321558,0.27718952318899176,0.33431263117859167,0.25557382642654747,0.3662005297116826,0.27718952318899176
0,-1,0.3662004421234983,0.27718952318899176,0.3432002653321558,0.27718952318899176,0.33431263117859167,0.25557382642654747,0.3662005297116826,0.27718952318899176
1,1,0.28472780989299423,0.13223720372318873,0.25343121566505517,0.13223720372318873,0.23569863535433935,0.12206511112909728,0.2847273836940616,0.13223720372318873
1,2,0.21087796384878202,0.09409185649534582,0.17394251409538972,0.09409185649534582,0.15236878394986605,-0.00762906944556858,0.21087791945755516,0.09409185649534582
1,3,0.16065265647451682,-0.054674997693241495,0.12136160359636007,-0.054674997693241495,0.10290354315545613,-0.08646278704977725,0.16065299355331386,-0.054674997693241495
1,4,0.13273721152937742,-0.09027732177256154,0.09466570786563243,-0.09027732177256154,0.07891201310055543,-0.11825057640631301,0.13273722245944666,-0.09027732177256154
1,5,0.13026127879099608,-0.08137674075273153,0.09368493275508132,-0.08137674075273153,0.07986670102446536,-0.13732325002023446,0.13026118155317767,-0.08137674075273153
1,6,0.15067240051499337,-0.05594650926750292,0.11514112325485511,-0.05594650926750292,0.10170553701877333,-0.0762906944556858,0.15067226084324975,-0.05594650926750292
1,7,0.19441904562181084,0.11062150696074444,0.16132005045289483,0.11062150696074444,0.15071366610072592,0.01652965046539859,0.194418873805368,0.11062150696074444
1,8,0.26197426180499156,0.14876685418858734,0.23338507758332513,0.14876685418858734,0.22827468867369757,0.15766743520841733,0.2619741898401606,0.14876685418858734
1,9,0.34462868508131894,0.24158719910967172,0.3223939747150595,0.24158719910967172,0.3260836912383004,0.19962731715904453,0.34462901903481113,0.24158719910967172
1,10,0.41087528086743824,0.40052614589235047,0.39672155755319316,0.40052614589235047,0.4105948043821146,0.3712813796843376,0.410875332029457,0.40052614589235047
1,11,0.4357018309283556,0.4539296320113305,0.42913737194692214,0.4539296320113305,0.44982638348170223,0.44757207414002337,0.4357014922738194,0.4539296320113305
1,-1,0.4357018309283556,0.4539296320113305,0.42913737194692214,0.4539296320113305,0.44982638348170223,0.44757207414002337,0.4357014922738194,0.4539296320113305
2,1,0.442509574705825,0.5607366042492907,0.44174775085709295,0.5607366042492907,0.4657660723609641,0.5607366042492907,0.44250922711384927,0.5607366042492907
2,2,0.43839657123310816,0.5531075348037221,0.44228747858280915,0.5531075348037221,0.4671563099769105,0.5416639306353692,0.4383966901392806,0.5531075348037221
2,3,0.4343157276689449,0.5531075348037221,0.44146700362240526,0.5531075348037221,0.4670896239386643,0.537849395912585,0.4343158497951398,0.5531075348037221
2,4,0.4275600113732149,0.5314918380412778,0.4348884113861302,0.5314918380412778,0.46040549570643335,0.5531075348037221,0.42756008022939607,0.5314918380412778
2,5,0.41861438015662056,0.5391209074868464,0.42398652113674856,0.5391209074868464,0.44773321084457346,0.5531075348037221,0.4186143712380187,0.5391209074868464
2,6,0.41209126950350256,0.5391209074868464,0.415144191552038,0.5391209074868464,0.4354924063092754,0.5136906760016178,0.41209130532651184,0.5391209074868464
2,7,0.4036069763782575,0.48317439821934344,0.4041004196372062,0.48317439821934344,0.42199532913157345,0.46791625932820624,0.40360722077597716,0.48317439821934344
2,8,0.4001217611399098,0.48317439821934344,0.4002026502115193,0.48317439821934344,0.4160439377923956,0.47173079405099055,0.40012169315201157,0.48317439821934344
2,9,0.3975166419616139,0.48317439821934344,0.3983087482563628,0.48317439821934344,0.41221332651358533,0.47173079405099055,0.39751663687742533,0.48317439821934344
2,10,0.39469748226541046,0.47554532877377487,0.39553510869707037,0.47554532877377487,0.4072188733565807,0.47173079405099055,0.39469728244550023,0.47554532877377487
2,11,0.3952693128757673,0.42722788895184055,0.39380041071962957,0.42722788895184055,0.40341257489368554,0.4437575394172391,0.39526978067302954,0.42722788895184055
2,-1,0.3952693128757673,0.42722788895184055,0.39380041071962957,0.42722788895184055,0.40341257489368554,0.4437575394172391,0.39526978067302954,0.42722788895184055
3,1,0.39607741839593563,0.42722788895184055,0.3933966184946814,0.42722788895184055,0.4022242654294058,0.4157842847834876,0.39607744512706083,0.42722788895184055
3,2,0.39338596085050903,0.4310424236746248,0.3899027854314537,0.4310424236746248,0.39763070573624215,0.39798312274382763,0.39338607176063656,0.4310424236746248
3,3,0.38383766774106526,0.43867149312019343,0.38011373185670355,0.43867149312019343,0.38707499576300003,0.4513866088628077,0.38383756153798926,0.43867149312019343
3,4,0.37505676220935785,0.43867149312019343,0.37193467314765255,0.43867149312019343,0.37886479219155034,0.47300230562525203,0.3750571233227296,0.43867149312019343
3,5,0.35917099358381127,0.4666447477539449,0.3577710216920566,0.4666447477539449,0.36397701391034865,0.4513866088628077,0.3591703457520157,0.4666447477539449
3,6,0.3413057882778169,0.4857174213678663,0.34291424157455797,0.4857174213678663,0.34910692429291773,0.44884358571428484,0.3413064811078754,0.4857174213678663
3,7,0.3191503510562873,0.44884358571428484,0.3256301064093268,0.44884358571428484,0.33237217968265476,0.44884358571428484,0.3191501337505434,0.44884358571428484
3,8,0.29649476867610447,0.4857174213678663,0.30712026689533745,0.4857174213678663,0.3152535383488951,0.4971610255362192,0.2964946288754811,0.4857174213678663
3,9,0.2769927100143652,0.4170557963577491,0.2907655925117645,0.4170557963577491,0.3005988055486653,0.4170557963577491,0.2769928510066589,0.4170557963577491
3,10,0.25849758257029115,0.4259563773775791,0.2746865457954047,0.4259563773775791,0.28435017904518106,0.3967116111695662,0.2584975418063925,0.4259563773775791
3,11,0.24885688857250848,0.42977091210036333,0.26395026486733886,0.42977091210036333,0.27453316537691785,0.3967116111695662,0.2488570919787764,0.42977091210036333
3,-1,0.24885688857250848,0.42977091210036333,0.26395026486733886,0.42977091210036333,0.27453316537691785,0.3967116111695662,0.2488570919787764,0.42977091210036333
4,1,0.2491652765725898,0.4043406806151348,0.2633310463412356,0.4043406806151348,0.27435720384943674,0.4170557963577491,0.24916487523544195,0.4043406806151348
4,2,0.2563072009570627,0.4501150972885462,0.2685525603295987,0.4501150972885462,0.2800131563133945,0.4195988195062719,0.25630707815005493,0.4501150972885462
4,3,0.26268259780064496,0.44757207414002337,0.2735680358008246,0.44757207414002337,0.28532728425837445,0.4501150972885462,0.2626826465048624,0.44757207414002337
4,4,0.26887205367397343,0.44757207414002337,0.2794445303161154,0.44757207414002337,0.2922961905091087,0.44630056256576195,0.2688718938741411,0.44757207414002337
4,5,0.2647479382743477,0.4437575394172391,0.2754968863885376,0.4437575394172391,0.28997910248180886,0.4437575394172391,0.264747856685325,0.4437575394172391
4,6,0.25261323339957176,0.3827249838526905,0.26053448805515733,0.3827249838526905,0.2758295856900683,0.3827249838526905,0.25261344880893877,0.3827249838526905
4,7,0.23708784113823778,0.3356790556050176,0.24137601540466921,0.3356790556050176,0.2559322111693486,0.36110928709024614,0.23708816729907853,0.3356790556050176
4,8,0.2228277711008562,0.2657459190206389,0.2232224003206234,0.2657459190206389,0.23549330299526405,0.2657459190206389,0.22282828631519738,0.2657459190206389
4,9,0.21523535653609982,0.2593883611493318,0.21317214761779626,0.2593883611493318,0.22361047512742885,0.2593883611493318,0.21523507980589415,0.2593883611493318
4,10,0.21251328709570902,0.2593883611493318,0.2094338145870909,0.2593883611493318,0.21830936481014318,0.24158719910967172,0.21251314148630607,0.2593883611493318
4,11,0.21252544272668888,0.23777266438688743,0.20882087345071765,0.23777266438688743,0.2167802990130245,0.2670174305949003,0.21252543605710394,0.23777266438688743
4,-1,0.21252544272668888,0.23777266438688743,0.20882087345071765,0.23777266438688743,0.2167802990130245,0.2670174305949003,0.21252543605710394,0.23777266438688743
5,1,0.21506721665015624,0.21234243290165886,0.2130563019370643,0.21234243290165886,0.2220391919966218,0.2670174305949003,0.21506676086818005,0.21234243290165886
5,2,0.21651601832140704,0.23777266438688743,0.21664185944397574,0.23777266438688743,0.22660457357735955,0.29244766208012896,0.2165162168498603,0.23777266438688743
5,3,0.21659542336113238,0.26320289587211604,0.2184898626247062,0.26320289587211604,0.22975731902223645,0.29244766208012896,0.21659551525279735,0.26320289587211604
5,4,0.21976727761502385,0.28863312735734464,0.22418849181846345,0.28863312735734464,0.23727168003247562,0.3038912662484818,0.21976725629447294,0.28863312735734464
5,5,0.22627221416658588,0.3216924282881418,0.23433140995993132,0.3216924282881418,0.24957978646567439,0.33695056717927896,0.22627199471237605,0.3216924282881418
5,6,0.23377224712947542,0.354751729218939,0.24660113976975812,0.354751729218939,0.2642642717933218,0.36110928709024614,0.23377225990548997,0.354751729218939
5,7,0.23854903518977444,0.38526800700121333,0.2552391122869852,0.38526800700121333,0.27432557506391775,0.41069823848644194,0.23854896277954177,0.38526800700121333
5,8,0.2421549811156325,0.4068837037636577,0.261214429854061,0.4068837037636577,0.2812105310041568,0.43358544682314765,0.24215519028841695,0.4068837037636577
5,9,0.24194450032546125,0.3801819607041676,0.2613069053297078,0.3801819607041676,0.28150754447471565,0.4590156783083763,0.24194500877809821,0.3801819607041676
5,10,0.2398740091566418,0.3801819607041676,0.26006762873888656,0.3801819607041676,0.28058205879901743,0.4590156783083763,0.23987413631408314,0.3801819607041676
5,11,0.2394306463095407,0.42977091210036333,0.2618618196156897,0.42977091210036333,0.28119384022633026,0.48698893294212775,0.23943071197848992,0.42977091210036333
5,-1,0.2394306463095407,0.42977091210036333,0.2618618196156897,0.42977091210036333,0.28119384022633026,0.48698893294212775,0.23943071197848992,0.42977091210036333
6,1,0.23765205580009463,0.4501150972885462,0.2621739024148695,0.4501150972885462,0.27964671492166615,0.546749976932415,0.23765178418514457,0.4501150972885462
6,2,0.2321356664006965,0.44757207414002337,0.25744650062941465,0.44757207414002337,0.27289643991747065,0.5187767222986636,0.23213565461929697,0.44757207414002337
6,3,0.22780191079300677,0.4946180023876964,0.25383553767631456,0.4946180023876964,0.2675531954311998,0.5340348611898006,0.227801652487387,0.4946180023876964
6,4,0.22192230034685675,0.4742738171995134,0.2481524066728236,0.4742738171995134,0.2603026580768394,0.537849395912585,0.22192229569888422,0.4742738171995134
6,5,0.21606514931270002,0.4946180023876964,0.24105949643476832,0.4946180023876964,0.25153452214865313,0.4513866088628077,0.21606527529351435,0.4946180023876964
6,6,0.21183534892044303,0.44630056256576195,0.2372639596563307,0.44630056256576195,0.24730128514087774,0.44757207414002337,0.2118354893875451,0.44630056256576195
6,7,0.2054077252073939,0.39798312274382763,0.22961663230829435,0.39798312274382763,0.23859238774789823,0.4056121921893962,0.20540805164640266,0.39798312274382763
6,8,0.20168126437617792,0.4259563773775791,0.22612035192848343,0.4259563773775791,0.23447503747925624,0.4094267269121805,0.20168157804177897,0.4259563773775791
6,9,0.19715580987842551,0.4043406806151348,0.22084881666091966,0.4043406806151348,0.2292453883411995,0.4056121921893962,0.1971559193289432,0.4043406806151348
6,10,0.1943483400477529,0.3827249838526905,0.21771588328485103,0.3827249838526905,0.2264802102108752,0.3839964954269519,0.19434832603449606,0.3827249838526905
6,11,0.19168563032703193,0.36110928709024614,0.2147598185683928,0.36110928709024614,0.2237313041606571,0.3827249838526905,0.19168558717064496,0.36110928709024614
6,-1,0.19168563032703193,0.36110928709024614,0.2147598185683928,0.36110928709024614,0.2237313041606571,0.3827249838526905,0.19168558717064496,0.36110928709024614
7,1,0.18605431798081978,0.36110928709024614,0.20801603833616894,0.36110928709024614,0.2173548273235146,0.36110928709024614,0.1860545203691322,0.36110928709024614
7,2,0.1817798821013212,0.32804998615944897,0.20334128390398357,0.32804998615944897,0.2136227470299721,0.32804998615944897,0.18178015466360056,0.32804998615944897
7,3,0.17632311728023778,0.3356790556050176,0.1966136252280411,0.3356790556050176,0.20726030195207717,0.3077058009712661,0.17632315155992123,0.3356790556050176
7,4,0.1722339304375806,0.3331360324564947,0.1915387463055754,0.3331360324564947,0.20289546636630312,0.3077058009712661,0.17223397421891856,0.3331360324564947
7,5,0.16420422378181704,0.31660638199109614,0.18157573821586626,0.31660638199109614,0.19273263561687212,0.3051627778227432,0.16420421238974703,0.31660638199109614
7,6,0.1567153598836714,0.2949906852286518,0.17239621586486525,0.2949906852286518,0.18323081298653965,0.2949906852286518,0.1567156330565568,0.2949906852286518
7,7,0.1515689569930386,0.2949906852286518,0.16691743598190775,0.2949906852286518,0.17714608367172838,0.2949906852286518,0.15156916063100032,0.2949906852286518
7,8,0.14897221655685186,0.2949906852286518,0.16491740894248272,0.2949906852286518,0.1746252862992346,0.2949906852286518,0.1489724120303926,0.2949906852286518
7,9,0.14985273253913725,0.2949906852286518,0.16703244573702594,0.2949906852286518,0.17665495784400906,0.2949906852286518,0.14985274467192397,0.2949906852286518
7,10,0.150261568229029,0.2988052199514361,0.16831869142365444,0.2988052199514361,0.1770934399548046,0.2949906852286518,0.15026188784142624,0.2988052199514361
7,11,0.15636378848999444,0.2988052199514361,0.17690449449995488,0.2988052199514361,0.18636630565509274,0.2949906852286518,0.15636385091734717,0.2988052199514361
7,-1,0.15636378848999444,0.2988052199514361,0.17690449449995488,0.2988052199514361,0.18636630565509274,0.2949906852286518,0.15636385091734717,0.2988052199514361
8,1,0.1608959067113598,0.3331360324564947,0.18314342054810695,0.3331360324564947,0.1931027953152842,0.2988052199514361,0.16089595141064375,0.3331360324564947
8,2,0.1663562160785864,0.354751729218939,0.19072533021938395,0.354751729218939,0.20120879075237957,0.35729475236746183,0.16635601733740418,0.354751729218939
8,3,0.17058220238387478,0.35729475236746183,0.1967895709361493,0.35729475236746183,0.20802680218910188,0.35729475236746183,0.17058222972520454,0.35729475236746183
8,4,0.17584466614255745,0.3496656829218933,0.2042014572462542,0.3496656829218933,0.2162860976814363,0.3496656829218933,0.17584503396913603,0.3496656829218933
8,5,0.1797619589553295,0.3496656829218933,0.2096960949656008,0.3496656829218933,0.222433657583595,0.3496656829218933,0.17976215864369655,0.3496656829218933
8,6,0.18220821620282973,0.3827249838526905,0.2135743615324668,0.3827249838526905,0.22661021134228984,0.354751729218939,0.18220817380140114,0.3827249838526905
8,7,0.1858539152700423,0.3941685880210433,0.21907843828937434,0.3941685880210433,0.23279229628752804,0.3827249838526905,0.1858539584533409,0.3941685880210433
8,8,0.18750066407521776,0.3827249838526905,0.2220390567178416,0.3827249838526905,0.23580660043761015,0.41832730793201056,0.18750071026936949,0.3827249838526905
8,9,0.18828138887030288,0.3967116111695662,0.2238463548259161,0.3967116111695662,0.23769557519978787,0.4323139352488863,0.18828127456635219,0.3967116111695662
8,10,0.18800742478431529,0.45265812043706916,0.22411147612290386,0.45265812043706916,0.2379394174642716,0.4323139352488863,0.18800727575218376,0.45265812043706916
8,11,0.18779611857440498,0.45265812043706916,0.2243872246267858,0.45265812043706916,0.23820215560793473,0.4323139352488863,0.18779623389802783,0.45265812043706916
8,-1,0.18779611857440498,0.45265812043706916,0.2243872246267858,0.45265812043706916,0.23820215560793473,0.4323139352488863,0.18779623389802783,0.45265812043706916
9,1,0.1869702308539439,0.45265812043706916,0.2240763408854383,0.45265812043706916,0.23776301182757179,0.4323139352488863,0.1869701255288319,0.45265812043706916
9,2,0.18605015726694,0.45265812043706916,0.22352688349738198,0.45265812043706916,0.23702946927613858,0.4323139352488863,0.18604996958349532,0.45265812043706916
9,3,0.1844588180427019,0.45265812043706916,0.22188067067945125,0.45265812043706916,0.2350935778176437,0.4323139352488863,0.1844589063830042,0.45265812043706916
9,4,0.1838713789847406,0.45265812043706916,0.22154618578711316,0.45265812043706916,0.23455193938650742,0.4323139352488863,0.18387133930803284,0.45265812043706916
9,5,0.1836221807061482,0.45265812043706916,0.2215956316659524,0.45265812043706916,0.23448708861931086,0.4323139352488863,0.18362239915500916,0.45265812043706916
9,6,0.18314228269863816,0.45265812043706916,0.22122014420357963,0.45265812043706916,0.23401176098655202,0.4323139352488863,0.18314209058193176,0.45265812043706916
9,7,0.18273884777098312,0.43867149312019343,0.22082505309931072,0.43867149312019343,0.23350379267206847,0.4323139352488863,0.18273899225667145,0.43867149312019343
9,8,0.18250927313624912,0.43867149312019343,0.2206110473962686,0.43867149312019343,0.2332116392865751,0.4323139352488863,0.18250905260824812,0.43867149312019343
9,9,0.18223923503954786,0.43867149312019343,0.22029013622154522,0.43867149312019343,0.23285114138311291,0.4323139352488863,0.18223902959145707,0.43867149312019343
9,10,0.18214074611491501,0.43867149312019343,0.22020677530514665,0.43867149312019343,0.232731443115767,0.4323139352488863,0.18214070620419556,0.43867149312019343
9,11,0.1820209316218946,0.43867149312019343,0.22012759492361453,0.43867149312019343,0.23259596224937062,0.4323139352488863,0.18202066895703434,0.43867149312019343
9,-1,0.1820209316218946,0.43867149312019343,0.22012759492361453,0.43867149312019343,0.23259596224937062,0.4323139352488863,0.18202066895703434,0.43867149312019343
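The evaluation CSV above logs similarity correlations at every step (steps=-1 marks the end of an epoch), so the best checkpoint can be found by scanning for the maximum cosine Spearman. A sketch using the stdlib `csv` module on a small inlined excerpt with rounded, illustrative values (the real code would read the full file from disk):

```python
import csv
import io

# Hypothetical excerpt of eval/similarity_evaluation_results.csv (values rounded).
data = """epoch,steps,cosine_spearman
0,4,0.5455
2,1,0.5607
9,-1,0.4387
"""

rows = list(csv.DictReader(io.StringIO(data)))
best = max(rows, key=lambda r: float(r["cosine_spearman"]))
print(best["epoch"], best["steps"])  # 2 1
```

On the full log this kind of scan shows the dev-set cosine Spearman peaking early (around epoch 2) and then declining, a typical overfitting signature on a small dataset.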
merges.txt
ADDED
The diff for this file is too large to render; see the raw diff.
modules.json
ADDED
@@ -0,0 +1,20 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
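modules.json is how sentence-transformers knows which modules to chain, in ascending `idx` order, and which subfolder holds each module's weights and config. A sketch that parses the list above and reports the resulting pipeline:

```python
import json

# The module list copied from modules.json above.
modules = json.loads("""[
  {"idx": 0, "name": "0", "path": "", "type": "sentence_transformers.models.Transformer"},
  {"idx": 1, "name": "1", "path": "1_Pooling", "type": "sentence_transformers.models.Pooling"},
  {"idx": 2, "name": "2", "path": "2_Normalize", "type": "sentence_transformers.models.Normalize"}
]""")

# Modules are applied in ascending idx order; take the class name from each dotted path.
pipeline = [m["type"].rsplit(".", 1)[-1] for m in sorted(modules, key=lambda m: m["idx"])]
print(" -> ".join(pipeline))  # Transformer -> Pooling -> Normalize
```

An empty `path` means the module's files (here the Transformer's config and weights) live in the repository root rather than a subfolder.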
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:32a7987d0712a64a00da1b7c17cbc9afb430a8d4caf4c7dcb9caf4b380a55a20
size 328517361
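What the diff shows for pytorch_model.bin is not the weights themselves but a Git LFS pointer: three key/value lines referencing the real ~328 MB blob by its SHA-256 digest. A sketch that parses the pointer text above:

```python
# The pointer content of pytorch_model.bin as committed to the repo.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:32a7987d0712a64a00da1b7c17cbc9afb430a8d4caf4c7dcb9caf4b380a55a20
size 328517361
"""

# Each line is "key value"; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)
print(algo, int(fields["size"]))  # sha256 328517361
```

This is why the .gitattributes change earlier in this commit adds `pytorch_model.bin filter=lfs`: without it, the full binary would have been committed directly.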
sentence_bert_config.json
ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
similarity_evaluation_sts-test_results.csv
ADDED
@@ -0,0 +1,2 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
-1,-1,0.7610089503330165,0.3554617123898049,0.742876353261571,0.3554617123898049,0.7367532850334602,0.36433444234156653,0.7610089214529765,0.3553931114057364
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>", "mask_token": {"content": "<mask>", "single_word": false, "lstrip": true, "rstrip": false, "normalized": false}}
tokenizer.json
ADDED
The diff for this file is too large to render; see the raw diff.
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
{"unk_token": "<unk>", "bos_token": "<s>", "eos_token": "</s>", "add_prefix_space": false, "errors": "replace", "sep_token": "</s>", "cls_token": "<s>", "pad_token": "<pad>", "mask_token": "<mask>", "trim_offsets": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_all-distilroberta-v1/", "tokenizer_class": "RobertaTokenizer"}
vocab.json
ADDED
The diff for this file is too large to render; see the raw diff.