michal-stefanik committed
Commit c32a07c
1 parent: f71b96b

Upload with huggingface_hub
1_Pooling/config.json ADDED
@@ -0,0 +1,7 @@
{
  "word_embedding_dimension": 384,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
README.md ADDED
@@ -0,0 +1,91 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity

---

# {MODEL_NAME}

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.

<!--- Describe your model here -->

## Usage (Sentence-Transformers)

Using this model is easy once you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
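To compare the resulting embeddings, cosine similarity is the natural metric for this model. With sentence-transformers installed you would use `sentence_transformers.util.cos_sim`; as a minimal illustration of what that computes, here is a pure-Python sketch, assuming the embeddings are plain lists of floats (the toy vectors below stand in for the real 384-dimensional outputs):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the two vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for two sentence embeddings.
emb1 = [0.1, 0.3, -0.2]
emb2 = [0.1, 0.25, -0.1]
print(cosine_similarity(emb1, emb2))
```

Because this model L2-normalizes its outputs (see the `Normalize()` module in the architecture below), a plain dot product over the returned embeddings gives the same ranking.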

## Evaluation Results

<!--- Describe how your model was evaluated -->

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})

## Training
The model was trained with the following parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 1988 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```

Parameters of the `fit()` method:
```
{
    "epochs": 10,
    "evaluation_steps": 500,
    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 1000,
    "weight_decay": 0.01
}
```
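With `scale=20.0` and cosine similarity, MultipleNegativesRankingLoss treats every other positive in the batch as a negative: it scores each anchor against all positives, scales the cosine similarities, and applies cross-entropy with the matching pair on the diagonal as the target. A minimal pure-Python sketch of that computation (toy 2-d vectors for illustration, not the library implementation):

```python
import math

def cos_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def multiple_negatives_ranking_loss(anchors, positives, scale=20.0):
    # scores[j]: scaled cosine similarity between anchor i and positive j.
    # The correct pairing sits at index i; all other entries act as
    # in-batch negatives. The loss is mean cross-entropy over anchors.
    losses = []
    for i, a in enumerate(anchors):
        scores = [scale * cos_sim(a, p) for p in positives]
        log_sum_exp = math.log(sum(math.exp(s) for s in scores))
        losses.append(log_sum_exp - scores[i])  # -log softmax of the true pair
    return sum(losses) / len(losses)

# Two toy anchor/positive pairs: well-aligned pairs give a near-zero loss.
anchors = [[1.0, 0.0], [0.0, 1.0]]
positives = [[0.9, 0.1], [0.1, 0.9]]
print(multiple_negatives_ranking_loss(anchors, positives))
```

The large scale factor sharpens the softmax, so even small cosine gaps between the true pair and the in-batch negatives translate into a strong training signal.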

## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
  (2): Normalize()
)
```
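The last two stages of this pipeline are simple: mean-pool the token embeddings (using the attention mask so padding is ignored) and L2-normalize the result. A minimal pure-Python sketch of those stages, assuming the token vectors and mask come from the Transformer stage (toy 2-d vectors here):

```python
import math

def mean_pooling(token_embeddings, attention_mask):
    # Average token vectors, counting only positions where the mask is 1.
    dim = len(token_embeddings[0])
    sums = [0.0] * dim
    count = 0
    for vec, m in zip(token_embeddings, attention_mask):
        if m:
            count += 1
            for d in range(dim):
                sums[d] += vec[d]
    return [s / count for s in sums]

def l2_normalize(vec):
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# Three toy "token embeddings"; the last position is padding (mask = 0).
tokens = [[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]
mask = [1, 1, 0]
sentence_embedding = l2_normalize(mean_pooling(tokens, mask))
print(sentence_embedding)
```

The final `Normalize()` step is why the model's embeddings have unit length, making dot product and cosine similarity interchangeable downstream.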

## Citing & Authors

<!--- Describe where people can find more information -->
config.json ADDED
@@ -0,0 +1,26 @@
{
  "_name_or_path": "/home/xstefan3/.cache/torch/sentence_transformers/sentence-transformers_all-MiniLM-L12-v2/",
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 384,
  "initializer_range": 0.02,
  "intermediate_size": 1536,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.19.1",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
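These hyperparameters determine the model size. As a back-of-the-envelope check using standard BERT parameter formulas (a sketch: it omits the BertModel pooler layer and serialization overhead, which account for the small gap to the 133,511,213-byte float32 checkpoint in this commit):

```python
hidden = 384
layers = 12
intermediate = 1536
vocab = 30522
max_pos = 512
type_vocab = 2

# Embedding tables (word, position, token-type) plus their LayerNorm.
embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden

# Per encoder layer: Q/K/V/output projections, feed-forward, two LayerNorms.
attention = 4 * (hidden * hidden + hidden)
ffn = hidden * intermediate + intermediate + intermediate * hidden + hidden
layer_norms = 2 * 2 * hidden
per_layer = attention + ffn + layer_norms

total = embeddings + layers * per_layer
print(total, total * 4)  # parameters, float32 bytes -> ~33.2M params, ~133 MB
```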
config_sentence_transformers.json ADDED
@@ -0,0 +1,7 @@
{
  "__version__": {
    "sentence_transformers": "2.0.0",
    "transformers": "4.6.1",
    "pytorch": "1.8.1"
  }
}
eval/similarity_evaluation_results.csv ADDED
@@ -0,0 +1,41 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,500,0.7977655805439354,0.7858280872890149,0.7908820889886768,0.7858280872890149,0.7884345551907009,0.7835365837790076,0.7977655757275364,0.7858280869943294
0,1000,0.828514513703648,0.8098810799144674,0.8159875528116669,0.8098810796107621,0.8096063851805175,0.8058557933254595,0.8285145158809512,0.8098810799144674
0,1500,0.8557401070022763,0.8285266095412878,0.8374777832583536,0.8285266098519852,0.827787039419996,0.8240683098845891,0.8557400938158504,0.8285266092305901
0,-1,0.839190389506968,0.8172561528851028,0.8233192917524121,0.8172561528851028,0.813855593186147,0.8101720644871668,0.8391903897504693,0.8172561528851028
1,500,0.8541481701926952,0.8306656919386365,0.8324138999907384,0.8306656922501361,0.8205529708555696,0.8232178725211481,0.8541481750178238,0.8306656922501361
1,1000,0.8643054917895009,0.8359120745003975,0.8411947658364021,0.8359120745003975,0.8315270823290593,0.8299572840021635,0.8643054931462113,0.8359120748138644
1,1500,0.8430328895056891,0.8228506777033557,0.8214979260665463,0.8228506783204939,0.8127029849769519,0.8158168187873288,0.8430328861876593,0.8228506777033557
1,-1,0.8539850493963506,0.8304959512493948,0.8321772992122612,0.8304959512493948,0.8244578368105775,0.8248563931023166,0.8539850452854774,0.8304959509379588
2,500,0.8431659807678916,0.8224661623752901,0.8205140944175497,0.8224652972750499,0.8142470927269688,0.8165927776478604,0.8431659788624808,0.8224661626837148
2,1000,0.8552248238640915,0.8315542344274859,0.8302062829159846,0.8315542344274859,0.8244222654383239,0.8268049505087857,0.8552248265206253,0.8315542344274859
2,1500,0.8487553181567077,0.8253517593868099,0.8248118747719464,0.8253517596963168,0.8184275413151078,0.8198507656314766,0.848755314252476,0.8253517600058239
2,-1,0.8496328896168104,0.825931996480963,0.8253053976541286,0.8259319967906876,0.8186208129499668,0.8201902479405305,0.8496328839035954,0.8259319967906876
3,500,0.8233837513046839,0.8035937346829277,0.8055481989452375,0.8035937346829277,0.8003618894223714,0.7980580995975278,0.8233837514778484,0.8035937346829277
3,1000,0.8371421028878636,0.813490674561821,0.8142967504033701,0.813490674561821,0.807847091967419,0.8071513674943831,0.837142107963226,0.8134906739517029
3,1500,0.8478602662406107,0.8248200203400369,0.8236433614557872,0.8248200209586521,0.8172566209949228,0.8181360351137839,0.8478602647029366,0.8248200200307293
3,-1,0.8470273875183728,0.8210112395123388,0.8218285056266799,0.821011239820218,0.8159193065796385,0.8146320961896829,0.8470273897126187,0.821011239820218
4,500,0.8491773403117635,0.8216243858841203,0.8246730283404582,0.8216243861922297,0.819931719341428,0.81688549457792,0.8491773402626674,0.8216243855760111
4,1000,0.829967605974198,0.80415665096546,0.8091941102324128,0.8041566515685776,0.8043371467121364,0.7981706829143461,0.8299676115468416,0.80415665096546
4,1500,0.8556241831884515,0.8285612502511496,0.828991217473242,0.8285612505618601,0.8235734720074352,0.8234638240758269,0.8556241828976419,0.8285612502511496
4,-1,0.8496278815585492,0.8242034098647711,0.8251588940684741,0.8242034095556948,0.8197120629931225,0.8175575300704411,0.8496278804875668,0.8242034095556948
5,500,0.8553987843112298,0.8270439732405044,0.8286836207061664,0.8270439738607874,0.8230834837476747,0.8199425640284805,0.8553987791619193,0.8270439732405044
5,1000,0.8509098120347989,0.8223258662420768,0.8252058556877848,0.8223258662420768,0.8196108754341372,0.8149750416821185,0.8509098101708481,0.8223258662420768
5,1500,0.8503399959121478,0.8192306913634624,0.8255389635431001,0.8192306916706741,0.8201343463324118,0.8129935756117338,0.8503399984789911,0.8192306910562509
5,-1,0.8439237328214304,0.8145238430003939,0.8200928615968343,0.8145238430003939,0.8149674478306836,0.8087535146952574,0.8439237331308782,0.8145229763639873
6,500,0.8462564546657907,0.8188600321364038,0.820657446872743,0.8188600321364038,0.8156212418377533,0.811181850540668,0.8462564591925522,0.8188600321364038
6,1000,0.8486227344698585,0.8210961100126777,0.823361217906478,0.8210961103205887,0.8186511637371892,0.8136604149523806,0.8486227380041068,0.8210961100126777
6,1500,0.853802079303031,0.826839591529345,0.8268592131123635,0.8268395918394101,0.8225021922197036,0.8209800632095822,0.853802077312466,0.8268395912192803
6,-1,0.8538667055268471,0.8247403456834426,0.8276934072885848,0.8247403456834426,0.8229943716893059,0.8180875373782178,0.853866706427185,0.8247403456834426
7,500,0.8479135346771263,0.8192029785470147,0.8234414322607395,0.8192029785470147,0.8195337840297467,0.8134421771330196,0.8479135376141167,0.8192029782398137
7,1000,0.8520297825437032,0.8222808332237218,0.8267389576498774,0.8222808335320771,0.8223971085343211,0.8170119339966297,0.8520297793926205,0.8222808332237218
7,1500,0.8515474541964165,0.8225614257987139,0.8260703687481402,0.8225614254902532,0.822045205581262,0.8177636441427698,0.8515474470975705,0.8225614254902532
7,-1,0.851750601615398,0.8216226538330923,0.8269107040453618,0.8216226538330923,0.8229243640352999,0.8172838657015401,0.8517506046089345,0.8216226538330923
8,500,0.848640495255741,0.819128500352812,0.8232238004303561,0.8191276346344711,0.8189648811588301,0.8141124802705119,0.8486404873971487,0.819128500352812
8,1000,0.8525794011334951,0.8213056881869837,0.8269021613642725,0.8213056888029631,0.8228289105530151,0.8170656272720972,0.8525793979587482,0.8213056881869837
8,1500,0.8478885899086473,0.8174085736820355,0.8235392998585906,0.8174085736820355,0.819148265481197,0.8125345820893212,0.847888595985836,0.8174085733755073
8,-1,0.8501384184858987,0.8190748067709449,0.8248685654403436,0.8190748070780982,0.8207937250138362,0.8142579728621536,0.8501384184417146,0.8190748067709449
9,500,0.8502264510228783,0.8198005364590907,0.8246212103036825,0.8198005364590907,0.8205450942124303,0.8147308227927901,0.8502264453903755,0.8198005358442401
9,1000,0.8496276893343591,0.8196394554060639,0.8238645132971182,0.8196394554060639,0.8197950807293268,0.8147602676602655,0.8496276845979934,0.8196394554060639
9,1500,0.8501599877653114,0.8204344668279032,0.8241634482152239,0.8204344671355662,0.8200897871391095,0.8156695947558273,0.8501599879648907,0.8204344668279032
9,-1,0.8502688166631348,0.8206249927487147,0.8239699451675087,0.8206249927487147,0.8198276558590097,0.8157683213585455,0.8502688097129415,0.8206249924409802
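To pick the best checkpoint from a log like this, parse the CSV and take the row with the highest cosine_spearman (one common model-selection metric for this evaluator). A minimal stdlib sketch, run on a small embedded excerpt of the table above:

```python
import csv
import io

# Excerpt of eval/similarity_evaluation_results.csv (full file has 40 data rows).
log = """epoch,steps,cosine_pearson,cosine_spearman
0,500,0.7977655805439354,0.7858280872890149
1,1000,0.8643054917895009,0.8359120745003975
2,1000,0.8552248238640915,0.8315542344274859
9,-1,0.8502688166631348,0.8206249927487147
"""

rows = list(csv.DictReader(io.StringIO(log)))
best = max(rows, key=lambda r: float(r["cosine_spearman"]))
print(best["epoch"], best["steps"], best["cosine_spearman"])
```

On the full table, cosine Spearman also peaks at 0.8359 (epoch 1, step 1000); `steps = -1` marks the end-of-epoch evaluation.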
eval/similarity_evaluation_results_MarginMSELoss.csv ADDED
@@ -0,0 +1,41 @@
epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
0,500,0.9508058158060511,0.8525137839169797,0.9455348555469325,0.8525137842366726,0.9446346909333546,0.8504647678698128,0.950805816884907,0.852513783597287
0,1000,0.9539755384379003,0.8526159749276301,0.9450685714618356,0.8526159752473613,0.9427531598437838,0.8508960482568535,0.9539755379494188,0.8526159749276301
0,1500,0.9444472410844784,0.8463251655940344,0.940674726755037,0.8463251662287784,0.9396422184884452,0.8432473109173274,0.9444472426825783,0.8463251652766625
0,-1,0.9488124575192454,0.8486963431330674,0.9404818364437921,0.8486963434513285,0.9384940749343149,0.8486634341635483,0.9488124575251631,0.8486963431330674
1,500,0.9539935934766833,0.8521448570480217,0.950854147241438,0.8521448570480217,0.9482326648092534,0.8496766843331609,0.9539935927305574,0.8521448573675762
1,1000,0.9482261716254992,0.8478874753033078,0.9415883123458777,0.8478874756212655,0.9390707315042024,0.8455093695598594,0.9482261701986361,0.8478874753033078
1,1500,0.9472870839851452,0.8457189483685333,0.9429150278123204,0.8457189480513887,0.9406848405791204,0.8429649872319918,0.947287083575018,0.8457189480513887
1,-1,0.9509117361005857,0.8454556759779923,0.9433266796340335,0.8454556759779923,0.9411629975948678,0.8444510866984376,0.9509117396772285,0.8454556759779923
2,500,0.9137732466079218,0.8350252650006771,0.901728701177651,0.8350252646875426,0.8996461677082157,0.8292783190657509,0.9137732534375644,0.835025264374408
2,1000,0.9323662148031191,0.838276324152693,0.921046202942637,0.8382763244670468,0.9183175845937478,0.8346927112031817,0.9323662181786906,0.8382780562037205
2,1500,0.939216183344826,0.8469868090867199,0.930716135965864,0.8469868090867199,0.9304011302385298,0.844529028678027,0.9392161852007741,0.8469868090867199
2,-1,0.9204903810668984,0.832913894796787,0.9091282218428779,0.8329138944844442,0.9059161723540599,0.8305479130916895,0.9204903833306048,0.8329138944844442
3,500,0.9411227216708058,0.8429805750590197,0.9340453206119939,0.8429805750590197,0.9326399181658762,0.8410562663669424,0.9411227225902241,0.842980574742902
3,1000,0.951788065565726,0.8432802205193193,0.9457049443729969,0.8432802202030892,0.943357192133254,0.8425856677406118,0.9517880670256803,0.8432802202030892
3,1500,0.9543911213309263,0.8371643480210252,0.9483699270507293,0.8371643477070884,0.9462197888711356,0.8361840065116871,0.9543911240403493,0.8371643473931518
3,-1,0.9283935297878301,0.836395317364317,0.920147444600629,0.8363953167370205,0.918372574409384,0.8342423776228991,0.9283935320235275,0.8363953167370205
4,500,0.9534648584475756,0.8428229584154742,0.9492149379156735,0.842822958731533,0.9476816115845796,0.8429268814771526,0.9534648565939724,0.8428229584154742
4,1000,0.9295608391338462,0.8357215492007876,0.9197153125395554,0.8357215495141832,0.917922445621206,0.8322435907366189,0.9295608429546766,0.8357215488873919
4,1500,0.9082305904657395,0.812543242344461,0.8988299126158092,0.8125432426491649,0.8957539816307216,0.8056011821264483,0.9082305908694235,0.8125432420397573
4,-1,0.9162287749271716,0.8228073764276725,0.9108110339530724,0.8228073767362254,0.9092852342433169,0.816771179210031,0.9162287731686032,0.8228073764276725
5,500,0.9538375469818654,0.840146939577257,0.9500162330474298,0.8401469398923122,0.9485472396645712,0.8372855909650645,0.9538375478054301,0.8401469392622019
5,1000,0.9106163295724403,0.8015551112241707,0.9050698094104006,0.8015551109235876,0.9038844113307235,0.8023345335855917,0.9106163247909813,0.8015551103224211
5,1500,0.9389639538469259,0.8445757940557822,0.928374150384638,0.8445757937390663,0.9253349174574763,0.8433772147444254,0.9389639543767233,0.8445757937390663
5,-1,0.9370561067624105,0.82954332318401,0.9317225357805279,0.8295441898316822,0.9303723624820659,0.8259475864888609,0.9370561063026972,0.8295450555461176
6,500,0.9547794036176656,0.8399841267806275,0.9502132975116778,0.8399841267806275,0.9492474146194956,0.8384685821311516,0.9547794054689522,0.8399841264656335
6,1000,0.9313009596288018,0.8247801828570859,0.925371313823967,0.8247801828570859,0.9244072627300987,0.8226410995290496,0.9313009641870755,0.8247801825477933
6,1500,0.9328578935652398,0.832428920196612,0.9253649902560785,0.8324289198844511,0.9247148065138268,0.8311402748551558,0.9328578903106775,0.832428920196612
6,-1,0.9472841037958402,0.8352989287499623,0.9425526675246947,0.8352997954019512,0.9418723982203363,0.8335478254732627,0.9472841039360299,0.8352989287499623
7,500,0.9239990451649384,0.8126766105783687,0.918367982683002,0.8126766105783687,0.9171360385881335,0.8125761513139925,0.9239990389185597,0.8126766102736148
7,1000,0.9347930304357442,0.8303842348922731,0.9281826267227418,0.8303851006063935,0.9267884494681661,0.8283222272092895,0.9347930304193605,0.8303842342694848
7,1500,0.9574098367068935,0.8413247349072719,0.9530279653285634,0.8413247349072719,0.9518026706843088,0.8408103151209703,0.9574098383448872,0.8413247339607814
7,-1,0.8913487610310906,0.8047143716980258,0.8808746459078725,0.8047143716980258,0.879007227212644,0.7949300157391087,0.8913487615787309,0.804714371396258
8,500,0.9158014062276468,0.8188080715267434,0.909061961964901,0.8188072055012288,0.9089361575176658,0.818889477925119,0.9158014048731113,0.8188080709126372
8,1000,0.9440929208756881,0.8318798600207448,0.9401107496879283,0.831881591759817,0.9395815721692443,0.8316702815344837,0.9440929220552678,0.8318798597087897
8,1500,0.9044061835070611,0.8176372044177278,0.8974921562584701,0.8176363390054412,0.8976113775556294,0.814633827629736,0.904406181879595,0.8176372044177278
8,-1,0.9235136474242446,0.8199893297137143,0.918547783613428,0.8199893300212104,0.9180877934500353,0.8184997661365957,0.9235136490711618,0.8199893294062183
9,500,0.9193847109533778,0.8136275062879716,0.9123024135876252,0.8136275072033029,0.9118442624518888,0.815425375255007,0.9193847097862016,0.8136275059828614
9,1000,0.9252056991597599,0.8148780471301678,0.9218904151393763,0.8148771814102329,0.921826774878277,0.8166412753828843,0.9252057019698137,0.8148763147735613
9,1500,0.9246579463674915,0.8153075954793646,0.9202731622274061,0.8153075960908455,0.9202769260997001,0.8168699064249886,0.9246579477679262,0.8153075954793646
9,-1,0.9243153862491004,0.8141982174070131,0.9197228431881034,0.8141982171016886,0.9197210636450469,0.816390127983734,0.9243153904878258,0.8141990828218779
modules.json ADDED
@@ -0,0 +1,20 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:baba29b7906eee6d0de63edde697e000044ff98b9b9f38269bb369e960e63192
size 133511213
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 128,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1 @@
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1 @@
{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "name_or_path": "/home/xstefan3/.cache/torch/sentence_transformers/sentence-transformers_all-MiniLM-L12-v2/", "do_basic_tokenize": true, "never_split": null, "model_max_length": 512, "special_tokens_map_file": "/home/xstefan3/.cache/torch/sentence_transformers/sentence-transformers_all-MiniLM-L12-v2/special_tokens_map.json", "tokenizer_class": "BertTokenizer"}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff