Commit bc633dc
Your Name committed
Parent(s): cf16348

adding models
Files changed:

- 1_Pooling/config.json +7 -0
- README.md +122 -0
- config.json +24 -0
- config_sentence_transformers.json +7 -0
- eval/similarity_evaluation_results.csv +101 -0
- modules.json +14 -0
- pytorch_model.bin +3 -0
- sentence_bert_config.json +4 -0
- special_tokens_map.json +15 -0
- tokenizer.json +0 -0
- tokenizer_config.json +16 -0
- vocab.txt +0 -0
1_Pooling/config.json
ADDED
@@ -0,0 +1,7 @@
+{
+  "word_embedding_dimension": 768,
+  "pooling_mode_cls_token": true,
+  "pooling_mode_mean_tokens": false,
+  "pooling_mode_max_tokens": false,
+  "pooling_mode_mean_sqrt_len_tokens": false
+}
README.md
ADDED
@@ -0,0 +1,122 @@
+---
+pipeline_tag: sentence-similarity
+tags:
+- sentence-transformers
+- feature-extraction
+- sentence-similarity
+- transformers
+---
+
+# {MODEL_NAME}
+
+This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
+
+<!--- Describe your model here -->
+
+## Usage (Sentence-Transformers)
+
+Using this model is straightforward once you have [sentence-transformers](https://www.SBERT.net) installed:
+
+```
+pip install -U sentence-transformers
+```
+
+Then you can use the model like this:
+
+```python
+from sentence_transformers import SentenceTransformer
+
+sentences = ["This is an example sentence", "Each sentence is converted"]
+
+model = SentenceTransformer('{MODEL_NAME}')
+embeddings = model.encode(sentences)
+print(embeddings)
+```
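+
+Embeddings from this model can be compared directly with cosine similarity. A minimal sketch of semantic search built on the snippet above, using `sentence_transformers.util.cos_sim`; the query and corpus strings are illustrative placeholders:
+
+```python
+from sentence_transformers import SentenceTransformer, util
+
+model = SentenceTransformer('{MODEL_NAME}')
+
+# Encode a query and a small corpus into the same 768-dimensional space
+query_embedding = model.encode("How do I install the library?", convert_to_tensor=True)
+corpus_embeddings = model.encode([
+    "Run pip install -U sentence-transformers",
+    "The weather is nice today",
+], convert_to_tensor=True)
+
+# Cosine similarity between the query and each corpus sentence;
+# higher scores mean more semantically similar
+scores = util.cos_sim(query_embedding, corpus_embeddings)
+print(scores)
+```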
+
+## Usage (HuggingFace Transformers)
+
+Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
+
+```python
+from transformers import AutoTokenizer, AutoModel
+import torch
+
+
+def cls_pooling(model_output, attention_mask):
+    # CLS pooling: take the embedding of the first token
+    return model_output[0][:, 0]
+
+
+# Sentences we want sentence embeddings for
+sentences = ['This is an example sentence', 'Each sentence is converted']
+
+# Load model from HuggingFace Hub
+tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
+model = AutoModel.from_pretrained('{MODEL_NAME}')
+
+# Tokenize sentences
+encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
+
+# Compute token embeddings
+with torch.no_grad():
+    model_output = model(**encoded_input)
+
+# Perform pooling. In this case, CLS pooling.
+sentence_embeddings = cls_pooling(model_output, encoded_input['attention_mask'])
+
+print("Sentence embeddings:")
+print(sentence_embeddings)
+```
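+
+As a quick sanity check, the CLS pooling above should reproduce what the sentence-transformers path computes, since the pooling config in `1_Pooling/config.json` enables only `pooling_mode_cls_token`. A minimal sketch, assuming the snippet above has just been run in the same session (so `sentences`, `sentence_embeddings`, and `torch` are in scope):
+
+```python
+from sentence_transformers import SentenceTransformer
+
+# Embeddings for the same sentences via the sentence-transformers path
+st_model = SentenceTransformer('{MODEL_NAME}')
+st_embeddings = st_model.encode(sentences, convert_to_tensor=True)
+
+# Should print True (up to floating-point tolerance) if both paths agree
+print(torch.allclose(st_embeddings, sentence_embeddings, atol=1e-5))
+```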
+
+
+## Evaluation Results
+
+<!--- Describe how your model was evaluated -->
+
+For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
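+
+The checkpoint-by-checkpoint scores from training are also included in this repository under `eval/similarity_evaluation_results.csv`. A minimal sketch for inspecting them (pandas is an extra dependency, not required by the model itself):
+
+```python
+import pandas as pd
+
+# Each row is one evaluation checkpoint: (epoch, steps) plus Pearson/Spearman
+# correlations for cosine, Euclidean, Manhattan, and dot-product similarity.
+# steps == -1 marks the evaluation at the end of an epoch.
+df = pd.read_csv("eval/similarity_evaluation_results.csv")
+print(df[["epoch", "steps", "cosine_spearman"]].tail())
+print("Best cosine_spearman:", df["cosine_spearman"].max())
+```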
+
+
+## Training
+
+The model was trained with the following parameters:
+
+**DataLoader**:
+
+`torch.utils.data.dataloader.DataLoader` of length 6369 with parameters:
+```
+{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
+```
+
+**Loss**:
+
+`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
+
+Parameters of the fit() method:
+```
+{
+    "epochs": 10,
+    "evaluation_steps": 1000,
+    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
+    "max_grad_norm": 1,
+    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
+    "optimizer_params": {
+        "lr": 2e-05
+    },
+    "scheduler": "WarmupLinear",
+    "steps_per_epoch": null,
+    "warmup_steps": 10000,
+    "weight_decay": 0.01
+}
+```
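+
+Put together, a hedged sketch of the fit() call these parameters describe. The base checkpoint name comes from `_name_or_path` in this repository's config.json; the actual training data is not part of this repository, so `train_examples` below is a hypothetical placeholder, and the real run also passed an `EmbeddingSimilarityEvaluator` built from held-out pairs:
+
+```python
+from torch.utils.data import DataLoader
+from sentence_transformers import SentenceTransformer, InputExample, losses
+
+model = SentenceTransformer('sentence-transformers/multi-qa-mpnet-base-dot-v1')
+
+# Hypothetical placeholder: sentence pairs with similarity labels in [0, 1]
+train_examples = [InputExample(texts=["sentence one", "sentence two"], label=0.8)]
+train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=32)
+train_loss = losses.CosineSimilarityLoss(model)
+
+model.fit(
+    train_objectives=[(train_dataloader, train_loss)],
+    epochs=10,
+    evaluation_steps=1000,
+    scheduler="WarmupLinear",
+    warmup_steps=10000,
+    optimizer_params={"lr": 2e-05},
+    weight_decay=0.01,
+    max_grad_norm=1,
+)
+```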
+
+
+## Full Model Architecture
+```
+SentenceTransformer(
+  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
+  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
+)
+```
+
+## Citing & Authors
+
+<!--- Describe where people can find more information -->
config.json
ADDED
@@ -0,0 +1,24 @@
+{
+  "_name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_multi-qa-mpnet-base-dot-v1/",
+  "architectures": [
+    "MPNetModel"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "bos_token_id": 0,
+  "eos_token_id": 2,
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "layer_norm_eps": 1e-05,
+  "max_position_embeddings": 514,
+  "model_type": "mpnet",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 1,
+  "relative_attention_num_buckets": 32,
+  "torch_dtype": "float32",
+  "transformers_version": "4.21.1",
+  "vocab_size": 30527
+}
config_sentence_transformers.json
ADDED
@@ -0,0 +1,7 @@
+{
+  "__version__": {
+    "sentence_transformers": "2.0.0",
+    "transformers": "4.6.1",
+    "pytorch": "1.8.1"
+  }
+}
eval/similarity_evaluation_results.csv
ADDED
@@ -0,0 +1,101 @@
+epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+0,1000,0.2058566777699414,0.24262368735323064,0.19186527118951194,0.22863244739402044,0.18972484077926577,0.22984367886392887,0.20777641924315976,0.24898318667157987
+0,2000,0.2445415142841956,0.2774778612650294,0.2298937631144543,0.26336507394267644,0.22605968731117523,0.26192560271796833,0.24264099831317892,0.28231325109966954
+0,3000,0.26622053054879335,0.2979813446410511,0.25422763644141416,0.2893823769262051,0.25385447913694725,0.29067938512756797,0.2645309657479694,0.2990682638529133
+0,4000,0.2945410617611566,0.31350470151895843,0.2824889635002717,0.3047511035617605,0.2816480923952097,0.303711796066744,0.2962243236859586,0.3179306934150959
+0,5000,0.28858575655691915,0.32421231115761606,0.2809573513136473,0.3178617051886967,0.27929452145662914,0.3147209685503325,0.2844046961876533,0.3248700114597473
+0,6000,0.2984859729180789,0.31213306272973945,0.2858000239102901,0.3020687210759328,0.2849804787483769,0.299319630178453,0.29935238503276634,0.31796446187038785
+0,-1,0.3101500940327566,0.33406432304400335,0.29886844796974954,0.3277393750509322,0.2984196985975499,0.3246091874818143,0.3100740673569341,0.33630916926808735
+1,1000,0.3000917277748326,0.328460926821619,0.2905382027979691,0.32162630688175153,0.2911602147107314,0.3205822620870476,0.2997813360952058,0.33007575888698626
+1,2000,0.31951543375655883,0.33685403605064435,0.307117491920759,0.32852345937232735,0.30373185404852304,0.32318153193185023,0.32044132427229766,0.3408928142449456
+1,3000,0.3194951848987596,0.33770175321997825,0.3091671563630221,0.33237205668222214,0.3071164253055025,0.32958697189249414,0.3183217020966866,0.33918928787372166
+1,4000,0.32695630057577313,0.35101100963285614,0.31191380848545847,0.343285822264705,0.3115596541789012,0.33981684987665794,0.32561156308220107,0.3532371809628665
+1,5000,0.31881757019091894,0.3438577806597359,0.3048519818214589,0.3381384793405181,0.30426280901122404,0.33569917969501706,0.3185723142459543,0.3460494373883932
+1,6000,0.32222424251553405,0.35166684209721905,0.31429937795958823,0.34598756406288494,0.3120069269555324,0.34104431823036724,0.3197211023500605,0.3541662198440627
+1,-1,0.3224774147590433,0.34803626672319565,0.3178119749381597,0.345089280994776,0.3160055709093051,0.3409293316511072,0.3191686367438201,0.3472209475669861
+2,1000,0.33071541954146555,0.3581722801416925,0.32445531298000346,0.3540066871230379,0.3241758074923644,0.35206981441988505,0.33046191787112195,0.359247837418217
+2,2000,0.3453543954513638,0.3684223875610152,0.33570100858647456,0.3649550596646675,0.33371107707739517,0.36133525630815805,0.34276345111464074,0.3676762282689256
+2,3000,0.350738915563881,0.3798941002154024,0.34382242381416317,0.377130475829457,0.34203227764942545,0.3737531234026748,0.3480959392843365,0.37953187867547206
+2,4000,0.3425642033258699,0.36994352536962605,0.3328452544612345,0.36584535613824726,0.332586019068928,0.3630901067104447,0.3419565762169167,0.3706612582229452
+2,5000,0.34270537207465823,0.3606616519100851,0.3321849958412845,0.3555764059496749,0.3298177275804416,0.35064619505618166,0.3421521477509548,0.36192365633598267
+2,6000,0.3487095477511232,0.3729192616670821,0.3409905968675784,0.3678004874687229,0.33946069482572505,0.3651416392562273,0.3483382574968233,0.37413455760687864
+2,-1,0.35725909736598094,0.39172772132499845,0.3513694184757105,0.38796037479996426,0.35104448257338366,0.3870508434712394,0.35701628274907565,0.3934297336592638
+3,1000,0.34393061809658065,0.37836205183739363,0.33514028546642594,0.3742623228779305,0.33367388204440596,0.37193106434600254,0.3436601474252054,0.3783411357488233
+3,2000,0.3580971663410825,0.3835592166508479,0.3498274463616749,0.3804647655476673,0.3483232651104043,0.37824393063916284,0.3575085954410882,0.3837308853180979
+3,3000,0.35748720420270286,0.38272883592218243,0.35316878548350206,0.3823983135688408,0.3526767724398664,0.3817684541470852,0.3542647993385417,0.37925387269062605
+3,4000,0.35530544800760633,0.3811737715320843,0.3423374191931701,0.37682671848004906,0.34252440878784424,0.3762733199170197,0.3558482633903368,0.38254643183569753
+3,5000,0.35554906146435766,0.37487384340588886,0.34316735055205905,0.3705836396182945,0.34285670589642303,0.37024279211875777,0.3555104019584383,0.37527329076220967
+3,6000,0.36158943380215136,0.37287806783678507,0.3499571993810133,0.3674647135374962,0.3483897558116495,0.36565848592094696,0.36099401492875594,0.37348597167417713
+3,-1,0.3665603798614457,0.382048057114085,0.3544705754208046,0.3782028449220719,0.35358704349123893,0.37713850937879895,0.3666019494971457,0.3825284065369192
+4,1000,0.3623676432129411,0.3859204356188919,0.35065087833172315,0.38170358192158915,0.3504595870960082,0.3811575745106409,0.3612220858792494,0.38587800811108397
+4,2000,0.3686149756308597,0.3948877887775605,0.35775317824916814,0.391041973198409,0.35808126862256434,0.3911504688775873,0.368164493797387,0.3949364649165609
+0,1000,0.20110326459382732,0.23914386459786266,0.19085936776501153,0.22908708080517717,0.18889826722146516,0.230605608329301,0.20117708424332106,0.24271065468242678
+0,2000,0.2555243307981072,0.28665195804858873,0.2414332608913011,0.27459123553062215,0.2385745889454529,0.2731841908760928,0.2546227167074964,0.2902590899536698
+0,3000,0.26877379160901577,0.2997177549066786,0.2595920024244048,0.2904345567509068,0.2569152999849902,0.2885377119452365,0.26694449619705213,0.3027195639019854
+0,4000,0.28532586242345775,0.32237261059201416,0.27596365667923684,0.3168086553965637,0.27642611319791954,0.31913334147036687,0.2847996439961598,0.3220491311099871
+0,5000,0.2913109058264196,0.33093002584338943,0.286301053955822,0.32578381412124724,0.2863084976071902,0.3260996658119491,0.2837668443721523,0.327063078012374
+0,6000,0.3065209866808935,0.3282982849879864,0.3014506744672539,0.32197441104068913,0.2992986224060492,0.31865727558354345,0.3037281504543232,0.32851040345458304
+0,-1,0.30876664593153463,0.32965313768605703,0.29501102849083827,0.3183189132318504,0.29413617912897966,0.31644175754451065,0.30776657597129425,0.33220658374520584
+1,1000,0.30582719670324937,0.33566625684132373,0.29508192352005497,0.32961737577872224,0.29616259077437834,0.33024880993317574,0.3042376644788336,0.3361611145712415
+1,2000,0.31557701319450054,0.3394605969600944,0.30062410219194163,0.33263651553302626,0.3009654308693217,0.3304878425568967,0.3169319400305172,0.34257794032075156
+1,3000,0.3240200166611532,0.3390107418285207,0.30710830306006626,0.32985708965134525,0.30646060464220165,0.32730656716597906,0.32579373416169494,0.34326374082797784
+1,4000,0.3194298353496507,0.3326799578923793,0.31380346016304933,0.32631334556869707,0.311336406473834,0.3229395704639278,0.32168638939863103,0.33701050108412073
+1,5000,0.32130305976814055,0.34241128064093945,0.3098374206301584,0.3344261886066852,0.3075125240231076,0.3304616239792311,0.32314749333274784,0.34652341320680924
+1,6000,0.31893612222527995,0.3418443153119563,0.30792484073029375,0.3368580272078514,0.30810793947585846,0.3357020488923973,0.3196691602669793,0.344030350319398
+1,-1,0.3301587404919292,0.35263685670960254,0.31840791813450187,0.3450711732135187,0.3170385915678879,0.34289339177613193,0.3300645611572398,0.3553263146006561
+2,1000,0.33902387252259103,0.3599459576906133,0.3282212611679078,0.3536114176619944,0.3275222167328783,0.3519635171268427,0.34031848510713025,0.3637338877384291
+2,2000,0.34337493181773726,0.3670836767967914,0.336543108280501,0.36298663955628857,0.3361464293658729,0.36179875416417556,0.3438904591680131,0.3692540330161885
+2,3000,0.3422723072809497,0.3668662442526574,0.3339051801897969,0.3647591267810262,0.33375334964636966,0.3638100151734975,0.3401809021911294,0.36673469809899156
+2,4000,0.35322522616392293,0.36743222294549216,0.3412922878429738,0.3619944655498773,0.34089123198494314,0.361357100290573,0.3534668070646917,0.36869386592795433
+2,5000,0.3583438174532453,0.38012265439083714,0.3467793983272175,0.3749876583658222,0.3472545311535105,0.37532606735052876,0.3574626114926572,0.3807239237250775
+2,6000,0.34476552701943586,0.3739000566778911,0.33465003179050173,0.3672987435403614,0.3344525546855495,0.366123263734083,0.3461633229621725,0.3767083043843917
+2,-1,0.35401301727507273,0.3768043068220867,0.3499155097733806,0.37267099496616324,0.35017598363825586,0.3725822900427925,0.35443871760733486,0.37912005646777824
+3,1000,0.36036025927950477,0.3831109922063363,0.3484592553033518,0.3780245999256532,0.34829301445863653,0.37738316659824017,0.3604467942530253,0.3843966630570247
+3,2000,0.3494755031793955,0.3739737520099258,0.341216022150852,0.36822288033075057,0.3414289574863789,0.3681399563587575,0.3522596830322747,0.37809217041552984
+3,3000,0.35101119451997964,0.3724265787064519,0.3407070554058176,0.3677070441472427,0.34159963524318354,0.36823412286763363,0.352139099710055,0.3743225730897449
+3,4000,0.3605224950686015,0.3851154148061673,0.35447901270449866,0.3816290419365367,0.35551305158830054,0.38217563634748103,0.36041033594640554,0.38559464046570857
+3,5000,0.3589202619675829,0.37999364949161474,0.34867238135773926,0.37477506412374956,0.3481030862671712,0.3740659762326303,0.3598295809385141,0.3823142042995804
+3,6000,0.3671243679404418,0.39054781413143846,0.35839657390766316,0.38530799424388906,0.35921334396461685,0.38603114527831767,0.3666298124553457,0.3905125009401693
+3,-1,0.3660921433981078,0.3835204337487041,0.35328089309439226,0.37672551909496865,0.35324596415000076,0.37641350426650244,0.3664971677912254,0.3850666351112541
+4,1000,0.3608599622895476,0.3870292443977001,0.34870548974122006,0.38113349844552974,0.34792628020090344,0.3805226329524423,0.36151101256594614,0.38829887319620005
+4,2000,0.36123151393197034,0.38713078266079554,0.3498736780767613,0.38144363551633687,0.34936257792151604,0.38072401283546253,0.3617680950781434,0.38816377663870694
+4,3000,0.3637448515745093,0.3891925115531625,0.3510352666933458,0.3823945544116493,0.3510762977138264,0.38239086891145135,0.3635654478304231,0.39027107055896787
+4,4000,0.3629689451538316,0.3854524696621389,0.352712987015554,0.3805271312203382,0.3521327353668341,0.37994196530019814,0.36198752402282514,0.3854964944359704
+4,5000,0.3640547084790299,0.38528786819605476,0.3511831995584039,0.37907791478056063,0.35137287814315654,0.3788889014089696,0.36412605516718116,0.3862959089689665
+4,6000,0.3650280599596297,0.3802749643547585,0.3544197494967548,0.3750674962955716,0.35500453992830866,0.37532779019399826,0.36346211689073604,0.379409593608964
+4,-1,0.3560085705800552,0.38157482206875537,0.3466208652660309,0.37802455284024533,0.3478865956032175,0.37921524109408583,0.35447459444289153,0.3804670629371819
+5,1000,0.3766062677033697,0.39810255540783274,0.36297623847780736,0.39117227959361406,0.3628292245321205,0.3906697636952476,0.37681950500921646,0.3994695806618722
+5,2000,0.36000889943150416,0.3771728520733639,0.3447567416779288,0.3696929021387119,0.34501610750771344,0.36988417591857137,0.3590222833685341,0.37809948245744207
+5,3000,0.36433674893329554,0.3848969371185598,0.3516737909502914,0.3781944097807649,0.35178086629748995,0.37794743571172795,0.36300284989116804,0.38446955776704717
+5,4000,0.36436268821592144,0.38102863375455376,0.34896915450951516,0.3732991648721413,0.3489099579282543,0.3726678621261739,0.36341795802597054,0.38100828641491147
+5,5000,0.36372630985868537,0.3858586187685642,0.34890545642247917,0.3783434052836268,0.34849467475095486,0.3779326800593912,0.3632134568670863,0.38664520830549787
+5,6000,0.36027542748006314,0.37785114248697155,0.3455443919563389,0.3708577494890933,0.3456307986198427,0.3708061596814175,0.35912084458605803,0.3783393006175112
+5,-1,0.3622212765380366,0.3826921362497882,0.350034716325624,0.3768165700898358,0.34964275332611333,0.3762588396892805,0.36049982728167335,0.3816841434456942
+6,1000,0.35319400108287907,0.3755204184800619,0.3369718977636497,0.3655178051758539,0.3371450601531917,0.3654385678378413,0.351357612549383,0.37586652198811815
+6,2000,0.36439936312998567,0.3810823624338181,0.35044529179769673,0.37396054399535056,0.3499644605168018,0.3731045509962726,0.36199887679182796,0.3795489421369426
+6,3000,0.36744682997308487,0.3880739558007194,0.3539039606412416,0.37927655279732964,0.3535222300788071,0.37940704637167744,0.36487609640249263,0.38685049379723385
+6,4000,0.3576036640177278,0.3781159540792545,0.34689940031070227,0.3723577589560506,0.3460154416221251,0.371305186515342,0.35458501566720435,0.37560800243528975
+6,5000,0.3603646383921794,0.3834080015434238,0.34869220752024627,0.3774110556436971,0.3487511702724834,0.3772607269205466,0.3577134278704481,0.38143233855367614
+6,6000,0.36200116394370485,0.3821169559381804,0.3468468173497669,0.37493142869087676,0.3466763549834666,0.3747431746209383,0.36003501816476197,0.38055580556418295
+6,-1,0.362052781997453,0.38175301347476465,0.34756418993437077,0.37413997717617575,0.3477054099817507,0.3742683766995432,0.360040691034049,0.38010073832745994
+7,1000,0.3519498608309847,0.3717090233269585,0.3342703888551456,0.35978146259767,0.3335515249463602,0.3587842393548393,0.34993436605935546,0.37134469705458856
+7,2000,0.35436777845563155,0.3739382746160016,0.3369976386305684,0.36432038119494725,0.3363205397140412,0.36345282767412135,0.3512603263951366,0.3724505008269909
+7,3000,0.3587749565267525,0.37439453710201875,0.341723620536908,0.36502621086463993,0.34103972759933815,0.36418198985837025,0.35519663438735155,0.37217303620254105
+7,4000,0.359167174769367,0.3773854774099476,0.3429418706920265,0.3682536634357658,0.34241853479183076,0.36779129493670903,0.35617039622121444,0.3759114227275516
+7,5000,0.3619669340117347,0.38210524030455645,0.34523189834122736,0.3738081712988568,0.3445444138989187,0.3735927108942124,0.35954982084515397,0.3806578458027025
+7,6000,0.3577371837845675,0.37864083059991593,0.3382722293650285,0.36800101873178365,0.3375154754754075,0.36734633779154846,0.35569706824738706,0.3783032973176098
+7,-1,0.3608713529446777,0.379718870546183,0.3448843499219049,0.37229245346872925,0.34445517988033364,0.3719677303262084,0.358441941882725,0.3784457004664757
+8,1000,0.35427358956962085,0.37381587397484217,0.3353611745143022,0.3616313418601386,0.3344938880406097,0.36097068248270825,0.35166941759873765,0.37296467800394134
+8,2000,0.35182262265047154,0.3707525450993127,0.3325602648876793,0.35819907735618417,0.33189589838706496,0.35740121879598613,0.35006084846503993,0.3706762038913555
+8,3000,0.3594711478494771,0.37860350620546673,0.34359276210175954,0.37026169120327085,0.3434709133558715,0.370062752800674,0.35618148543450134,0.3768013583121706
+8,4000,0.3514927032198818,0.3719224076670651,0.3308238091901572,0.3590475792135897,0.32996946886058753,0.3580793681305129,0.3492137976375043,0.3716138124950741
+8,5000,0.35568511768719174,0.37487173570851406,0.3393432118008958,0.36583350375599655,0.33918861642490467,0.3658544196817104,0.3525839489976293,0.3732764674193151
+8,6000,0.35415553132491345,0.3741757471915421,0.3374200644256525,0.3648102790289248,0.33711262598197006,0.36472097132826375,0.3511633259259922,0.3725334657398662
+8,-1,0.3546673514178637,0.3754001283291792,0.3363027134738956,0.36436842388805146,0.3356660020644132,0.3639825924746442,0.35210395544323714,0.3744183511781909
+9,1000,0.35505203599199076,0.3744438203596704,0.33696628187374644,0.3633678035882917,0.3362822620511114,0.3629665154744244,0.35207293428105196,0.37332584687758175
+9,2000,0.3490567327591763,0.3680634032387308,0.33249514104197336,0.3585337214946891,0.33192264775758473,0.3583128204896674,0.3454968554139329,0.3659836223491434
+9,3000,0.34669927955746893,0.36593165869157873,0.33111159613443353,0.3570975586984098,0.33071685342087515,0.3569962632967977,0.3426615618086641,0.36359117382060435
+9,4000,0.35028184030829573,0.3686385289448273,0.3347049114997081,0.35954362716167354,0.3344381779470545,0.3595636941465515,0.3463162023393919,0.36625543194551674
+9,5000,0.3493425875384423,0.3678325006997602,0.33120660648234446,0.3567975987388829,0.330576954680179,0.35649188789603253,0.3458731067025904,0.3664177794389061
+9,6000,0.34996986971245475,0.3685754897834849,0.3318502306197626,0.35732978116321523,0.3312797813724797,0.3570760422138131,0.346618621208374,0.36735535813782266
+9,-1,0.3502838567315214,0.36917658938032644,0.3320899561198965,0.35777619901398133,0.3315147861512724,0.3575044051021567,0.34696782336587145,0.36789208847464283
modules.json
ADDED
@@ -0,0 +1,14 @@
+[
+  {
+    "idx": 0,
+    "name": "0",
+    "path": "",
+    "type": "sentence_transformers.models.Transformer"
+  },
+  {
+    "idx": 1,
+    "name": "1",
+    "path": "1_Pooling",
+    "type": "sentence_transformers.models.Pooling"
+  }
+]
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:18102df54a67e60ff8df0c09b217c4f8aa47d9901da99019434396ce6b856d23
+size 438019895
sentence_bert_config.json
ADDED
@@ -0,0 +1,4 @@
+{
+  "max_seq_length": 512,
+  "do_lower_case": false
+}
special_tokens_map.json
ADDED
@@ -0,0 +1,15 @@
+{
+  "bos_token": "<s>",
+  "cls_token": "<s>",
+  "eos_token": "</s>",
+  "mask_token": {
+    "content": "<mask>",
+    "lstrip": true,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": "<pad>",
+  "sep_token": "</s>",
+  "unk_token": "[UNK]"
+}
tokenizer.json
ADDED
The diff for this file is too large to render.
tokenizer_config.json
ADDED
@@ -0,0 +1,16 @@
+{
+  "bos_token": "<s>",
+  "cls_token": "<s>",
+  "do_lower_case": true,
+  "eos_token": "</s>",
+  "mask_token": "<mask>",
+  "model_max_length": 512,
+  "name_or_path": "/root/.cache/torch/sentence_transformers/sentence-transformers_multi-qa-mpnet-base-dot-v1/",
+  "pad_token": "<pad>",
+  "sep_token": "</s>",
+  "special_tokens_map_file": null,
+  "strip_accents": null,
+  "tokenize_chinese_chars": true,
+  "tokenizer_class": "MPNetTokenizer",
+  "unk_token": "[UNK]"
+}
vocab.txt
ADDED
The diff for this file is too large to render.