model update
README.md CHANGED
@@ -191,29 +191,29 @@ vector = model.get_embedding(['Tokyo', 'Japan']) # shape of (1024, )
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - model: roberta-base
- - max_length: 64
- - mode: mask
- - data: relbert/semeval2012_relational_similarity_v6
- - split: train
- - split_eval: validation
- - template_mode: manual
- - template: Today, I finally discovered the relation between <subj> and <obj> : <obj> is <subj>'s <mask>
- - loss_function: nce_logout
- - classification_loss: False
- - temperature_nce_constant: 0.05
- - temperature_nce_rank: {'min': 0.01, 'max': 0.05, 'type': 'linear'}
- - epoch: 8
- - batch: 128
- - lr: 5e-06
- - lr_decay: False
- - lr_warmup: 1
- - weight_decay: 0
- - random_seed: 0
- - exclude_relation: None
- - n_sample: 320
- - gradient_accumulation: 8
- - relation_level: None
+ - model: "roberta-base"
+ - max_length: "64"
+ - mode: "mask"
+ - data: "relbert/semeval2012_relational_similarity_v6"
+ - split: "train"
+ - split_eval: "validation"
+ - template_mode: "manual"
+ - template: "Today, I finally discovered the relation between <subj> and <obj> : <obj> is <subj>'s <mask>"
+ - loss_function: "nce_logout"
+ - classification_loss: "False"
+ - temperature_nce_constant: "0.05"
+ - temperature_nce_rank: "{'min': 0.01, 'max': 0.05, 'type': 'linear'}"
+ - epoch: "8"
+ - batch: "128"
+ - lr: "5e-06"
+ - lr_decay: "False"
+ - lr_warmup: "1"
+ - weight_decay: "0"
+ - random_seed: "0"
+ - exclude_relation: "None"
+ - n_sample: "320"
+ - gradient_accumulation: "8"
+ - relation_level: "None"
 
 The full configuration can be found at [fine-tuning parameter file](https://huggingface.co/relbert/relbert-roberta-base-semeval2012-v6-mask-prompt-b-nce-0/raw/main/trainer_config.json).
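
The `template` and `template_mode: manual` entries describe how a word pair is turned into the masked prompt that the encoder sees. The snippet below is only an illustrative sketch of that substitution (the `<subj>`/`<obj>`/`<mask>` placeholders come from the template above; the actual preprocessing lives in the relbert library):

```python
# Manual template from the hyperparameter list above.
TEMPLATE = (
    "Today, I finally discovered the relation between <subj> and <obj> : "
    "<obj> is <subj>'s <mask>"
)

def build_prompt(subj: str, obj: str) -> str:
    """Fill the <subj>/<obj> placeholders; <mask> is left for the masked LM."""
    return TEMPLATE.replace("<subj>", subj).replace("<obj>", obj)

# Example word pair, matching the get_embedding example in the hunk header.
print(build_prompt("Tokyo", "Japan"))
# Today, I finally discovered the relation between Tokyo and Japan : Japan is Tokyo's <mask>
```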