
electra-base-discriminator-finetuned-ner-cadec-no-iob

This model is a fine-tuned version of google/electra-base-discriminator for named-entity recognition. The training data is not documented here, though the model name and the entity types below (ADR, Disease, Drug, Finding, Symptom) suggest the CADEC corpus of adverse drug event annotations. It achieves the following results on the evaluation set (a scoring sketch follows the list):

  • Loss: 0.3907
  • Precision: 0.6459
  • Recall: 0.6715
  • F1: 0.6585
  • Accuracy: 0.9355
  • Adr Precision: 0.5961
  • Adr Recall: 0.6330
  • Adr F1: 0.614
  • Disease Precision: 0.4615
  • Disease Recall: 0.375
  • Disease F1: 0.4138
  • Drug Precision: 0.9022
  • Drug Recall: 0.9222
  • Drug F1: 0.9121
  • Finding Precision: 0.2571
  • Finding Recall: 0.2812
  • Finding F1: 0.2687
  • Symptom Precision: 0.5357
  • Symptom Recall: 0.5172
  • Symptom F1: 0.5263
  • Macro Avg F1: 0.5470
  • Weighted Avg F1: 0.6584
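
The macro and weighted averages above are taken across the five entity types. Below is a minimal sketch of how such per-entity and averaged scores can be produced for token-level labels without IOB prefixes (per the "no-iob" suffix in the model name); the toy sequences and the sklearn-based scoring are illustrative assumptions, not this card's actual evaluation code:

```python
# Illustrative only: toy gold/predicted token labels standing in for real
# CADEC-style annotations; "O" marks non-entity tokens and is excluded below.
from sklearn.metrics import classification_report

y_true = ["ADR", "Drug", "O", "Symptom", "Disease", "Drug", "O", "Finding"]
y_pred = ["ADR", "Drug", "O", "ADR", "Disease", "Drug", "O", "O"]

entity_labels = ["ADR", "Disease", "Drug", "Finding", "Symptom"]
# Prints per-label precision/recall/F1 plus macro and weighted averages,
# analogous to the numbers reported above; zero_division=0 silences
# warnings for labels with no true or predicted instances in the toy data.
print(classification_report(y_true, y_pred, labels=entity_labels, zero_division=0))
```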

Model description

More information needed

Intended uses & limitations

More information needed
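
Pending fuller documentation, the model can be exercised with the standard transformers token-classification pipeline. In the sketch below the repository id is a placeholder for wherever this checkpoint is hosted, and the aggregation strategy and example sentence are assumptions:

```python
from transformers import pipeline

# Placeholder repo id; substitute the actual Hub path for this checkpoint.
ner = pipeline(
    "token-classification",
    model="your-username/electra-base-discriminator-finetuned-ner-cadec-no-iob",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

text = "I started taking Lipitor last month and have had severe muscle pain since."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```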

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 40
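
A hedged sketch of how these settings map onto transformers TrainingArguments; dataset loading, tokenization, and the Trainer call are omitted, and the output directory and evaluation strategy are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="electra-base-discriminator-finetuned-ner-cadec-no-iob",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    adam_beta1=0.9,        # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: per-epoch eval, matching the results table
)
```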

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 125 | 0.2178 | 0.5120 | 0.5646 | 0.5370 | 0.9209 | 0.4311 | 0.5608 | 0.4875 | 0.1 | 0.0312 | 0.0476 | 0.8158 | 0.8611 | 0.8378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2746 | 0.5129 |
| No log | 2.0 | 250 | 0.1864 | 0.5799 | 0.6227 | 0.6005 | 0.9296 | 0.5017 | 0.6 | 0.5465 | 0.4146 | 0.5312 | 0.4658 | 0.9011 | 0.9111 | 0.9061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3837 | 0.5845 |
| No log | 3.0 | 375 | 0.1845 | 0.5894 | 0.6306 | 0.6093 | 0.9310 | 0.5520 | 0.6021 | 0.5759 | 0.2353 | 0.375 | 0.2892 | 0.9016 | 0.9167 | 0.9091 | 0.0857 | 0.0938 | 0.0896 | 0.4615 | 0.2069 | 0.2857 | 0.4299 | 0.6113 |
| 0.2159 | 4.0 | 500 | 0.1968 | 0.6002 | 0.6359 | 0.6176 | 0.9318 | 0.5570 | 0.6041 | 0.5796 | 0.0667 | 0.0312 | 0.0426 | 0.9121 | 0.9222 | 0.9171 | 0.1967 | 0.375 | 0.2581 | 0.5263 | 0.3448 | 0.4167 | 0.4428 | 0.6173 |
| 0.2159 | 5.0 | 625 | 0.2026 | 0.6357 | 0.6583 | 0.6468 | 0.9338 | 0.5817 | 0.6165 | 0.5986 | 0.52 | 0.4062 | 0.4561 | 0.9071 | 0.9222 | 0.9146 | 0.1951 | 0.25 | 0.2192 | 0.5909 | 0.4483 | 0.5098 | 0.5397 | 0.6482 |
| 0.2159 | 6.0 | 750 | 0.2142 | 0.6055 | 0.6359 | 0.6203 | 0.9319 | 0.5697 | 0.5814 | 0.5755 | 0.375 | 0.4688 | 0.4167 | 0.9066 | 0.9167 | 0.9116 | 0.1163 | 0.1562 | 0.1333 | 0.4167 | 0.5172 | 0.4615 | 0.4997 | 0.6256 |
| 0.2159 | 7.0 | 875 | 0.2283 | 0.6420 | 0.6649 | 0.6533 | 0.9328 | 0.5867 | 0.6206 | 0.6032 | 0.5 | 0.3125 | 0.3846 | 0.9076 | 0.9278 | 0.9176 | 0.2444 | 0.3438 | 0.2857 | 0.6522 | 0.5172 | 0.5769 | 0.5536 | 0.6542 |
| 0.0651 | 8.0 | 1000 | 0.2430 | 0.6103 | 0.6570 | 0.6328 | 0.9312 | 0.5632 | 0.6062 | 0.5839 | 0.5625 | 0.2812 | 0.375 | 0.9121 | 0.9222 | 0.9171 | 0.2241 | 0.4062 | 0.2889 | 0.4211 | 0.5517 | 0.4776 | 0.5285 | 0.6377 |
| 0.0651 | 9.0 | 1125 | 0.2367 | 0.6308 | 0.6583 | 0.6443 | 0.9338 | 0.5800 | 0.6206 | 0.5996 | 0.48 | 0.375 | 0.4211 | 0.8811 | 0.9056 | 0.8932 | 0.2286 | 0.25 | 0.2388 | 0.5556 | 0.5172 | 0.5357 | 0.5377 | 0.6441 |
| 0.0651 | 10.0 | 1250 | 0.2694 | 0.6241 | 0.6570 | 0.6401 | 0.9294 | 0.5639 | 0.6186 | 0.5900 | 0.4286 | 0.2812 | 0.3396 | 0.9171 | 0.9222 | 0.9197 | 0.2222 | 0.3125 | 0.2597 | 0.6842 | 0.4483 | 0.5417 | 0.5301 | 0.6419 |
| 0.0651 | 11.0 | 1375 | 0.2731 | 0.61 | 0.6438 | 0.6264 | 0.9325 | 0.5626 | 0.6021 | 0.5817 | 0.4286 | 0.375 | 0.4000 | 0.8511 | 0.8889 | 0.8696 | 0.2812 | 0.2812 | 0.2812 | 0.4545 | 0.5172 | 0.4839 | 0.5233 | 0.6259 |
| 0.0265 | 12.0 | 1500 | 0.2842 | 0.6221 | 0.6451 | 0.6334 | 0.9323 | 0.5697 | 0.5979 | 0.5835 | 0.3929 | 0.3438 | 0.3667 | 0.9027 | 0.9278 | 0.9151 | 0.2121 | 0.2188 | 0.2154 | 0.4516 | 0.4828 | 0.4667 | 0.5095 | 0.6331 |
| 0.0265 | 13.0 | 1625 | 0.2861 | 0.6438 | 0.6675 | 0.6554 | 0.9348 | 0.5802 | 0.6268 | 0.6026 | 0.5 | 0.4375 | 0.4667 | 0.9278 | 0.9278 | 0.9278 | 0.3 | 0.2812 | 0.2903 | 0.5 | 0.4138 | 0.4528 | 0.5480 | 0.6552 |
| 0.0265 | 14.0 | 1750 | 0.3202 | 0.6413 | 0.6557 | 0.6484 | 0.9323 | 0.5742 | 0.6144 | 0.5936 | 0.5263 | 0.3125 | 0.3922 | 0.9071 | 0.9222 | 0.9146 | 0.2647 | 0.2812 | 0.2727 | 0.7 | 0.4828 | 0.5714 | 0.5489 | 0.6469 |
| 0.0265 | 15.0 | 1875 | 0.3118 | 0.6284 | 0.6491 | 0.6385 | 0.9336 | 0.5731 | 0.6062 | 0.5892 | 0.4348 | 0.3125 | 0.3636 | 0.9066 | 0.9167 | 0.9116 | 0.25 | 0.3125 | 0.2778 | 0.52 | 0.4483 | 0.4815 | 0.5247 | 0.6390 |
| 0.0115 | 16.0 | 2000 | 0.3144 | 0.6293 | 0.6583 | 0.6435 | 0.9335 | 0.5747 | 0.6268 | 0.5996 | 0.3846 | 0.3125 | 0.3448 | 0.9061 | 0.9111 | 0.9086 | 0.2286 | 0.25 | 0.2388 | 0.5909 | 0.4483 | 0.5098 | 0.5203 | 0.6436 |
| 0.0115 | 17.0 | 2125 | 0.3235 | 0.6411 | 0.6623 | 0.6515 | 0.9346 | 0.5918 | 0.6247 | 0.6078 | 0.4 | 0.375 | 0.3871 | 0.9171 | 0.9222 | 0.9197 | 0.2121 | 0.2188 | 0.2154 | 0.5185 | 0.4828 | 0.5 | 0.5260 | 0.6519 |
| 0.0115 | 18.0 | 2250 | 0.3258 | 0.6230 | 0.6583 | 0.6402 | 0.9344 | 0.5948 | 0.6144 | 0.6045 | 0.3429 | 0.375 | 0.3582 | 0.8913 | 0.9111 | 0.9011 | 0.1957 | 0.2812 | 0.2308 | 0.4571 | 0.5517 | 0.5 | 0.5189 | 0.6447 |
| 0.0115 | 19.0 | 2375 | 0.3347 | 0.6312 | 0.6728 | 0.6513 | 0.9361 | 0.5794 | 0.6392 | 0.6078 | 0.4348 | 0.3125 | 0.3636 | 0.9022 | 0.9222 | 0.9121 | 0.2564 | 0.3125 | 0.2817 | 0.5185 | 0.4828 | 0.5 | 0.5331 | 0.6519 |
| 0.0076 | 20.0 | 2500 | 0.3334 | 0.6508 | 0.6662 | 0.6584 | 0.9328 | 0.6031 | 0.6330 | 0.6177 | 0.375 | 0.375 | 0.375 | 0.9231 | 0.9333 | 0.9282 | 0.1765 | 0.1875 | 0.1818 | 0.6316 | 0.4138 | 0.5 | 0.5205 | 0.6583 |
| 0.0076 | 21.0 | 2625 | 0.3426 | 0.6385 | 0.6781 | 0.6577 | 0.9344 | 0.5943 | 0.6495 | 0.6207 | 0.4231 | 0.3438 | 0.3793 | 0.9071 | 0.9222 | 0.9146 | 0.1765 | 0.1875 | 0.1818 | 0.5 | 0.5517 | 0.5246 | 0.5242 | 0.6581 |
| 0.0076 | 22.0 | 2750 | 0.3431 | 0.6269 | 0.6649 | 0.6453 | 0.9344 | 0.5862 | 0.6309 | 0.6077 | 0.3793 | 0.3438 | 0.3607 | 0.9071 | 0.9222 | 0.9146 | 0.1707 | 0.2188 | 0.1918 | 0.4828 | 0.4828 | 0.4828 | 0.5115 | 0.6478 |
| 0.0076 | 23.0 | 2875 | 0.3374 | 0.6410 | 0.6689 | 0.6546 | 0.9348 | 0.5913 | 0.6412 | 0.6152 | 0.3913 | 0.2812 | 0.3273 | 0.9180 | 0.9333 | 0.9256 | 0.2368 | 0.2812 | 0.2571 | 0.4762 | 0.3448 | 0.4000 | 0.5051 | 0.6534 |
| 0.0049 | 24.0 | 3000 | 0.3484 | 0.6338 | 0.6530 | 0.6433 | 0.9352 | 0.5819 | 0.6082 | 0.5948 | 0.4 | 0.375 | 0.3871 | 0.9126 | 0.9278 | 0.9201 | 0.2258 | 0.2188 | 0.2222 | 0.4667 | 0.4828 | 0.4746 | 0.5198 | 0.6429 |
| 0.0049 | 25.0 | 3125 | 0.3441 | 0.6444 | 0.6741 | 0.6589 | 0.9362 | 0.6015 | 0.6412 | 0.6208 | 0.4194 | 0.4062 | 0.4127 | 0.9076 | 0.9278 | 0.9176 | 0.1935 | 0.1875 | 0.1905 | 0.4667 | 0.4828 | 0.4746 | 0.5232 | 0.6587 |
| 0.0049 | 26.0 | 3250 | 0.3573 | 0.6330 | 0.6689 | 0.6504 | 0.9347 | 0.5802 | 0.6268 | 0.6026 | 0.4138 | 0.375 | 0.3934 | 0.9235 | 0.9389 | 0.9311 | 0.2286 | 0.25 | 0.2388 | 0.4667 | 0.4828 | 0.4746 | 0.5281 | 0.6515 |
| 0.0049 | 27.0 | 3375 | 0.3589 | 0.6477 | 0.6596 | 0.6536 | 0.9336 | 0.5909 | 0.6165 | 0.6034 | 0.48 | 0.375 | 0.4211 | 0.9126 | 0.9278 | 0.9201 | 0.25 | 0.2812 | 0.2647 | 0.5909 | 0.4483 | 0.5098 | 0.5438 | 0.6531 |
| 0.0037 | 28.0 | 3500 | 0.3629 | 0.6307 | 0.6715 | 0.6505 | 0.9354 | 0.5900 | 0.6351 | 0.6117 | 0.4444 | 0.375 | 0.4068 | 0.9016 | 0.9167 | 0.9091 | 0.2326 | 0.3125 | 0.2667 | 0.4375 | 0.4828 | 0.4590 | 0.5307 | 0.6533 |
| 0.0037 | 29.0 | 3625 | 0.3694 | 0.6490 | 0.6781 | 0.6632 | 0.9358 | 0.6035 | 0.6433 | 0.6228 | 0.4286 | 0.375 | 0.4000 | 0.9071 | 0.9222 | 0.9146 | 0.2105 | 0.25 | 0.2286 | 0.6154 | 0.5517 | 0.5818 | 0.5495 | 0.6644 |
| 0.0037 | 30.0 | 3750 | 0.3779 | 0.6456 | 0.6728 | 0.6589 | 0.9350 | 0.5946 | 0.6351 | 0.6142 | 0.5 | 0.375 | 0.4286 | 0.9071 | 0.9222 | 0.9146 | 0.2647 | 0.2812 | 0.2727 | 0.4839 | 0.5172 | 0.5000 | 0.5460 | 0.6589 |
| 0.0037 | 31.0 | 3875 | 0.3776 | 0.6465 | 0.6636 | 0.6549 | 0.9360 | 0.5992 | 0.6227 | 0.6107 | 0.4615 | 0.375 | 0.4138 | 0.9022 | 0.9222 | 0.9121 | 0.2424 | 0.25 | 0.2462 | 0.4839 | 0.5172 | 0.5000 | 0.5366 | 0.6543 |
| 0.0027 | 32.0 | 4000 | 0.3775 | 0.6429 | 0.6675 | 0.6550 | 0.9355 | 0.5984 | 0.6268 | 0.6123 | 0.4444 | 0.375 | 0.4068 | 0.9022 | 0.9222 | 0.9121 | 0.2368 | 0.2812 | 0.2571 | 0.5 | 0.5172 | 0.5085 | 0.5394 | 0.6558 |
| 0.0027 | 33.0 | 4125 | 0.3784 | 0.6378 | 0.6715 | 0.6542 | 0.9363 | 0.5881 | 0.6330 | 0.6097 | 0.4783 | 0.3438 | 0.4 | 0.9022 | 0.9222 | 0.9121 | 0.2381 | 0.3125 | 0.2703 | 0.5556 | 0.5172 | 0.5357 | 0.5456 | 0.6555 |
| 0.0027 | 34.0 | 4250 | 0.3824 | 0.6332 | 0.6741 | 0.6530 | 0.9356 | 0.5830 | 0.6371 | 0.6089 | 0.4286 | 0.375 | 0.4000 | 0.9022 | 0.9222 | 0.9121 | 0.2432 | 0.2812 | 0.2609 | 0.5357 | 0.5172 | 0.5263 | 0.5416 | 0.6542 |
| 0.0027 | 35.0 | 4375 | 0.3825 | 0.6415 | 0.6728 | 0.6568 | 0.9358 | 0.5954 | 0.6371 | 0.6155 | 0.4444 | 0.375 | 0.4068 | 0.9022 | 0.9222 | 0.9121 | 0.225 | 0.2812 | 0.25 | 0.56 | 0.4828 | 0.5185 | 0.5406 | 0.6580 |
| 0.0018 | 36.0 | 4500 | 0.3841 | 0.6390 | 0.6794 | 0.6586 | 0.9364 | 0.5928 | 0.6454 | 0.6180 | 0.4444 | 0.375 | 0.4068 | 0.9022 | 0.9222 | 0.9121 | 0.2368 | 0.2812 | 0.2571 | 0.5172 | 0.5172 | 0.5172 | 0.5422 | 0.6598 |
| 0.0018 | 37.0 | 4625 | 0.3892 | 0.6477 | 0.6741 | 0.6606 | 0.9361 | 0.6008 | 0.6392 | 0.6194 | 0.4286 | 0.375 | 0.4000 | 0.9022 | 0.9222 | 0.9121 | 0.2571 | 0.2812 | 0.2687 | 0.5385 | 0.4828 | 0.5091 | 0.5418 | 0.6606 |
| 0.0018 | 38.0 | 4750 | 0.3893 | 0.6480 | 0.6702 | 0.6589 | 0.9363 | 0.5988 | 0.6309 | 0.6145 | 0.4444 | 0.375 | 0.4068 | 0.9022 | 0.9222 | 0.9121 | 0.25 | 0.2812 | 0.2647 | 0.5769 | 0.5172 | 0.5455 | 0.5487 | 0.6590 |
| 0.0018 | 39.0 | 4875 | 0.3911 | 0.6422 | 0.6702 | 0.6559 | 0.9354 | 0.5907 | 0.6309 | 0.6102 | 0.4615 | 0.375 | 0.4138 | 0.9022 | 0.9222 | 0.9121 | 0.2571 | 0.2812 | 0.2687 | 0.5357 | 0.5172 | 0.5263 | 0.5462 | 0.6559 |
| 0.0016 | 40.0 | 5000 | 0.3907 | 0.6459 | 0.6715 | 0.6585 | 0.9355 | 0.5961 | 0.6330 | 0.614 | 0.4615 | 0.375 | 0.4138 | 0.9022 | 0.9222 | 0.9121 | 0.2571 | 0.2812 | 0.2687 | 0.5357 | 0.5172 | 0.5263 | 0.5470 | 0.6584 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0