---
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: Finetuned-camem-ner
    results: []
---

# Finetuned-camem-ner

This model was fine-tuned for named entity recognition; the base checkpoint and training dataset were not recorded in this auto-generated card. It achieves the following results on the evaluation set:

- Loss: 0.1080
- Precision: 0.8445
- Recall: 0.8740
- F1: 0.8590
- Accuracy: 0.9793
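
As a quick consistency check, the reported F1 is the harmonic mean of the reported precision and recall:

$$
\mathrm{F1} = \frac{2PR}{P + R} = \frac{2 \times 0.8445 \times 0.8740}{0.8445 + 0.8740} \approx 0.8590
$$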

## Model description

More information needed

## Intended uses & limitations

More information needed
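
Pending proper documentation, the sketch below shows how a token-classification checkpoint like this one would typically be loaded with the `transformers` pipeline API. The Hub repo id is hypothetical (the card does not record where the weights are published), and the French example sentence assumes a French model, as the CamemBERT-style name suggests.

```python
# Minimal inference sketch; the repo id below is hypothetical.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="basmazouaoui/Finetuned-camem-ner",  # hypothetical repo id
    aggregation_strategy="simple",             # merge sub-word tokens into entity spans
)

# French example input, assuming a CamemBERT-style French NER model.
print(ner("Emmanuel Macron a visité Marseille en juin."))
```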

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
- mixed_precision_training: Native AMP
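
A minimal sketch of how this configuration maps onto `transformers.TrainingArguments`. The output directory is hypothetical (the card does not record it), and multi-GPU distribution is handled by the launch command (e.g. `torchrun` or `accelerate launch`) rather than by these arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Finetuned-camem-ner",  # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
    fp16=True,                         # "Native AMP" mixed-precision training
)
```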

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.2864        | 0.09  | 100   | 1.2891          | 0.0295    | 0.1206 | 0.0474 | 0.8750   |
| 0.8284        | 0.17  | 200   | 0.5688          | 0.0376    | 0.1252 | 0.0579 | 0.8888   |
| 0.374         | 0.26  | 300   | 0.2753          | 0.1477    | 0.2320 | 0.1805 | 0.9366   |
| 0.2215        | 0.35  | 400   | 0.1742          | 0.3205    | 0.3816 | 0.3484 | 0.9584   |
| 0.1447        | 0.43  | 500   | 0.1271          | 0.6077    | 0.7105 | 0.6551 | 0.9735   |
| 0.1183        | 0.52  | 600   | 0.1067          | 0.7066    | 0.7857 | 0.7440 | 0.9773   |
| 0.108         | 0.61  | 700   | 0.0983          | 0.7236    | 0.8071 | 0.7631 | 0.9779   |
| 0.0978        | 0.69  | 800   | 0.0880          | 0.7678    | 0.8224 | 0.7942 | 0.9789   |
| 0.0897        | 0.78  | 900   | 0.0908          | 0.7970    | 0.8432 | 0.8195 | 0.9797   |
| 0.0799        | 0.87  | 1000  | 0.0883          | 0.8052    | 0.8587 | 0.8311 | 0.9799   |
| 0.0868        | 0.95  | 1100  | 0.0832          | 0.8073    | 0.8622 | 0.8338 | 0.9801   |
| 0.0749        | 1.04  | 1200  | 0.0832          | 0.8138    | 0.8651 | 0.8387 | 0.9800   |
| 0.0765        | 1.13  | 1300  | 0.0844          | 0.8139    | 0.8689 | 0.8405 | 0.9800   |
| 0.0712        | 1.21  | 1400  | 0.0835          | 0.8262    | 0.8636 | 0.8445 | 0.9800   |
| 0.0678        | 1.3   | 1500  | 0.0838          | 0.8228    | 0.8687 | 0.8451 | 0.9801   |
| 0.0699        | 1.39  | 1600  | 0.0850          | 0.8212    | 0.8714 | 0.8455 | 0.9800   |
| 0.0731        | 1.47  | 1700  | 0.0809          | 0.8272    | 0.8709 | 0.8485 | 0.9800   |
| 0.0704        | 1.56  | 1800  | 0.0818          | 0.8400    | 0.8697 | 0.8546 | 0.9803   |
| 0.0749        | 1.65  | 1900  | 0.0820          | 0.8330    | 0.8726 | 0.8523 | 0.9802   |
| 0.0723        | 1.73  | 2000  | 0.0814          | 0.8423    | 0.8709 | 0.8563 | 0.9802   |
| 0.0737        | 1.82  | 2100  | 0.0814          | 0.8312    | 0.8737 | 0.8519 | 0.9801   |
| 0.073         | 1.91  | 2200  | 0.0821          | 0.8347    | 0.8769 | 0.8553 | 0.9799   |
| 0.0617        | 1.99  | 2300  | 0.0830          | 0.8375    | 0.8760 | 0.8563 | 0.9801   |
| 0.0607        | 2.08  | 2400  | 0.0863          | 0.8295    | 0.8803 | 0.8541 | 0.9803   |
| 0.0578        | 2.17  | 2500  | 0.0849          | 0.8365    | 0.8797 | 0.8575 | 0.9803   |
| 0.0546        | 2.25  | 2600  | 0.0854          | 0.8376    | 0.8785 | 0.8576 | 0.9802   |
| 0.0634        | 2.34  | 2700  | 0.0832          | 0.8375    | 0.8764 | 0.8565 | 0.9801   |
| 0.058         | 2.43  | 2800  | 0.0852          | 0.8405    | 0.8748 | 0.8573 | 0.9802   |
| 0.0616        | 2.51  | 2900  | 0.0851          | 0.8378    | 0.8796 | 0.8582 | 0.9800   |
| 0.0585        | 2.6   | 3000  | 0.0845          | 0.8434    | 0.8785 | 0.8606 | 0.9800   |
| 0.0542        | 2.69  | 3100  | 0.0847          | 0.8471    | 0.8773 | 0.8619 | 0.9801   |
| 0.0617        | 2.77  | 3200  | 0.0869          | 0.8396    | 0.8765 | 0.8577 | 0.9799   |
| 0.0634        | 2.86  | 3300  | 0.0828          | 0.8338    | 0.8773 | 0.8550 | 0.9796   |
| 0.0593        | 2.95  | 3400  | 0.0855          | 0.8360    | 0.8789 | 0.8569 | 0.9798   |
| 0.0486        | 3.03  | 3500  | 0.0888          | 0.8439    | 0.8781 | 0.8606 | 0.9801   |
| 0.0549        | 3.12  | 3600  | 0.0886          | 0.8444    | 0.8793 | 0.8615 | 0.9798   |
| 0.0499        | 3.21  | 3700  | 0.0925          | 0.8462    | 0.8771 | 0.8613 | 0.9800   |
| 0.0484        | 3.29  | 3800  | 0.0913          | 0.8449    | 0.8773 | 0.8608 | 0.9798   |
| 0.049         | 3.38  | 3900  | 0.0927          | 0.8409    | 0.8774 | 0.8588 | 0.9796   |
| 0.05          | 3.47  | 4000  | 0.0900          | 0.8468    | 0.8780 | 0.8621 | 0.9800   |
| 0.0456        | 3.55  | 4100  | 0.0904          | 0.8464    | 0.8787 | 0.8623 | 0.9801   |
| 0.051         | 3.64  | 4200  | 0.0911          | 0.8411    | 0.8778 | 0.8591 | 0.9798   |
| 0.0507        | 3.73  | 4300  | 0.0921          | 0.8457    | 0.8768 | 0.8610 | 0.9797   |
| 0.0526        | 3.81  | 4400  | 0.0888          | 0.8453    | 0.8774 | 0.8610 | 0.9801   |
| 0.0494        | 3.9   | 4500  | 0.0892          | 0.8440    | 0.8785 | 0.8609 | 0.9800   |
| 0.0513        | 3.99  | 4600  | 0.0901          | 0.8392    | 0.8811 | 0.8597 | 0.9796   |
| 0.0479        | 4.07  | 4700  | 0.0914          | 0.8461    | 0.8781 | 0.8618 | 0.9798   |
| 0.0408        | 4.16  | 4800  | 0.0938          | 0.8518    | 0.8724 | 0.8620 | 0.9797   |
| 0.0446        | 4.25  | 4900  | 0.0926          | 0.8475    | 0.8766 | 0.8618 | 0.9797   |
| 0.0425        | 4.33  | 5000  | 0.0927          | 0.8434    | 0.8762 | 0.8595 | 0.9795   |
| 0.0428        | 4.42  | 5100  | 0.0966          | 0.8473    | 0.8788 | 0.8628 | 0.9799   |
| 0.045         | 4.51  | 5200  | 0.0941          | 0.8428    | 0.8787 | 0.8604 | 0.9795   |
| 0.0472        | 4.59  | 5300  | 0.0894          | 0.8436    | 0.8757 | 0.8593 | 0.9794   |
| 0.0436        | 4.68  | 5400  | 0.0961          | 0.8464    | 0.8755 | 0.8607 | 0.9800   |
| 0.0466        | 4.77  | 5500  | 0.0947          | 0.8451    | 0.8767 | 0.8606 | 0.9797   |
| 0.0438        | 4.85  | 5600  | 0.0951          | 0.8398    | 0.8779 | 0.8584 | 0.9795   |
| 0.0444        | 4.94  | 5700  | 0.0965          | 0.8431    | 0.8767 | 0.8596 | 0.9797   |
| 0.0444        | 5.03  | 5800  | 0.0929          | 0.8421    | 0.8780 | 0.8597 | 0.9798   |
| 0.0382        | 5.11  | 5900  | 0.0983          | 0.8460    | 0.8772 | 0.8613 | 0.9796   |
| 0.0388        | 5.2   | 6000  | 0.0979          | 0.8406    | 0.8806 | 0.8601 | 0.9797   |
| 0.0434        | 5.29  | 6100  | 0.0963          | 0.8463    | 0.8783 | 0.8620 | 0.9795   |
| 0.038         | 5.37  | 6200  | 0.0977          | 0.8457    | 0.8774 | 0.8612 | 0.9795   |
| 0.0406        | 5.46  | 6300  | 0.0970          | 0.8454    | 0.8780 | 0.8614 | 0.9796   |
| 0.0415        | 5.55  | 6400  | 0.0971          | 0.8442    | 0.8769 | 0.8602 | 0.9795   |
| 0.037         | 5.63  | 6500  | 0.1001          | 0.8448    | 0.8771 | 0.8607 | 0.9794   |
| 0.0375        | 5.72  | 6600  | 0.1000          | 0.8448    | 0.8744 | 0.8593 | 0.9794   |
| 0.0414        | 5.81  | 6700  | 0.0955          | 0.8478    | 0.8745 | 0.8609 | 0.9794   |
| 0.0422        | 5.89  | 6800  | 0.0966          | 0.8482    | 0.8746 | 0.8612 | 0.9794   |
| 0.04          | 5.98  | 6900  | 0.0995          | 0.8410    | 0.8776 | 0.8589 | 0.9795   |
| 0.0367        | 6.07  | 7000  | 0.1008          | 0.8460    | 0.8757 | 0.8606 | 0.9795   |
| 0.0385        | 6.15  | 7100  | 0.1025          | 0.8428    | 0.8766 | 0.8593 | 0.9793   |
| 0.039         | 6.24  | 7200  | 0.1003          | 0.8424    | 0.8766 | 0.8592 | 0.9794   |
| 0.0344        | 6.33  | 7300  | 0.1047          | 0.8421    | 0.8784 | 0.8599 | 0.9794   |
| 0.0346        | 6.41  | 7400  | 0.1022          | 0.8419    | 0.8780 | 0.8596 | 0.9793   |
| 0.0379        | 6.5   | 7500  | 0.0978          | 0.8467    | 0.8772 | 0.8617 | 0.9797   |
| 0.0358        | 6.59  | 7600  | 0.1018          | 0.8446    | 0.8767 | 0.8603 | 0.9792   |
| 0.0363        | 6.67  | 7700  | 0.1001          | 0.8432    | 0.8768 | 0.8597 | 0.9792   |
| 0.0378        | 6.76  | 7800  | 0.1030          | 0.8456    | 0.8767 | 0.8609 | 0.9794   |
| 0.0403        | 6.85  | 7900  | 0.0971          | 0.8418    | 0.8761 | 0.8586 | 0.9793   |
| 0.0352        | 6.93  | 8000  | 0.1035          | 0.8456    | 0.8757 | 0.8604 | 0.9793   |
| 0.0332        | 7.02  | 8100  | 0.1021          | 0.8450    | 0.8755 | 0.8600 | 0.9792   |
| 0.0371        | 7.11  | 8200  | 0.1032          | 0.8478    | 0.8746 | 0.8610 | 0.9794   |
| 0.034         | 7.19  | 8300  | 0.1037          | 0.8467    | 0.8738 | 0.8600 | 0.9794   |
| 0.033         | 7.28  | 8400  | 0.1037          | 0.8457    | 0.8747 | 0.8599 | 0.9793   |
| 0.0329        | 7.37  | 8500  | 0.1048          | 0.8459    | 0.8751 | 0.8602 | 0.9791   |
| 0.0317        | 7.45  | 8600  | 0.1074          | 0.8441    | 0.8757 | 0.8596 | 0.9792   |
| 0.0319        | 7.54  | 8700  | 0.1056          | 0.8437    | 0.8753 | 0.8592 | 0.9792   |
| 0.0335        | 7.63  | 8800  | 0.1034          | 0.8446    | 0.8736 | 0.8589 | 0.9793   |
| 0.0346        | 7.71  | 8900  | 0.1069          | 0.8461    | 0.8735 | 0.8596 | 0.9792   |
| 0.0342        | 7.8   | 9000  | 0.1031          | 0.8427    | 0.8757 | 0.8589 | 0.9793   |
| 0.0371        | 7.89  | 9100  | 0.1024          | 0.8438    | 0.8747 | 0.8590 | 0.9793   |
| 0.0384        | 7.97  | 9200  | 0.1032          | 0.8472    | 0.8746 | 0.8607 | 0.9795   |
| 0.0308        | 8.06  | 9300  | 0.1070          | 0.8449    | 0.8753 | 0.8598 | 0.9793   |
| 0.0318        | 8.15  | 9400  | 0.1070          | 0.8459    | 0.8738 | 0.8596 | 0.9794   |
| 0.0285        | 8.23  | 9500  | 0.1077          | 0.8474    | 0.8751 | 0.8610 | 0.9794   |
| 0.0334        | 8.32  | 9600  | 0.1066          | 0.8443    | 0.8757 | 0.8598 | 0.9793   |
| 0.0332        | 8.41  | 9700  | 0.1055          | 0.8462    | 0.8747 | 0.8602 | 0.9793   |
| 0.0341        | 8.49  | 9800  | 0.1056          | 0.8442    | 0.8749 | 0.8593 | 0.9793   |
| 0.0304        | 8.58  | 9900  | 0.1066          | 0.8447    | 0.8729 | 0.8586 | 0.9792   |
| 0.0353        | 8.67  | 10000 | 0.1057          | 0.8446    | 0.8741 | 0.8591 | 0.9792   |
| 0.0348        | 8.75  | 10100 | 0.1051          | 0.8443    | 0.8736 | 0.8587 | 0.9792   |
| 0.0326        | 8.84  | 10200 | 0.1047          | 0.8443    | 0.8757 | 0.8597 | 0.9793   |
| 0.0332        | 8.93  | 10300 | 0.1044          | 0.8461    | 0.8732 | 0.8594 | 0.9793   |
| 0.0328        | 9.01  | 10400 | 0.1053          | 0.8438    | 0.8744 | 0.8588 | 0.9792   |
| 0.0318        | 9.1   | 10500 | 0.1072          | 0.8415    | 0.8746 | 0.8577 | 0.9793   |
| 0.0296        | 9.19  | 10600 | 0.1084          | 0.8431    | 0.8743 | 0.8584 | 0.9793   |
| 0.0324        | 9.27  | 10700 | 0.1074          | 0.8448    | 0.8746 | 0.8594 | 0.9794   |
| 0.0326        | 9.36  | 10800 | 0.1080          | 0.8439    | 0.8752 | 0.8593 | 0.9793   |
| 0.0288        | 9.45  | 10900 | 0.1084          | 0.8451    | 0.8739 | 0.8593 | 0.9794   |
| 0.0314        | 9.53  | 11000 | 0.1082          | 0.8450    | 0.8746 | 0.8596 | 0.9794   |
| 0.0292        | 9.62  | 11100 | 0.1084          | 0.8446    | 0.8740 | 0.8590 | 0.9794   |
| 0.0328        | 9.71  | 11200 | 0.1080          | 0.8447    | 0.8741 | 0.8591 | 0.9794   |
| 0.0313        | 9.79  | 11300 | 0.1080          | 0.8439    | 0.8747 | 0.8590 | 0.9794   |
| 0.0295        | 9.88  | 11400 | 0.1080          | 0.8445    | 0.8739 | 0.8589 | 0.9793   |
| 0.0316        | 9.97  | 11500 | 0.1080          | 0.8445    | 0.8740 | 0.8590 | 0.9793   |
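
The Precision/Recall/F1 columns above are entity-level scores of the kind that `generated_from_trainer` NER runs typically compute with `seqeval` (an assumption; the card does not name the metric library). A minimal sketch of that scoring, on toy tags rather than data from this run:

```python
# Hedged sketch of seqeval-style entity-level scoring for NER.
from seqeval.metrics import precision_score, recall_score, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"]]  # gold: PER + LOC entities
y_pred = [["B-PER", "I-PER", "O", "O"]]      # predicted: PER only

print(precision_score(y_true, y_pred))  # 1.0  (1 predicted entity, 1 correct)
print(recall_score(y_true, y_pred))     # 0.5  (2 gold entities, 1 found)
print(f1_score(y_true, y_pred))         # ~0.667 (harmonic mean)
```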

### Framework versions

- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2