
stanford-deidentifier-base-finetuned-ner

This model is a fine-tuned version of StanfordAIMI/stanford-deidentifier-base on an unknown dataset. It achieves the following results on the evaluation set (a quick consistency check on the reported averages follows the list):

  • Loss: 0.6522
  • Class 0 Precision: 0.9766
  • Class 0 Recall: 0.9646
  • Class 0 F1-score: 0.9706
  • Class 1 Precision: 0.8268
  • Class 1 Recall: 0.8689
  • Class 1 F1-score: 0.8473
  • Class 2 Precision: 0.8419
  • Class 2 Recall: 0.8916
  • Class 2 F1-score: 0.8660
  • Class 3 Precision: 0.8394
  • Class 3 Recall: 0.8975
  • Class 3 F1-score: 0.8675
  • Accuracy: 0.9507
  • Macro avg Precision: 0.8712
  • Macro avg Recall: 0.9057
  • Macro avg F1-score: 0.8878
  • Weighted avg Precision: 0.9521
  • Weighted avg Recall: 0.9507
  • Weighted avg F1-score: 0.9513
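
As a quick consistency check, the macro-averaged metrics above are the unweighted means of the four per-class scores (the weighted averages additionally weight each class by its support, which the card does not report). A short illustration for precision:

```python
# Sanity check (illustrative, not from the training script): the macro
# average is the unweighted mean of the four per-class scores.
per_class_precision = [0.9766, 0.8268, 0.8419, 0.8394]

macro_precision = sum(per_class_precision) / len(per_class_precision)
print(f"{macro_precision:.4f}")  # 0.8712, matching the reported Macro avg Precision
```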

Model description

More information needed

Intended uses & limitations

More information needed
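
Although the card does not yet document intended uses, the checkpoint can be loaded as a standard transformers token classifier. Below is a minimal usage sketch with an illustrative input sentence; the model predicts the numeric classes 0–3 reported above, which may render as generic LABEL_n names unless id2label was customized:

```python
from transformers import pipeline

# Minimal usage sketch (assumed, not documented by the card). Label names
# come from the checkpoint's config and may be generic (LABEL_0..LABEL_3),
# since the fine-tuning dataset is unknown.
ner = pipeline(
    "token-classification",
    model="antoineedy/stanford-deidentifier-base-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

print(ner("Patient John Smith was seen by Dr. Lee on 2024-03-14."))
```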

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
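
For reference, these settings correspond roughly to the TrainingArguments below (a sketch, not the author's actual training script):

```python
from transformers import TrainingArguments

# Sketch of the arguments implied by the list above. output_dir is a
# hypothetical placeholder; the Adam betas/epsilon are passed explicitly
# even though they match the transformers defaults.
training_args = TrainingArguments(
    output_dir="stanford-deidentifier-base-finetuned-ner",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=60,
)
```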

Training results

| Training Loss | Epoch | Step | Validation Loss | Class 0 Precision | Class 0 Recall | Class 0 F1-score | Class 1 Precision | Class 1 Recall | Class 1 F1-score | Class 2 Precision | Class 2 Recall | Class 2 F1-score | Class 3 Precision | Class 3 Recall | Class 3 F1-score | Accuracy | Macro avg Precision | Macro avg Recall | Macro avg F1-score | Weighted avg Precision | Weighted avg Recall | Weighted avg F1-score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 67 | 0.2686 | 0.9927 | 0.8966 | 0.9422 | 0.5702 | 0.9369 | 0.7089 | 0.5731 | 0.9458 | 0.7138 | 0.7626 | 0.8809 | 0.8175 | 0.9003 | 0.7246 | 0.9151 | 0.7956 | 0.9322 | 0.9003 | 0.9091 |
| No log | 2.0 | 134 | 0.2344 | 0.9922 | 0.9195 | 0.9544 | 0.6492 | 0.9296 | 0.7645 | 0.6989 | 0.9261 | 0.7966 | 0.7478 | 0.9529 | 0.8380 | 0.9226 | 0.7720 | 0.9320 | 0.8384 | 0.9411 | 0.9226 | 0.9275 |
| No log | 3.0 | 201 | 0.2085 | 0.9944 | 0.9139 | 0.9525 | 0.6088 | 0.9709 | 0.7484 | 0.7197 | 0.9360 | 0.8137 | 0.7834 | 0.9418 | 0.8553 | 0.9207 | 0.7766 | 0.9407 | 0.8425 | 0.9430 | 0.9207 | 0.9265 |
| No log | 4.0 | 268 | 0.2154 | 0.9925 | 0.9342 | 0.9625 | 0.6791 | 0.9709 | 0.7992 | 0.7966 | 0.9261 | 0.8565 | 0.8028 | 0.9474 | 0.8691 | 0.9374 | 0.8178 | 0.9446 | 0.8718 | 0.9506 | 0.9374 | 0.9408 |
| No log | 5.0 | 335 | 0.2524 | 0.9892 | 0.9381 | 0.9630 | 0.6934 | 0.9442 | 0.7996 | 0.7602 | 0.9212 | 0.8330 | 0.8200 | 0.9335 | 0.8731 | 0.9376 | 0.8157 | 0.9342 | 0.8671 | 0.9486 | 0.9376 | 0.9407 |
| No log | 6.0 | 402 | 0.2449 | 0.9908 | 0.9383 | 0.9638 | 0.7013 | 0.9515 | 0.8074 | 0.7705 | 0.9261 | 0.8412 | 0.8052 | 0.9391 | 0.8670 | 0.9389 | 0.8169 | 0.9387 | 0.8698 | 0.9499 | 0.9389 | 0.9418 |
| No log | 7.0 | 469 | 0.2695 | 0.9877 | 0.9426 | 0.9646 | 0.7282 | 0.9102 | 0.8091 | 0.7519 | 0.9557 | 0.8416 | 0.8153 | 0.9418 | 0.8740 | 0.9406 | 0.8208 | 0.9376 | 0.8723 | 0.9493 | 0.9406 | 0.9431 |
| 0.196 | 8.0 | 536 | 0.3949 | 0.9802 | 0.9596 | 0.9698 | 0.7948 | 0.8835 | 0.8368 | 0.8190 | 0.8916 | 0.8538 | 0.8430 | 0.9224 | 0.8810 | 0.9493 | 0.8592 | 0.9143 | 0.8853 | 0.9521 | 0.9493 | 0.9503 |
| 0.196 | 9.0 | 603 | 0.3717 | 0.9810 | 0.9581 | 0.9694 | 0.7918 | 0.8859 | 0.8362 | 0.8097 | 0.9015 | 0.8531 | 0.8291 | 0.9141 | 0.8696 | 0.9480 | 0.8529 | 0.9149 | 0.8821 | 0.9514 | 0.9480 | 0.9492 |
| 0.196 | 10.0 | 670 | 0.3790 | 0.9808 | 0.9509 | 0.9656 | 0.7426 | 0.9102 | 0.8179 | 0.7895 | 0.8867 | 0.8353 | 0.8264 | 0.8837 | 0.8541 | 0.9413 | 0.8348 | 0.9079 | 0.8682 | 0.9468 | 0.9413 | 0.9431 |
| 0.196 | 11.0 | 737 | 0.4844 | 0.9738 | 0.9671 | 0.9704 | 0.8519 | 0.8519 | 0.8519 | 0.8389 | 0.8719 | 0.8551 | 0.8286 | 0.8837 | 0.8552 | 0.9500 | 0.8733 | 0.8937 | 0.8832 | 0.9508 | 0.9500 | 0.9503 |
| 0.196 | 12.0 | 804 | 0.3634 | 0.9858 | 0.9515 | 0.9684 | 0.7669 | 0.9102 | 0.8324 | 0.8161 | 0.8966 | 0.8545 | 0.8103 | 0.9584 | 0.8782 | 0.9470 | 0.8448 | 0.9292 | 0.8834 | 0.9526 | 0.9470 | 0.9486 |
| 0.196 | 13.0 | 871 | 0.5154 | 0.9773 | 0.9663 | 0.9718 | 0.8460 | 0.8665 | 0.8561 | 0.8411 | 0.8867 | 0.8633 | 0.8372 | 0.9114 | 0.8727 | 0.9526 | 0.8754 | 0.9077 | 0.8910 | 0.9539 | 0.9526 | 0.9531 |
| 0.196 | 14.0 | 938 | 0.4340 | 0.9827 | 0.9605 | 0.9715 | 0.8125 | 0.8835 | 0.8465 | 0.8251 | 0.9064 | 0.8638 | 0.8297 | 0.9446 | 0.8834 | 0.9519 | 0.8625 | 0.9237 | 0.8913 | 0.9549 | 0.9519 | 0.9529 |
| 0.0268 | 15.0 | 1005 | 0.4523 | 0.9818 | 0.9588 | 0.9702 | 0.7996 | 0.8811 | 0.8383 | 0.8153 | 0.8916 | 0.8518 | 0.8293 | 0.9418 | 0.8820 | 0.9496 | 0.8565 | 0.9183 | 0.8856 | 0.9529 | 0.9496 | 0.9507 |
| 0.0268 | 16.0 | 1072 | 0.4877 | 0.9777 | 0.9628 | 0.9702 | 0.8129 | 0.8859 | 0.8479 | 0.8219 | 0.8867 | 0.8531 | 0.8417 | 0.8837 | 0.8622 | 0.9495 | 0.8635 | 0.9048 | 0.8833 | 0.9514 | 0.9495 | 0.9502 |
| 0.0268 | 17.0 | 1139 | 0.5068 | 0.9817 | 0.9609 | 0.9712 | 0.8118 | 0.8689 | 0.8394 | 0.8311 | 0.8966 | 0.8626 | 0.8313 | 0.9557 | 0.8892 | 0.9516 | 0.8640 | 0.9205 | 0.8906 | 0.9543 | 0.9516 | 0.9525 |
| 0.0268 | 18.0 | 1206 | 0.5995 | 0.9725 | 0.9684 | 0.9705 | 0.8430 | 0.8471 | 0.8450 | 0.8429 | 0.8719 | 0.8571 | 0.8387 | 0.8643 | 0.8513 | 0.9495 | 0.8743 | 0.8879 | 0.8810 | 0.9499 | 0.9495 | 0.9497 |
| 0.0268 | 19.0 | 1273 | 0.6119 | 0.9750 | 0.9663 | 0.9706 | 0.8401 | 0.8544 | 0.8472 | 0.8436 | 0.8768 | 0.8599 | 0.8299 | 0.8920 | 0.8598 | 0.9502 | 0.8721 | 0.8974 | 0.8844 | 0.9512 | 0.9502 | 0.9506 |
| 0.0268 | 20.0 | 1340 | 0.6114 | 0.9740 | 0.9667 | 0.9703 | 0.8353 | 0.8617 | 0.8483 | 0.8585 | 0.8670 | 0.8627 | 0.8346 | 0.8809 | 0.8571 | 0.9500 | 0.8756 | 0.8941 | 0.8846 | 0.9508 | 0.9500 | 0.9504 |
| 0.0268 | 21.0 | 1407 | 0.5467 | 0.9800 | 0.9641 | 0.9720 | 0.8284 | 0.8908 | 0.8585 | 0.8436 | 0.8768 | 0.8599 | 0.8392 | 0.9252 | 0.8801 | 0.9532 | 0.8728 | 0.9142 | 0.8926 | 0.9551 | 0.9532 | 0.9539 |
| 0.0268 | 22.0 | 1474 | 0.5370 | 0.9779 | 0.9618 | 0.9697 | 0.8266 | 0.8908 | 0.8575 | 0.8257 | 0.8867 | 0.8551 | 0.8235 | 0.8920 | 0.8564 | 0.9495 | 0.8634 | 0.9078 | 0.8847 | 0.9515 | 0.9495 | 0.9502 |
| 0.007 | 23.0 | 1541 | 0.6169 | 0.9739 | 0.9658 | 0.9699 | 0.8480 | 0.8665 | 0.8571 | 0.8429 | 0.8719 | 0.8571 | 0.8203 | 0.8726 | 0.8456 | 0.9493 | 0.8713 | 0.8942 | 0.8824 | 0.9503 | 0.9493 | 0.9497 |
| 0.007 | 24.0 | 1608 | 0.6113 | 0.9744 | 0.9658 | 0.9701 | 0.8269 | 0.8811 | 0.8531 | 0.8725 | 0.8768 | 0.8747 | 0.8289 | 0.8587 | 0.8435 | 0.9496 | 0.8757 | 0.8956 | 0.8854 | 0.9507 | 0.9496 | 0.9501 |
| 0.007 | 25.0 | 1675 | 0.5969 | 0.9775 | 0.9656 | 0.9715 | 0.8440 | 0.8665 | 0.8551 | 0.8161 | 0.8966 | 0.8545 | 0.8398 | 0.9003 | 0.8690 | 0.9518 | 0.8694 | 0.9072 | 0.8875 | 0.9532 | 0.9518 | 0.9523 |
| 0.007 | 26.0 | 1742 | 0.5509 | 0.9811 | 0.9620 | 0.9714 | 0.8288 | 0.8811 | 0.8541 | 0.8249 | 0.8818 | 0.8524 | 0.8180 | 0.9335 | 0.8719 | 0.9514 | 0.8632 | 0.9146 | 0.8875 | 0.9540 | 0.9514 | 0.9523 |
| 0.007 | 27.0 | 1809 | 0.5003 | 0.9818 | 0.9567 | 0.9691 | 0.7780 | 0.8932 | 0.8316 | 0.8243 | 0.9015 | 0.8612 | 0.8304 | 0.9224 | 0.8740 | 0.9479 | 0.8536 | 0.9184 | 0.8840 | 0.9517 | 0.9479 | 0.9491 |
| 0.007 | 28.0 | 1876 | 0.5675 | 0.9806 | 0.9611 | 0.9708 | 0.8039 | 0.8956 | 0.8473 | 0.8429 | 0.8719 | 0.8571 | 0.8275 | 0.9169 | 0.8699 | 0.9503 | 0.8637 | 0.9114 | 0.8863 | 0.9530 | 0.9503 | 0.9513 |
| 0.007 | 29.0 | 1943 | 0.6114 | 0.9806 | 0.9616 | 0.9710 | 0.8053 | 0.8835 | 0.8426 | 0.8551 | 0.8719 | 0.8634 | 0.8260 | 0.9335 | 0.8765 | 0.9509 | 0.8667 | 0.9126 | 0.8884 | 0.9535 | 0.9509 | 0.9518 |
| 0.0025 | 30.0 | 2010 | 0.6773 | 0.9741 | 0.9654 | 0.9698 | 0.8172 | 0.8786 | 0.8468 | 0.8737 | 0.8522 | 0.8628 | 0.8329 | 0.8698 | 0.8509 | 0.9489 | 0.8745 | 0.8915 | 0.8826 | 0.9501 | 0.9489 | 0.9494 |
| 0.0025 | 31.0 | 2077 | 0.5380 | 0.9791 | 0.9601 | 0.9695 | 0.7996 | 0.8908 | 0.8427 | 0.8326 | 0.8818 | 0.8565 | 0.8321 | 0.9058 | 0.8674 | 0.9488 | 0.8608 | 0.9096 | 0.8840 | 0.9514 | 0.9488 | 0.9497 |
| 0.0025 | 32.0 | 2144 | 0.5114 | 0.9825 | 0.9601 | 0.9712 | 0.8102 | 0.8908 | 0.8486 | 0.8296 | 0.9113 | 0.8685 | 0.8280 | 0.9335 | 0.8776 | 0.9516 | 0.8626 | 0.9239 | 0.8915 | 0.9546 | 0.9516 | 0.9526 |
| 0.0025 | 33.0 | 2211 | 0.5792 | 0.9785 | 0.9611 | 0.9697 | 0.8013 | 0.8908 | 0.8437 | 0.8429 | 0.8719 | 0.8571 | 0.8286 | 0.8975 | 0.8617 | 0.9488 | 0.8628 | 0.9053 | 0.8831 | 0.9512 | 0.9488 | 0.9496 |
| 0.0025 | 34.0 | 2278 | 0.6516 | 0.9748 | 0.9658 | 0.9703 | 0.8361 | 0.8665 | 0.8510 | 0.8396 | 0.8768 | 0.8578 | 0.8342 | 0.8781 | 0.8556 | 0.9498 | 0.8712 | 0.8968 | 0.8837 | 0.9509 | 0.9498 | 0.9503 |
| 0.0025 | 35.0 | 2345 | 0.5294 | 0.9823 | 0.9599 | 0.9709 | 0.8071 | 0.8835 | 0.8436 | 0.8227 | 0.8916 | 0.8558 | 0.8301 | 0.9474 | 0.8849 | 0.9511 | 0.8606 | 0.9206 | 0.8888 | 0.9541 | 0.9511 | 0.9521 |
| 0.0025 | 36.0 | 2412 | 0.6674 | 0.9759 | 0.9665 | 0.9711 | 0.8469 | 0.8592 | 0.8530 | 0.8364 | 0.8818 | 0.8585 | 0.8303 | 0.8947 | 0.8613 | 0.9511 | 0.8724 | 0.9006 | 0.8860 | 0.9522 | 0.9511 | 0.9515 |
| 0.0025 | 37.0 | 2479 | 0.6354 | 0.9753 | 0.9626 | 0.9689 | 0.8141 | 0.8714 | 0.8417 | 0.8310 | 0.8719 | 0.8510 | 0.8329 | 0.8837 | 0.8575 | 0.9477 | 0.8633 | 0.8974 | 0.8798 | 0.9493 | 0.9477 | 0.9483 |
| 0.0014 | 38.0 | 2546 | 0.6787 | 0.9746 | 0.9658 | 0.9702 | 0.8526 | 0.8422 | 0.8474 | 0.8241 | 0.8768 | 0.8496 | 0.8253 | 0.9030 | 0.8624 | 0.9496 | 0.8691 | 0.8970 | 0.8824 | 0.9508 | 0.9496 | 0.9500 |
| 0.0014 | 39.0 | 2613 | 0.7050 | 0.9739 | 0.9646 | 0.9692 | 0.8279 | 0.8641 | 0.8456 | 0.8462 | 0.8670 | 0.8564 | 0.8303 | 0.8809 | 0.8548 | 0.9484 | 0.8696 | 0.8941 | 0.8815 | 0.9495 | 0.9484 | 0.9489 |
| 0.0014 | 40.0 | 2680 | 0.6279 | 0.9781 | 0.9618 | 0.9699 | 0.8205 | 0.8762 | 0.8474 | 0.8174 | 0.8818 | 0.8483 | 0.8278 | 0.9058 | 0.8651 | 0.9491 | 0.8609 | 0.9064 | 0.8827 | 0.9512 | 0.9491 | 0.9499 |
| 0.0014 | 41.0 | 2747 | 0.6812 | 0.9779 | 0.9658 | 0.9719 | 0.8436 | 0.8641 | 0.8537 | 0.8333 | 0.8867 | 0.8592 | 0.8333 | 0.9141 | 0.8719 | 0.9523 | 0.8721 | 0.9077 | 0.8892 | 0.9538 | 0.9523 | 0.9528 |
| 0.0014 | 42.0 | 2814 | 0.6036 | 0.9804 | 0.9594 | 0.9698 | 0.7898 | 0.9029 | 0.8426 | 0.8443 | 0.8818 | 0.8627 | 0.8295 | 0.9030 | 0.8647 | 0.9489 | 0.8610 | 0.9118 | 0.8849 | 0.9520 | 0.9489 | 0.9500 |
| 0.0014 | 43.0 | 2881 | 0.6358 | 0.9779 | 0.9622 | 0.9700 | 0.8075 | 0.8859 | 0.8449 | 0.8524 | 0.8818 | 0.8668 | 0.8329 | 0.8975 | 0.8640 | 0.9496 | 0.8677 | 0.9069 | 0.8864 | 0.9517 | 0.9496 | 0.9504 |
| 0.0014 | 44.0 | 2948 | 0.6128 | 0.9800 | 0.9626 | 0.9712 | 0.8121 | 0.8811 | 0.8452 | 0.8426 | 0.8966 | 0.8687 | 0.8384 | 0.9197 | 0.8771 | 0.9516 | 0.8683 | 0.9150 | 0.8906 | 0.9538 | 0.9516 | 0.9524 |
| 0.001 | 45.0 | 3015 | 0.7238 | 0.9744 | 0.9663 | 0.9703 | 0.8318 | 0.8641 | 0.8476 | 0.8689 | 0.8818 | 0.8753 | 0.8373 | 0.8837 | 0.8598 | 0.9505 | 0.8781 | 0.8989 | 0.8883 | 0.9515 | 0.9505 | 0.9509 |
| 0.001 | 46.0 | 3082 | 0.5885 | 0.9811 | 0.9620 | 0.9714 | 0.8157 | 0.8811 | 0.8471 | 0.8161 | 0.8966 | 0.8545 | 0.8421 | 0.9307 | 0.8842 | 0.9518 | 0.8638 | 0.9176 | 0.8893 | 0.9542 | 0.9518 | 0.9526 |
| 0.001 | 47.0 | 3149 | 0.6924 | 0.9744 | 0.9654 | 0.9699 | 0.8245 | 0.8665 | 0.8450 | 0.8436 | 0.8768 | 0.8599 | 0.8373 | 0.8698 | 0.8533 | 0.9489 | 0.8699 | 0.8946 | 0.8820 | 0.9500 | 0.9489 | 0.9494 |
| 0.001 | 48.0 | 3216 | 0.5978 | 0.9798 | 0.9609 | 0.9702 | 0.8040 | 0.8859 | 0.8430 | 0.8287 | 0.8818 | 0.8544 | 0.8333 | 0.9141 | 0.8719 | 0.9496 | 0.8614 | 0.9107 | 0.8849 | 0.9522 | 0.9496 | 0.9505 |
| 0.001 | 49.0 | 3283 | 0.5363 | 0.9823 | 0.9592 | 0.9706 | 0.8013 | 0.8908 | 0.8437 | 0.8222 | 0.9113 | 0.8645 | 0.8313 | 0.9280 | 0.8770 | 0.9505 | 0.8593 | 0.9223 | 0.8889 | 0.9537 | 0.9505 | 0.9516 |
| 0.001 | 50.0 | 3350 | 0.5867 | 0.9793 | 0.9616 | 0.9704 | 0.8093 | 0.8859 | 0.8459 | 0.8265 | 0.8916 | 0.8578 | 0.8363 | 0.9058 | 0.8697 | 0.9500 | 0.8629 | 0.9112 | 0.8859 | 0.9524 | 0.9500 | 0.9508 |
| 0.001 | 51.0 | 3417 | 0.6383 | 0.9775 | 0.9643 | 0.9709 | 0.8238 | 0.8738 | 0.8481 | 0.8404 | 0.8818 | 0.8606 | 0.8355 | 0.9003 | 0.8667 | 0.9507 | 0.8693 | 0.9050 | 0.8865 | 0.9523 | 0.9507 | 0.9513 |
| 0.001 | 52.0 | 3484 | 0.6261 | 0.9779 | 0.9626 | 0.9702 | 0.8071 | 0.8835 | 0.8436 | 0.8436 | 0.8768 | 0.8599 | 0.8346 | 0.8947 | 0.8636 | 0.9495 | 0.8658 | 0.9044 | 0.8843 | 0.9515 | 0.9495 | 0.9502 |
| 0.0008 | 53.0 | 3551 | 0.6686 | 0.9763 | 0.9663 | 0.9712 | 0.8361 | 0.8665 | 0.8510 | 0.8647 | 0.8818 | 0.8732 | 0.8308 | 0.8975 | 0.8628 | 0.9516 | 0.8770 | 0.9030 | 0.8896 | 0.9528 | 0.9516 | 0.9521 |
| 0.0008 | 54.0 | 3618 | 0.6354 | 0.9785 | 0.9641 | 0.9713 | 0.8291 | 0.8714 | 0.8497 | 0.8333 | 0.8867 | 0.8592 | 0.8333 | 0.9141 | 0.8719 | 0.9514 | 0.8686 | 0.9091 | 0.8880 | 0.9532 | 0.9514 | 0.9521 |
| 0.0008 | 55.0 | 3685 | 0.6611 | 0.9758 | 0.9650 | 0.9704 | 0.8287 | 0.8689 | 0.8483 | 0.8483 | 0.8818 | 0.8647 | 0.8338 | 0.8892 | 0.8606 | 0.9502 | 0.8717 | 0.9012 | 0.8860 | 0.9515 | 0.9502 | 0.9507 |
| 0.0008 | 56.0 | 3752 | 0.6380 | 0.9766 | 0.9637 | 0.9701 | 0.8200 | 0.8738 | 0.8461 | 0.8451 | 0.8867 | 0.8654 | 0.8342 | 0.8920 | 0.8621 | 0.9498 | 0.8690 | 0.9040 | 0.8859 | 0.9514 | 0.9498 | 0.9504 |
| 0.0008 | 57.0 | 3819 | 0.6418 | 0.9768 | 0.9635 | 0.9701 | 0.8182 | 0.8738 | 0.8451 | 0.8419 | 0.8916 | 0.8660 | 0.8390 | 0.8947 | 0.8660 | 0.9500 | 0.8690 | 0.9059 | 0.8868 | 0.9516 | 0.9500 | 0.9506 |
| 0.0008 | 58.0 | 3886 | 0.6478 | 0.9770 | 0.9635 | 0.9702 | 0.8182 | 0.8738 | 0.8451 | 0.8419 | 0.8916 | 0.8660 | 0.8394 | 0.8975 | 0.8675 | 0.9502 | 0.8691 | 0.9066 | 0.8872 | 0.9518 | 0.9502 | 0.9508 |
| 0.0008 | 59.0 | 3953 | 0.6498 | 0.9766 | 0.9643 | 0.9705 | 0.8249 | 0.8689 | 0.8463 | 0.8419 | 0.8916 | 0.8660 | 0.8394 | 0.8975 | 0.8675 | 0.9505 | 0.8707 | 0.9056 | 0.8876 | 0.9520 | 0.9505 | 0.9511 |
| 0.0006 | 60.0 | 4020 | 0.6522 | 0.9766 | 0.9646 | 0.9706 | 0.8268 | 0.8689 | 0.8473 | 0.8419 | 0.8916 | 0.8660 | 0.8394 | 0.8975 | 0.8675 | 0.9507 | 0.8712 | 0.9057 | 0.8878 | 0.9521 | 0.9507 | 0.9513 |

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2