
nerui-base-1

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0822
  • Location Precision: 0.9573
  • Location Recall: 0.9655
  • Location F1: 0.9614
  • Location Number: 116
  • Organization Precision: 0.9608
  • Organization Recall: 0.9304
  • Organization F1: 0.9453
  • Organization Number: 158
  • Person Precision: 0.9840
  • Person Recall: 0.9919
  • Person F1: 0.9880
  • Person Number: 124
  • Overall Precision: 0.9671
  • Overall Recall: 0.9598
  • Overall F1: 0.9634
  • Overall Accuracy: 0.9920
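
The following is a minimal inference sketch using the Transformers pipeline API. The repository id below is a placeholder (this card does not state where the checkpoint is hosted), and the tag set is assumed to be a BIO-style scheme over the Location/Organization/Person entity types reported above.

```python
# Minimal inference sketch. Assumptions: replace "path/to/nerui-base-1" with the
# real Hub repository id or a local checkpoint directory; entity types follow the
# LOC/ORG/PER metrics reported in this card.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="path/to/nerui-base-1",   # placeholder id, not confirmed by this card
    aggregation_strategy="simple",  # merge wordpieces into whole entity spans
)

# Indonesian example sentence (the base model is an Indonesian BERT).
print(ner("Joko Widodo lahir di Surakarta."))
```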

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
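
As a rough sketch, the hyperparameters above map onto transformers.TrainingArguments as follows. The training and evaluation datasets are not documented in this card, so they are left as commented placeholders, and the 7-label BIO head is an assumption based on the three entity types reported.

```python
# Sketch only: reproduces the listed hyperparameters with the Trainer API.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased",
    num_labels=7,  # assumption: O + B-/I- tags for LOC, ORG, PER
)

args = TrainingArguments(
    output_dir="nerui-base-1",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    adam_beta1=0.9,    # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# The datasets used are not documented in this card:
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```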

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2668 | 1.0 | 96 | 0.0394 | 0.9145 | 0.9224 | 0.9185 | 116 | 0.9141 | 0.9430 | 0.9283 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9358 | 0.9523 | 0.9440 | 0.9879 |
| 0.0634 | 2.0 | 192 | 0.0460 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9477 | 0.9177 | 0.9325 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9520 | 0.9472 | 0.9496 | 0.9882 |
| 0.0320 | 3.0 | 288 | 0.0441 | 0.9474 | 0.9310 | 0.9391 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9547 | 0.9523 | 0.9535 | 0.9890 |
| 0.0220 | 4.0 | 384 | 0.0442 | 0.9732 | 0.9397 | 0.9561 | 116 | 0.9255 | 0.9430 | 0.9342 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9573 | 0.9573 | 0.9573 | 0.9909 |
| 0.0143 | 5.0 | 480 | 0.0474 | 0.9339 | 0.9741 | 0.9536 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9760 | 0.9839 | 0.9799 | 124 | 0.9598 | 0.9598 | 0.9598 | 0.9898 |
| 0.0122 | 6.0 | 576 | 0.0581 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9662 | 0.9051 | 0.9346 | 158 | 0.9760 | 0.9839 | 0.9799 | 124 | 0.9592 | 0.9447 | 0.9519 | 0.9885 |
| 0.0062 | 7.0 | 672 | 0.0578 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9548 | 0.9367 | 0.9457 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9646 | 0.9598 | 0.9622 | 0.9909 |
| 0.0070 | 8.0 | 768 | 0.0608 | 0.9655 | 0.9655 | 0.9655 | 116 | 0.9551 | 0.9430 | 0.9490 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9673 | 0.9648 | 0.9660 | 0.9901 |
| 0.0049 | 9.0 | 864 | 0.0656 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9530 | 0.8987 | 0.9251 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9567 | 0.9447 | 0.9507 | 0.9874 |
| 0.0056 | 10.0 | 960 | 0.0566 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9423 | 0.9304 | 0.9363 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9573 | 0.9573 | 0.9573 | 0.9896 |
| 0.0046 | 11.0 | 1056 | 0.0709 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9346 | 0.9051 | 0.9196 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9545 | 0.9497 | 0.9521 | 0.9879 |
| 0.0022 | 12.0 | 1152 | 0.0721 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9548 | 0.9367 | 0.9457 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9599 | 0.9623 | 0.9611 | 0.9901 |
| 0.0048 | 13.0 | 1248 | 0.0544 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9490 | 0.9430 | 0.9460 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9575 | 0.9623 | 0.9599 | 0.9920 |
| 0.0029 | 14.0 | 1344 | 0.0602 | 0.9649 | 0.9483 | 0.9565 | 116 | 0.9434 | 0.9494 | 0.9464 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9623 | 0.9623 | 0.9623 | 0.9918 |
| 0.0031 | 15.0 | 1440 | 0.0678 | 0.9478 | 0.9397 | 0.9437 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9571 | 0.9523 | 0.9547 | 0.9904 |
| 0.0039 | 16.0 | 1536 | 0.0820 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9600 | 0.9114 | 0.9351 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9567 | 0.9447 | 0.9507 | 0.9871 |
| 0.0021 | 17.0 | 1632 | 0.0793 | 0.9421 | 0.9828 | 0.9620 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9648 | 0.9648 | 0.9648 | 0.9898 |
| 0.0035 | 18.0 | 1728 | 0.0844 | 0.9310 | 0.9310 | 0.9310 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9521 | 0.9497 | 0.9509 | 0.9879 |
| 0.0039 | 19.0 | 1824 | 0.0907 | 0.9106 | 0.9655 | 0.9372 | 116 | 0.9726 | 0.8987 | 0.9342 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9544 | 0.9472 | 0.9508 | 0.9868 |
| 0.0014 | 20.0 | 1920 | 0.0629 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9622 | 0.9598 | 0.9610 | 0.9912 |
| 0.0019 | 21.0 | 2016 | 0.0655 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9672 | 0.9623 | 0.9647 | 0.9909 |
| 0.0021 | 22.0 | 2112 | 0.0593 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9551 | 0.9623 | 0.9587 | 0.9915 |
| 0.0038 | 23.0 | 2208 | 0.0698 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9572 | 0.9548 | 0.9560 | 0.9890 |
| 0.0024 | 24.0 | 2304 | 0.0686 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9901 |
| 0.0032 | 25.0 | 2400 | 0.0782 | 0.9174 | 0.9569 | 0.9367 | 116 | 0.9412 | 0.9114 | 0.9260 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9474 | 0.9497 | 0.9486 | 0.9874 |
| 0.0028 | 26.0 | 2496 | 0.0841 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9865 | 0.9241 | 0.9542 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9644 | 0.9523 | 0.9583 | 0.9893 |
| 0.0024 | 27.0 | 2592 | 0.0762 | 0.9174 | 0.9569 | 0.9367 | 116 | 0.9554 | 0.9494 | 0.9524 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9529 | 0.9648 | 0.9588 | 0.9893 |
| 0.0065 | 28.0 | 2688 | 0.0943 | 0.9483 | 0.9483 | 0.9483 | 116 | 0.9662 | 0.9051 | 0.9346 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9566 | 0.9422 | 0.9494 | 0.9887 |
| 0.0026 | 29.0 | 2784 | 0.0959 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9664 | 0.9114 | 0.9381 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9668 | 0.9497 | 0.9582 | 0.9874 |
| 0.0020 | 30.0 | 2880 | 0.0732 | 0.9402 | 0.9483 | 0.9442 | 116 | 0.9548 | 0.9367 | 0.9457 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9912 |
| 0.0012 | 31.0 | 2976 | 0.0808 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9573 | 0.9633 | 0.9901 |
| 0.0010 | 32.0 | 3072 | 0.0846 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9800 | 0.9304 | 0.9545 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9720 | 0.9598 | 0.9659 | 0.9898 |
| 0.0018 | 33.0 | 3168 | 0.0949 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9573 | 0.9633 | 0.9893 |
| 0.0012 | 34.0 | 3264 | 0.0965 | 0.9322 | 0.9483 | 0.9402 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9571 | 0.9523 | 0.9547 | 0.9879 |
| 0.0025 | 35.0 | 3360 | 0.1011 | 0.9554 | 0.9224 | 0.9386 | 116 | 0.9367 | 0.9367 | 0.9367 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9545 | 0.9497 | 0.9521 | 0.9879 |
| 0.0029 | 36.0 | 3456 | 0.0913 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9545 | 0.9304 | 0.9423 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9882 |
| 0.0037 | 37.0 | 3552 | 0.0543 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9552 | 0.9648 | 0.9600 | 0.9923 |
| 0.0020 | 38.0 | 3648 | 0.0655 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9575 | 0.9623 | 0.9599 | 0.9909 |
| 0.0015 | 39.0 | 3744 | 0.0786 | 0.9565 | 0.9483 | 0.9524 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9694 | 0.9548 | 0.9620 | 0.9893 |
| 0.0010 | 40.0 | 3840 | 0.0722 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9904 |
| 0.0021 | 41.0 | 3936 | 0.0722 | 0.9350 | 0.9914 | 0.9623 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9650 | 0.9698 | 0.9674 | 0.9904 |
| 0.0018 | 42.0 | 4032 | 0.0764 | 0.9483 | 0.9483 | 0.9483 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9620 | 0.9548 | 0.9584 | 0.9893 |
| 0.0009 | 43.0 | 4128 | 0.0854 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9695 | 0.9573 | 0.9633 | 0.9898 |
| 0.0007 | 44.0 | 4224 | 0.0778 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9904 |
| 0.0018 | 45.0 | 4320 | 0.0880 | 0.9558 | 0.9310 | 0.9432 | 116 | 0.9481 | 0.9241 | 0.9359 | 158 | 0.9760 | 0.9839 | 0.9799 | 124 | 0.9592 | 0.9447 | 0.9519 | 0.9887 |
| 0.0022 | 46.0 | 4416 | 0.0823 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9867 | 0.9367 | 0.9610 | 158 | 0.9760 | 0.9839 | 0.9799 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9901 |
| 0.0013 | 47.0 | 4512 | 0.0913 | 0.9483 | 0.9483 | 0.9483 | 116 | 0.9800 | 0.9304 | 0.9545 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9694 | 0.9548 | 0.9620 | 0.9896 |
| 0.0013 | 48.0 | 4608 | 0.0819 | 0.9417 | 0.9741 | 0.9576 | 116 | 0.9801 | 0.9367 | 0.9579 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9697 | 0.9648 | 0.9673 | 0.9901 |
| 0.0005 | 49.0 | 4704 | 0.0735 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9672 | 0.9623 | 0.9647 | 0.9909 |
| 0.0011 | 50.0 | 4800 | 0.0772 | 0.9483 | 0.9483 | 0.9483 | 116 | 0.9484 | 0.9304 | 0.9393 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9596 | 0.9548 | 0.9572 | 0.9907 |
| 0.0021 | 51.0 | 4896 | 0.0813 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9904 |
| 0.0006 | 52.0 | 4992 | 0.0927 | 0.9576 | 0.9741 | 0.9658 | 116 | 0.9800 | 0.9304 | 0.9545 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9746 | 0.9623 | 0.9684 | 0.9904 |
| 0.0007 | 53.0 | 5088 | 0.0791 | 0.9496 | 0.9741 | 0.9617 | 116 | 0.9740 | 0.9494 | 0.9615 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9698 | 0.9698 | 0.9698 | 0.9912 |
| 0.0011 | 54.0 | 5184 | 0.0722 | 0.9496 | 0.9741 | 0.9617 | 116 | 0.9679 | 0.9557 | 0.9618 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9675 | 0.9724 | 0.9699 | 0.9929 |
| 0.0005 | 55.0 | 5280 | 0.0721 | 0.9328 | 0.9569 | 0.9447 | 116 | 0.9557 | 0.9557 | 0.9557 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9576 | 0.9648 | 0.9612 | 0.9920 |
| 0.0005 | 56.0 | 5376 | 0.0705 | 0.9496 | 0.9741 | 0.9617 | 116 | 0.9806 | 0.9620 | 0.9712 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9724 | 0.9724 | 0.9724 | 0.9931 |
| 0.0003 | 57.0 | 5472 | 0.0651 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9677 | 0.9494 | 0.9585 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9672 | 0.9623 | 0.9647 | 0.9923 |
| 0.0011 | 58.0 | 5568 | 0.0754 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9679 | 0.9557 | 0.9618 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9697 | 0.9648 | 0.9673 | 0.9929 |
| 0.0006 | 59.0 | 5664 | 0.0718 | 0.9397 | 0.9397 | 0.9397 | 116 | 0.9618 | 0.9557 | 0.9587 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9622 | 0.9598 | 0.9610 | 0.9923 |
| 0.0005 | 60.0 | 5760 | 0.0870 | 0.9496 | 0.9741 | 0.9617 | 116 | 0.9800 | 0.9304 | 0.9545 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9721 | 0.9623 | 0.9672 | 0.9898 |
| 0.0004 | 61.0 | 5856 | 0.0687 | 0.9474 | 0.9310 | 0.9391 | 116 | 0.9437 | 0.9557 | 0.9497 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9574 | 0.9598 | 0.9586 | 0.9909 |
| 0.0002 | 62.0 | 5952 | 0.0983 | 0.9402 | 0.9483 | 0.9442 | 116 | 0.9799 | 0.9241 | 0.9511 | 158 | 0.9839 | 0.9839 | 0.9839 | 124 | 0.9692 | 0.9497 | 0.9594 | 0.9893 |
| 0.0006 | 63.0 | 6048 | 0.0818 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9912 |
| 0.0002 | 64.0 | 6144 | 0.0858 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9915 |
| 0.0005 | 65.0 | 6240 | 0.0884 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9673 | 0.9367 | 0.9518 | 158 | 0.9683 | 0.9839 | 0.9760 | 124 | 0.9646 | 0.9573 | 0.9609 | 0.9915 |
| 0.0010 | 66.0 | 6336 | 0.0771 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9542 | 0.9241 | 0.9389 | 158 | 0.9683 | 0.9839 | 0.9760 | 124 | 0.9571 | 0.9523 | 0.9547 | 0.9912 |
| 0.0006 | 67.0 | 6432 | 0.0808 | 0.9487 | 0.9569 | 0.9528 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9573 | 0.9633 | 0.9909 |
| 0.0002 | 68.0 | 6528 | 0.0749 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9610 | 0.9367 | 0.9487 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9672 | 0.9623 | 0.9647 | 0.9920 |
| 0.0011 | 69.0 | 6624 | 0.0784 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9918 |
| 0.0005 | 70.0 | 6720 | 0.0750 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9920 |
| 0.0001 | 71.0 | 6816 | 0.0758 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9920 |
| 0.0005 | 72.0 | 6912 | 0.0771 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9920 |
| 0.0004 | 73.0 | 7008 | 0.0733 | 0.9412 | 0.9655 | 0.9532 | 116 | 0.9542 | 0.9241 | 0.9389 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9597 | 0.9573 | 0.9585 | 0.9915 |
| 0.0001 | 74.0 | 7104 | 0.0740 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9542 | 0.9241 | 0.9389 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9621 | 0.9573 | 0.9597 | 0.9918 |
| 0.0001 | 75.0 | 7200 | 0.0795 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9915 |
| 0.0002 | 76.0 | 7296 | 0.0800 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9915 |
| 0.0002 | 77.0 | 7392 | 0.0781 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9920 |
| 0.0002 | 78.0 | 7488 | 0.0798 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9735 | 0.9304 | 0.9515 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9695 | 0.9598 | 0.9646 | 0.9918 |
| 0.0002 | 79.0 | 7584 | 0.0785 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9721 | 0.9623 | 0.9672 | 0.9926 |
| 0.0001 | 80.0 | 7680 | 0.0794 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9721 | 0.9623 | 0.9672 | 0.9926 |
| 0.0004 | 81.0 | 7776 | 0.0812 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9737 | 0.9367 | 0.9548 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9721 | 0.9623 | 0.9672 | 0.9926 |
| 0.0001 | 82.0 | 7872 | 0.0880 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9915 |
| 0.0001 | 83.0 | 7968 | 0.0832 | 0.9576 | 0.9741 | 0.9658 | 116 | 0.9671 | 0.9304 | 0.9484 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9696 | 0.9623 | 0.9660 | 0.9920 |
| 0.0007 | 84.0 | 8064 | 0.0854 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9915 |
| 0.0001 | 85.0 | 8160 | 0.0863 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9915 |
| 0.0001 | 86.0 | 8256 | 0.0854 | 0.9492 | 0.9655 | 0.9573 | 116 | 0.9669 | 0.9241 | 0.9450 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9670 | 0.9573 | 0.9621 | 0.9909 |
| 0.0001 | 87.0 | 8352 | 0.0789 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9673 | 0.9367 | 0.9518 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9696 | 0.9623 | 0.9660 | 0.9923 |
| 0.0001 | 88.0 | 8448 | 0.0776 | 0.9658 | 0.9741 | 0.9700 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9696 | 0.9623 | 0.9660 | 0.9923 |
| 0.0002 | 89.0 | 8544 | 0.0786 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9610 | 0.9367 | 0.9487 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 90.0 | 8640 | 0.0798 | 0.9569 | 0.9569 | 0.9569 | 116 | 0.9610 | 0.9367 | 0.9487 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 91.0 | 8736 | 0.0816 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0005 | 92.0 | 8832 | 0.0819 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0003 | 93.0 | 8928 | 0.0819 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0003 | 94.0 | 9024 | 0.0814 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 95.0 | 9120 | 0.0814 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 96.0 | 9216 | 0.0816 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 97.0 | 9312 | 0.0817 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 98.0 | 9408 | 0.0821 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 99.0 | 9504 | 0.0822 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
| 0.0001 | 100.0 | 9600 | 0.0822 | 0.9573 | 0.9655 | 0.9614 | 116 | 0.9608 | 0.9304 | 0.9453 | 158 | 0.9840 | 0.9919 | 0.9880 | 124 | 0.9671 | 0.9598 | 0.9634 | 0.9920 |
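
The per-entity precision, recall, and F1 values above are entity-level (span) metrics of the kind computed by seqeval: a predicted entity counts as correct only if both its boundaries and its type match the gold annotation. A minimal sketch, assuming BIO-style tags:

```python
# Sketch of entity-level scoring with seqeval; the exact label scheme used by
# this model is an assumption (BIO over LOC/ORG/PER).
from seqeval.metrics import classification_report

y_true = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "O", "O"]]  # one missed ORG span

print(classification_report(y_true, y_pred, digits=4))
```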

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
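
To reproduce this environment, the listed versions can be pinned in a requirements file. Note that the card lists the cu121 build of PyTorch; the plain PyPI `torch==2.3.0` wheel may ship with a different CUDA build.

```
transformers==4.39.3
torch==2.3.0
datasets==2.19.1
tokenizers==0.15.2
```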