
nerui-lora-r8-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0484
  • Location Precision: 0.9000
  • Location Recall: 0.9419
  • Location F1: 0.9205
  • Location Number: 86
  • Organization Precision: 0.9364
  • Organization Recall: 0.9101
  • Organization F1: 0.9231
  • Organization Number: 178
  • Person Precision: 0.9843
  • Person Recall: 0.9766
  • Person F1: 0.9804
  • Person Number: 128
  • Overall Precision: 0.9436
  • Overall Recall: 0.9388
  • Overall F1: 0.9412
  • Overall Accuracy: 0.9846

Model description

More information needed

Intended uses & limitations

More information needed
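
Since the card leaves usage unspecified, the sketch below shows one plausible way to run inference. It assumes, as the repository name suggests, that apwic/nerui-lora-r8-3 hosts a PEFT/LoRA adapter (including the trained classification head) on top of indolem/indobert-base-uncased, and that the labels are BIO tags over the three entity types reported above; check the adapter's config for the actual id2label mapping before relying on it.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

# Assumed BIO label set covering the three entity types in the
# evaluation results; verify against the adapter's id2label mapping.
labels = ["O", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-PER", "I-PER"]

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
base = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased", num_labels=len(labels)
)
# Load the LoRA adapter weights (and saved classifier head, if stored)
# on top of the frozen base model.
model = PeftModel.from_pretrained(base, "apwic/nerui-lora-r8-3")
model.eval()

# Illustrative Indonesian sentence; any input text works.
text = "Joko Widodo mengunjungi kantor Pertamina di Jakarta."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
for token, pred in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
    logits.argmax(-1)[0].tolist(),
):
    print(f"{token}\t{labels[pred]}")
```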

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
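
These map onto a transformers/PEFT setup roughly as sketched below. The listed Adam betas and epsilon are the TrainingArguments defaults, so they need no explicit arguments; the LoRA rank r=8 is inferred from the model name, while lora_alpha, lora_dropout, and the output directory are placeholders not documented in this card.

```python
from transformers import TrainingArguments
from peft import LoraConfig, TaskType

# Hyperparameters as listed above; Adam betas=(0.9, 0.999) and
# epsilon=1e-08 are already the TrainingArguments defaults.
training_args = TrainingArguments(
    output_dir="nerui-lora-r8-3",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# Rank inferred from the model name "lora-r8"; alpha and dropout
# are assumptions, not values recovered from this card.
lora_config = LoraConfig(
    task_type=TaskType.TOKEN_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
)
```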

Training results

Each row lists, in order: Training Loss | Epoch | Step | Validation Loss | Location Precision / Recall / F1 / Number | Organization Precision / Recall / F1 / Number | Person Precision / Recall / F1 / Number | Overall Precision / Recall / F1 / Accuracy
1.1489 1.0 96 0.6808 0.0 0.0 0.0 86 0.0 0.0 0.0 178 0.0 0.0 0.0 128 0.0 0.0 0.0 0.8435
0.6648 2.0 192 0.5508 0.0 0.0 0.0 86 0.5 0.0056 0.0111 178 0.0 0.0 0.0 128 0.3333 0.0026 0.0051 0.8437
0.5545 3.0 288 0.4324 0.0 0.0 0.0 86 0.3793 0.0618 0.1063 178 0.3714 0.1016 0.1595 128 0.3636 0.0612 0.1048 0.8543
0.4347 4.0 384 0.3185 0.3077 0.0465 0.0808 86 0.3876 0.2809 0.3257 178 0.4167 0.5078 0.4577 128 0.3993 0.3036 0.3449 0.8910
0.3178 5.0 480 0.2349 0.5714 0.3721 0.4507 86 0.5476 0.6461 0.5928 178 0.5890 0.75 0.6598 128 0.5664 0.6199 0.5920 0.9320
0.2406 6.0 576 0.1835 0.7407 0.6977 0.7186 86 0.6716 0.7584 0.7124 178 0.7467 0.875 0.8058 128 0.7106 0.7832 0.7451 0.9536
0.1942 7.0 672 0.1519 0.7701 0.7791 0.7746 86 0.7114 0.8034 0.7546 178 0.8786 0.9609 0.9179 128 0.7780 0.8495 0.8122 0.9625
0.1647 8.0 768 0.1279 0.7882 0.7791 0.7836 86 0.7487 0.8034 0.7751 178 0.8986 0.9688 0.9323 128 0.8068 0.8520 0.8288 0.9660
0.1479 9.0 864 0.1130 0.7978 0.8256 0.8114 86 0.7602 0.8371 0.7968 178 0.9118 0.9688 0.9394 128 0.8171 0.8776 0.8462 0.9690
0.135 10.0 960 0.1037 0.7660 0.8372 0.8 86 0.7755 0.8539 0.8128 178 0.9179 0.9609 0.9389 128 0.8184 0.8852 0.8505 0.9682
0.1317 11.0 1056 0.0951 0.7935 0.8488 0.8202 86 0.8182 0.8596 0.8384 178 0.9466 0.9688 0.9575 128 0.8537 0.8929 0.8728 0.9733
0.1196 12.0 1152 0.0904 0.7708 0.8605 0.8132 86 0.8404 0.8876 0.8634 178 0.9328 0.9766 0.9542 128 0.8541 0.9107 0.8815 0.9749
0.1108 13.0 1248 0.0824 0.7979 0.8721 0.8333 86 0.8466 0.8989 0.8719 178 0.9466 0.9688 0.9575 128 0.8671 0.9158 0.8908 0.9768
0.107 14.0 1344 0.0797 0.8 0.8837 0.8398 86 0.8729 0.8876 0.8802 178 0.9394 0.9688 0.9538 128 0.8775 0.9133 0.895 0.9781
0.1063 15.0 1440 0.0760 0.7872 0.8605 0.8222 86 0.8610 0.9045 0.8822 178 0.9394 0.9688 0.9538 128 0.8692 0.9158 0.8919 0.9776
0.1 16.0 1536 0.0724 0.8462 0.8953 0.8701 86 0.8703 0.9045 0.8871 178 0.9538 0.9688 0.9612 128 0.8916 0.9235 0.9073 0.9795
0.095 17.0 1632 0.0705 0.8261 0.8837 0.8539 86 0.8710 0.9101 0.8901 178 0.9466 0.9688 0.9575 128 0.8851 0.9235 0.9039 0.9789
0.0932 18.0 1728 0.0698 0.8370 0.8953 0.8652 86 0.8944 0.9045 0.8994 178 0.9466 0.9688 0.9575 128 0.8983 0.9235 0.9107 0.9803
0.0871 19.0 1824 0.0672 0.8387 0.9070 0.8715 86 0.8944 0.9045 0.8994 178 0.9466 0.9688 0.9575 128 0.8985 0.9260 0.9121 0.9800
0.0883 20.0 1920 0.0650 0.8298 0.9070 0.8667 86 0.8944 0.9045 0.8994 178 0.9612 0.9688 0.9650 128 0.9007 0.9260 0.9132 0.9803
0.0832 21.0 2016 0.0651 0.8298 0.9070 0.8667 86 0.8994 0.9045 0.9020 178 0.9612 0.9688 0.9650 128 0.9030 0.9260 0.9144 0.9811
0.0829 22.0 2112 0.0645 0.8125 0.9070 0.8571 86 0.8663 0.9101 0.8877 178 0.9466 0.9688 0.9575 128 0.8792 0.9286 0.9032 0.9787
0.0789 23.0 2208 0.0601 0.8211 0.9070 0.8619 86 0.8994 0.9045 0.9020 178 0.9766 0.9766 0.9766 128 0.9055 0.9286 0.9169 0.9819
0.078 24.0 2304 0.0612 0.8211 0.9070 0.8619 86 0.8927 0.8876 0.8901 178 0.9766 0.9766 0.9766 128 0.9025 0.9209 0.9116 0.9806
0.0756 25.0 2400 0.0594 0.8298 0.9070 0.8667 86 0.9045 0.9045 0.9045 178 0.9615 0.9766 0.9690 128 0.9055 0.9286 0.9169 0.9806
0.0767 26.0 2496 0.0588 0.7822 0.9186 0.8449 86 0.8960 0.8708 0.8832 178 0.9766 0.9766 0.9766 128 0.8930 0.9158 0.9043 0.9800
0.0721 27.0 2592 0.0561 0.8125 0.9070 0.8571 86 0.8852 0.9101 0.8975 178 0.9766 0.9766 0.9766 128 0.8968 0.9311 0.9136 0.9814
0.0719 28.0 2688 0.0559 0.8404 0.9186 0.8778 86 0.9040 0.8989 0.9014 178 0.9766 0.9766 0.9766 128 0.9123 0.9286 0.9204 0.9819
0.0702 29.0 2784 0.0543 0.8478 0.9070 0.8764 86 0.9016 0.9270 0.9141 178 0.9766 0.9766 0.9766 128 0.9132 0.9388 0.9258 0.9816
0.0711 30.0 2880 0.0539 0.8667 0.9070 0.8864 86 0.9066 0.9270 0.9167 178 0.9690 0.9766 0.9728 128 0.9177 0.9388 0.9281 0.9819
0.067 31.0 2976 0.0576 0.8061 0.9186 0.8587 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9059 0.9337 0.9196 0.9819
0.0664 32.0 3072 0.0567 0.8211 0.9070 0.8619 86 0.9011 0.9213 0.9111 178 0.9690 0.9766 0.9728 128 0.9039 0.9362 0.9198 0.9814
0.0642 33.0 3168 0.0558 0.8316 0.9186 0.8729 86 0.9096 0.9045 0.9070 178 0.9766 0.9766 0.9766 128 0.9125 0.9311 0.9217 0.9825
0.0642 34.0 3264 0.0545 0.8587 0.9186 0.8876 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9221 0.9362 0.9291 0.9835
0.0624 35.0 3360 0.0542 0.8681 0.9186 0.8927 86 0.9111 0.9213 0.9162 178 0.9766 0.9766 0.9766 128 0.9223 0.9388 0.9305 0.9830
0.0651 36.0 3456 0.0535 0.8778 0.9186 0.8977 86 0.9213 0.9213 0.9213 178 0.9690 0.9766 0.9728 128 0.9270 0.9388 0.9328 0.9833
0.0635 37.0 3552 0.0523 0.8864 0.9070 0.8966 86 0.9111 0.9213 0.9162 178 0.9766 0.9766 0.9766 128 0.9268 0.9362 0.9315 0.9833
0.0617 38.0 3648 0.0528 0.8587 0.9186 0.8876 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9221 0.9362 0.9291 0.9838
0.0581 39.0 3744 0.0548 0.8061 0.9186 0.8587 86 0.9091 0.8989 0.9040 178 0.9766 0.9766 0.9766 128 0.9055 0.9286 0.9169 0.9827
0.0597 40.0 3840 0.0510 0.8778 0.9186 0.8977 86 0.9270 0.9270 0.9270 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9846
0.0569 41.0 3936 0.0505 0.8778 0.9186 0.8977 86 0.9270 0.9270 0.9270 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9849
0.0579 42.0 4032 0.0504 0.8778 0.9186 0.8977 86 0.9270 0.9270 0.9270 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9843
0.0564 43.0 4128 0.0506 0.8681 0.9186 0.8927 86 0.9106 0.9157 0.9132 178 0.9843 0.9766 0.9804 128 0.9244 0.9362 0.9303 0.9843
0.0572 44.0 4224 0.0499 0.8681 0.9186 0.8927 86 0.9116 0.9270 0.9192 178 0.9843 0.9766 0.9804 128 0.9248 0.9413 0.9330 0.9849
0.0563 45.0 4320 0.0488 0.8681 0.9186 0.8927 86 0.9213 0.9213 0.9213 178 0.9843 0.9766 0.9804 128 0.9293 0.9388 0.9340 0.9843
0.0594 46.0 4416 0.0507 0.8681 0.9186 0.8927 86 0.9167 0.9270 0.9218 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9841
0.0545 47.0 4512 0.0497 0.8681 0.9186 0.8927 86 0.9162 0.9213 0.9188 178 0.9766 0.9766 0.9766 128 0.9246 0.9388 0.9316 0.9846
0.0536 48.0 4608 0.0487 0.8681 0.9186 0.8927 86 0.9162 0.9213 0.9188 178 0.9766 0.9766 0.9766 128 0.9246 0.9388 0.9316 0.9849
0.0556 49.0 4704 0.0501 0.8681 0.9186 0.8927 86 0.9096 0.9045 0.9070 178 0.9766 0.9766 0.9766 128 0.9217 0.9311 0.9264 0.9833
0.0522 50.0 4800 0.0506 0.8791 0.9302 0.9040 86 0.9162 0.9213 0.9188 178 0.9766 0.9766 0.9766 128 0.9271 0.9413 0.9342 0.9854
0.0527 51.0 4896 0.0496 0.8791 0.9302 0.9040 86 0.9318 0.9213 0.9266 178 0.9766 0.9766 0.9766 128 0.9342 0.9413 0.9377 0.9852
0.0529 52.0 4992 0.0490 0.8791 0.9302 0.9040 86 0.9266 0.9213 0.9239 178 0.9688 0.9688 0.9688 128 0.9293 0.9388 0.9340 0.9852
0.0522 53.0 5088 0.0494 0.8791 0.9302 0.9040 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9270 0.9388 0.9328 0.9846
0.0525 54.0 5184 0.0482 0.8889 0.9302 0.9091 86 0.9270 0.9270 0.9270 178 0.9766 0.9766 0.9766 128 0.9343 0.9439 0.9391 0.9860
0.0512 55.0 5280 0.0488 0.8696 0.9302 0.8989 86 0.9318 0.9213 0.9266 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9854
0.053 56.0 5376 0.0487 0.8791 0.9302 0.9040 86 0.9205 0.9101 0.9153 178 0.9766 0.9766 0.9766 128 0.9291 0.9362 0.9327 0.9849
0.0498 57.0 5472 0.0486 0.8791 0.9302 0.9040 86 0.9209 0.9157 0.9183 178 0.9766 0.9766 0.9766 128 0.9293 0.9388 0.9340 0.9846
0.0504 58.0 5568 0.0489 0.8696 0.9302 0.8989 86 0.9318 0.9213 0.9266 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9854
0.0456 59.0 5664 0.0492 0.8696 0.9302 0.8989 86 0.9148 0.9045 0.9096 178 0.9766 0.9766 0.9766 128 0.9242 0.9337 0.9289 0.9846
0.0504 60.0 5760 0.0475 0.8681 0.9186 0.8927 86 0.9153 0.9101 0.9127 178 0.9766 0.9766 0.9766 128 0.9242 0.9337 0.9289 0.9849
0.0494 61.0 5856 0.0476 0.8681 0.9186 0.8927 86 0.9314 0.9157 0.9235 178 0.9766 0.9766 0.9766 128 0.9315 0.9362 0.9338 0.9852
0.046 62.0 5952 0.0478 0.8901 0.9419 0.9153 86 0.9318 0.9213 0.9266 178 0.9843 0.9766 0.9804 128 0.9391 0.9439 0.9415 0.9860
0.0463 63.0 6048 0.0485 0.8696 0.9302 0.8989 86 0.9162 0.9213 0.9188 178 0.9688 0.9688 0.9688 128 0.9223 0.9388 0.9305 0.9849
0.0452 64.0 6144 0.0482 0.8791 0.9302 0.9040 86 0.9213 0.9213 0.9213 178 0.9766 0.9766 0.9766 128 0.9295 0.9413 0.9354 0.9852
0.0446 65.0 6240 0.0492 0.8791 0.9302 0.9040 86 0.9111 0.9213 0.9162 178 0.9843 0.9766 0.9804 128 0.9271 0.9413 0.9342 0.9854
0.0463 66.0 6336 0.0495 0.8587 0.9186 0.8876 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9196 0.9337 0.9266 0.9843
0.0466 67.0 6432 0.0491 0.8791 0.9302 0.9040 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9244 0.9362 0.9303 0.9846
0.0451 68.0 6528 0.0499 0.8791 0.9302 0.9040 86 0.9111 0.9213 0.9162 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9852
0.047 69.0 6624 0.0493 0.8696 0.9302 0.8989 86 0.9209 0.9157 0.9183 178 0.9766 0.9766 0.9766 128 0.9270 0.9388 0.9328 0.9852
0.0435 70.0 6720 0.0485 0.8791 0.9302 0.9040 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9270 0.9388 0.9328 0.9849
0.045 71.0 6816 0.0490 0.8791 0.9302 0.9040 86 0.9111 0.9213 0.9162 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9852
0.0458 72.0 6912 0.0497 0.8901 0.9419 0.9153 86 0.9257 0.9101 0.9178 178 0.9766 0.9766 0.9766 128 0.9340 0.9388 0.9364 0.9849
0.0442 73.0 7008 0.0495 0.8901 0.9419 0.9153 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9295 0.9413 0.9354 0.9854
0.0442 74.0 7104 0.0490 0.8901 0.9419 0.9153 86 0.9153 0.9101 0.9127 178 0.9766 0.9766 0.9766 128 0.9293 0.9388 0.9340 0.9852
0.0437 75.0 7200 0.0487 0.8681 0.9186 0.8927 86 0.9209 0.9157 0.9183 178 0.9766 0.9766 0.9766 128 0.9268 0.9362 0.9315 0.9841
0.0458 76.0 7296 0.0493 0.8791 0.9302 0.9040 86 0.9209 0.9157 0.9183 178 0.9843 0.9766 0.9804 128 0.9316 0.9388 0.9352 0.9843
0.0448 77.0 7392 0.0487 0.8681 0.9186 0.8927 86 0.9153 0.9101 0.9127 178 0.9843 0.9766 0.9804 128 0.9266 0.9337 0.9301 0.9838
0.0451 78.0 7488 0.0495 0.8791 0.9302 0.9040 86 0.9209 0.9157 0.9183 178 0.9766 0.9766 0.9766 128 0.9293 0.9388 0.9340 0.9843
0.0449 79.0 7584 0.0498 0.8791 0.9302 0.9040 86 0.9213 0.9213 0.9213 178 0.9843 0.9766 0.9804 128 0.9318 0.9413 0.9365 0.9846
0.0436 80.0 7680 0.0493 0.8696 0.9302 0.8989 86 0.9205 0.9101 0.9153 178 0.9843 0.9766 0.9804 128 0.9291 0.9362 0.9327 0.9843
0.044 81.0 7776 0.0494 0.8804 0.9419 0.9101 86 0.9209 0.9157 0.9183 178 0.9843 0.9766 0.9804 128 0.9318 0.9413 0.9365 0.9852
0.0438 82.0 7872 0.0485 0.8696 0.9302 0.8989 86 0.9257 0.9101 0.9178 178 0.9766 0.9766 0.9766 128 0.9291 0.9362 0.9327 0.9846
0.0434 83.0 7968 0.0482 0.8901 0.9419 0.9153 86 0.9162 0.9213 0.9188 178 0.9766 0.9766 0.9766 128 0.9296 0.9439 0.9367 0.9857
0.0418 84.0 8064 0.0485 0.8696 0.9302 0.8989 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9221 0.9362 0.9291 0.9846
0.0424 85.0 8160 0.0484 0.8901 0.9419 0.9153 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9364 0.9388 0.9376 0.9849
0.042 86.0 8256 0.0482 0.8901 0.9419 0.9153 86 0.9266 0.9213 0.9239 178 0.9843 0.9766 0.9804 128 0.9367 0.9439 0.9403 0.9857
0.0431 87.0 8352 0.0482 0.8804 0.9419 0.9101 86 0.9257 0.9101 0.9178 178 0.9843 0.9766 0.9804 128 0.9340 0.9388 0.9364 0.9852
0.0417 88.0 8448 0.0482 0.8901 0.9419 0.9153 86 0.9257 0.9101 0.9178 178 0.9843 0.9766 0.9804 128 0.9364 0.9388 0.9376 0.9849
0.0421 89.0 8544 0.0482 0.8901 0.9419 0.9153 86 0.9261 0.9157 0.9209 178 0.9843 0.9766 0.9804 128 0.9365 0.9413 0.9389 0.9854
0.0412 90.0 8640 0.0485 0.8901 0.9419 0.9153 86 0.9257 0.9101 0.9178 178 0.9843 0.9766 0.9804 128 0.9364 0.9388 0.9376 0.9852
0.0407 91.0 8736 0.0484 0.8901 0.9419 0.9153 86 0.9310 0.9101 0.9205 178 0.9843 0.9766 0.9804 128 0.9388 0.9388 0.9388 0.9849
0.0405 92.0 8832 0.0487 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0447 93.0 8928 0.0487 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0402 94.0 9024 0.0487 0.9 0.9419 0.9205 86 0.9310 0.9101 0.9205 178 0.9843 0.9766 0.9804 128 0.9412 0.9388 0.9400 0.9849
0.0406 95.0 9120 0.0485 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0413 96.0 9216 0.0485 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0404 97.0 9312 0.0484 0.9 0.9419 0.9205 86 0.9368 0.9157 0.9261 178 0.9843 0.9766 0.9804 128 0.9437 0.9413 0.9425 0.9852
0.0403 98.0 9408 0.0485 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0403 99.0 9504 0.0484 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
0.0417 100.0 9600 0.0484 0.9 0.9419 0.9205 86 0.9364 0.9101 0.9231 178 0.9843 0.9766 0.9804 128 0.9436 0.9388 0.9412 0.9846
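
The per-entity and overall precision/recall/F1 values above follow the entity-level scoring convention of seqeval. Assuming that library was used (a common choice with the transformers token-classification examples, though not confirmed by this card), the metrics can be reproduced along these lines:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Toy BIO-tagged sequences, purely for illustration.
predictions = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
references = [["B-PER", "I-PER", "O", "B-ORG", "O"]]

# Returns per-entity precision/recall/f1/number plus overall_precision,
# overall_recall, overall_f1, and overall_accuracy -- the same metric
# names reported in this card.
print(seqeval.compute(predictions=predictions, references=references))
```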

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-lora-r8-3: fine-tuned from indolem/indobert-base-uncased.