nerui-lora-r16-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset; the name suggests a LoRA adapter with rank 16. It achieves the following results on the evaluation set (a hedged loading sketch follows the list):

  • Loss: 0.0458
  • Location Precision: 0.9022
  • Location Recall: 0.9651
  • Location F1: 0.9326
  • Location Number: 86
  • Organization Precision: 0.9314
  • Organization Recall: 0.9157
  • Organization F1: 0.9235
  • Organization Number: 178
  • Person Precision: 0.9843
  • Person Recall: 0.9766
  • Person F1: 0.9804
  • Person Number: 128
  • Overall Precision: 0.9416
  • Overall Recall: 0.9464
  • Overall F1: 0.9440
  • Overall Accuracy: 0.9884
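
The card does not state how the model is meant to be loaded (the hosting page could not determine its library). Below is a minimal inference sketch, assuming the adapter was trained with PEFT on top of the base checkpoint and that the label set is BIO tags for the three entity types above plus `O` (seven labels). The repo id `apwic/nerui-lora-r16-3` comes from this card; everything else is an assumption, not the confirmed setup.

```python
# Minimal inference sketch. ASSUMPTIONS (not confirmed by the card):
# the adapter was trained with PEFT, and the classifier head uses BIO
# tags for LOCATION/ORGANIZATION/PERSON plus "O" (7 labels).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

base_id = "indolem/indobert-base-uncased"
adapter_id = "apwic/nerui-lora-r16-3"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForTokenClassification.from_pretrained(base_id, num_labels=7)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

text = "Joko Widodo mengunjungi kantor Pertamina di Jakarta."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()  # one label id per subword token
```

Mapping the predicted ids back to tag names requires the id2label mapping saved with the adapter's classifier head, which this card does not list.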

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing them as TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
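
The exact training script is not published, so the following only reconstructs the list above using the transformers 4.39 `TrainingArguments` API; `output_dir` and `evaluation_strategy="epoch"` are assumptions (the latter because the table below logs metrics once per epoch).

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments.
# output_dir and evaluation_strategy are assumptions; the rest mirrors the list.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nerui-lora-r16-3",  # assumption
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    evaluation_strategy="epoch",    # assumption: metrics below are per epoch
)
```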

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.0611 | 1.0 | 96 | 0.6536 | 0.0 | 0.0 | 0.0 | 86 | 0.0 | 0.0 | 0.0 | 178 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 0.8435 |
| 0.6324 | 2.0 | 192 | 0.5023 | 0.0 | 0.0 | 0.0 | 86 | 0.5556 | 0.0281 | 0.0535 | 178 | 0.0 | 0.0 | 0.0 | 128 | 0.4167 | 0.0128 | 0.0248 | 0.8448 |
| 0.4878 | 3.0 | 288 | 0.3482 | 0.25 | 0.0233 | 0.0426 | 86 | 0.3936 | 0.2079 | 0.2721 | 178 | 0.3543 | 0.3516 | 0.3529 | 128 | 0.3668 | 0.2143 | 0.2705 | 0.8799 |
| 0.341 | 4.0 | 384 | 0.2386 | 0.5185 | 0.3256 | 0.4 | 86 | 0.5308 | 0.6292 | 0.5758 | 178 | 0.5767 | 0.7344 | 0.6460 | 128 | 0.5467 | 0.5969 | 0.5707 | 0.9296 |
| 0.2391 | 5.0 | 480 | 0.1745 | 0.7179 | 0.6512 | 0.6829 | 86 | 0.6603 | 0.7753 | 0.7132 | 178 | 0.8151 | 0.9297 | 0.8686 | 128 | 0.7229 | 0.7985 | 0.7588 | 0.9547 |
| 0.1867 | 6.0 | 576 | 0.1380 | 0.7396 | 0.8256 | 0.7802 | 86 | 0.7385 | 0.8090 | 0.7721 | 178 | 0.9118 | 0.9688 | 0.9394 | 128 | 0.7939 | 0.8648 | 0.8278 | 0.9655 |
| 0.1578 | 7.0 | 672 | 0.1150 | 0.75 | 0.8372 | 0.7912 | 86 | 0.7755 | 0.8539 | 0.8128 | 178 | 0.9058 | 0.9766 | 0.9398 | 128 | 0.8116 | 0.8903 | 0.8491 | 0.9690 |
| 0.1374 | 8.0 | 768 | 0.0980 | 0.7766 | 0.8488 | 0.8111 | 86 | 0.8105 | 0.8652 | 0.8370 | 178 | 0.9191 | 0.9766 | 0.9470 | 128 | 0.8381 | 0.8980 | 0.8670 | 0.9730 |
| 0.1267 | 9.0 | 864 | 0.0882 | 0.77 | 0.8953 | 0.8280 | 86 | 0.8511 | 0.8989 | 0.8743 | 178 | 0.9328 | 0.9766 | 0.9542 | 128 | 0.8578 | 0.9235 | 0.8894 | 0.9749 |
| 0.115 | 10.0 | 960 | 0.0822 | 0.8061 | 0.9186 | 0.8587 | 86 | 0.8474 | 0.9045 | 0.8750 | 178 | 0.9328 | 0.9766 | 0.9542 | 128 | 0.8649 | 0.9311 | 0.8968 | 0.9765 |
| 0.1082 | 11.0 | 1056 | 0.0755 | 0.7835 | 0.8837 | 0.8306 | 86 | 0.8495 | 0.8876 | 0.8681 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8647 | 0.9133 | 0.8883 | 0.9768 |
| 0.1032 | 12.0 | 1152 | 0.0724 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.8579 | 0.9157 | 0.8859 | 178 | 0.9323 | 0.9688 | 0.9502 | 128 | 0.8798 | 0.9337 | 0.9059 | 0.9781 |
| 0.0944 | 13.0 | 1248 | 0.0646 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.875 | 0.9045 | 0.8895 | 178 | 0.9466 | 0.9688 | 0.9575 | 128 | 0.8988 | 0.9286 | 0.9134 | 0.9800 |
| 0.0923 | 14.0 | 1344 | 0.0638 | 0.8242 | 0.8721 | 0.8475 | 86 | 0.8743 | 0.8989 | 0.8864 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.8886 | 0.9158 | 0.9020 | 0.9798 |
| 0.0918 | 15.0 | 1440 | 0.0623 | 0.8571 | 0.9070 | 0.8814 | 86 | 0.8859 | 0.9157 | 0.9006 | 178 | 0.9615 | 0.9766 | 0.9690 | 128 | 0.9037 | 0.9337 | 0.9184 | 0.9806 |
| 0.0848 | 16.0 | 1536 | 0.0615 | 0.8298 | 0.9070 | 0.8667 | 86 | 0.8696 | 0.8989 | 0.8840 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.8941 | 0.9260 | 0.9098 | 0.9798 |
| 0.0818 | 17.0 | 1632 | 0.0594 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.8840 | 0.8989 | 0.8914 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9032 | 0.9286 | 0.9157 | 0.9814 |
| 0.0797 | 18.0 | 1728 | 0.0577 | 0.8764 | 0.9070 | 0.8914 | 86 | 0.8840 | 0.8989 | 0.8914 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9121 | 0.9260 | 0.9190 | 0.9814 |
| 0.0745 | 19.0 | 1824 | 0.0573 | 0.8667 | 0.9070 | 0.8864 | 86 | 0.8852 | 0.9101 | 0.8975 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9102 | 0.9311 | 0.9206 | 0.9814 |
| 0.0747 | 20.0 | 1920 | 0.0554 | 0.8478 | 0.9070 | 0.8764 | 86 | 0.8907 | 0.9157 | 0.9030 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9082 | 0.9337 | 0.9208 | 0.9816 |
| 0.0702 | 21.0 | 2016 | 0.0560 | 0.8387 | 0.9070 | 0.8715 | 86 | 0.8876 | 0.8876 | 0.8876 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9048 | 0.9209 | 0.9128 | 0.9811 |
| 0.0701 | 22.0 | 2112 | 0.0550 | 0.8316 | 0.9186 | 0.8729 | 86 | 0.8871 | 0.9270 | 0.9066 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9022 | 0.9413 | 0.9213 | 0.9822 |
| 0.0663 | 23.0 | 2208 | 0.0520 | 0.8478 | 0.9070 | 0.8764 | 86 | 0.9045 | 0.9045 | 0.9045 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9146 | 0.9286 | 0.9215 | 0.9833 |
| 0.0666 | 24.0 | 2304 | 0.0543 | 0.8211 | 0.9070 | 0.8619 | 86 | 0.8870 | 0.8820 | 0.8845 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9 | 0.9184 | 0.9091 | 0.9806 |
| 0.0635 | 25.0 | 2400 | 0.0524 | 0.8316 | 0.9186 | 0.8729 | 86 | 0.8927 | 0.8876 | 0.8901 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.905 | 0.9235 | 0.9141 | 0.9822 |
| 0.0632 | 26.0 | 2496 | 0.0519 | 0.8421 | 0.9302 | 0.8840 | 86 | 0.8927 | 0.8876 | 0.8901 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9075 | 0.9260 | 0.9167 | 0.9825 |
| 0.0596 | 27.0 | 2592 | 0.0489 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.8840 | 0.8989 | 0.8914 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9055 | 0.9286 | 0.9169 | 0.9830 |
| 0.0608 | 28.0 | 2688 | 0.0508 | 0.8316 | 0.9186 | 0.8729 | 86 | 0.8927 | 0.8876 | 0.8901 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9073 | 0.9235 | 0.9153 | 0.9825 |
| 0.0591 | 29.0 | 2784 | 0.0464 | 0.8966 | 0.9070 | 0.9017 | 86 | 0.8962 | 0.9213 | 0.9086 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9843 |
| 0.0582 | 30.0 | 2880 | 0.0472 | 0.8864 | 0.9070 | 0.8966 | 86 | 0.9126 | 0.9382 | 0.9252 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9857 |
| 0.0567 | 31.0 | 2976 | 0.0518 | 0.8333 | 0.9302 | 0.8791 | 86 | 0.8971 | 0.8820 | 0.8895 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9073 | 0.9235 | 0.9153 | 0.9825 |
| 0.0545 | 32.0 | 3072 | 0.0493 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.8956 | 0.9157 | 0.9056 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9152 | 0.9362 | 0.9256 | 0.9841 |
| 0.0526 | 33.0 | 3168 | 0.0488 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.8944 | 0.9045 | 0.8994 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9125 | 0.9311 | 0.9217 | 0.9843 |
| 0.0536 | 34.0 | 3264 | 0.0481 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9843 |
| 0.0501 | 35.0 | 3360 | 0.0482 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9846 |
| 0.0541 | 36.0 | 3456 | 0.0481 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9857 |
| 0.0513 | 37.0 | 3552 | 0.0475 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9857 |
| 0.0506 | 38.0 | 3648 | 0.0483 | 0.8602 | 0.9302 | 0.8939 | 86 | 0.9045 | 0.9045 | 0.9045 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9196 | 0.9337 | 0.9266 | 0.9846 |
| 0.0483 | 39.0 | 3744 | 0.0498 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9854 |
| 0.0481 | 40.0 | 3840 | 0.0467 | 0.8876 | 0.9186 | 0.9029 | 86 | 0.8950 | 0.9101 | 0.9025 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9219 | 0.9337 | 0.9278 | 0.9846 |
| 0.0463 | 41.0 | 3936 | 0.0471 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9050 | 0.9101 | 0.9076 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9846 |
| 0.0461 | 42.0 | 4032 | 0.0456 | 0.8977 | 0.9186 | 0.9080 | 86 | 0.9282 | 0.9438 | 0.9359 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9870 |
| 0.0454 | 43.0 | 4128 | 0.0459 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9860 |
| 0.0459 | 44.0 | 4224 | 0.0470 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9282 | 0.9438 | 0.9359 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.935 | 0.9541 | 0.9444 | 0.9873 |
| 0.0465 | 45.0 | 4320 | 0.0464 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9870 |
| 0.0468 | 46.0 | 4416 | 0.0483 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9854 |
| 0.0432 | 47.0 | 4512 | 0.0477 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9293 | 0.9388 | 0.9340 | 0.9857 |
| 0.0434 | 48.0 | 4608 | 0.0450 | 0.8681 | 0.9186 | 0.8927 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9868 |
| 0.0442 | 49.0 | 4704 | 0.0464 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9253 | 0.9045 | 0.9148 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9289 | 0.9337 | 0.9313 | 0.9849 |
| 0.0421 | 50.0 | 4800 | 0.0474 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9218 | 0.9270 | 0.9244 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9865 |
| 0.0421 | 51.0 | 4896 | 0.0462 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9106 | 0.9157 | 0.9132 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9862 |
| 0.0415 | 52.0 | 4992 | 0.0461 | 0.8602 | 0.9302 | 0.8939 | 86 | 0.9056 | 0.9157 | 0.9106 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.92 | 0.9388 | 0.9293 | 0.9865 |
| 0.0418 | 53.0 | 5088 | 0.0455 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9050 | 0.9101 | 0.9076 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9198 | 0.9362 | 0.9279 | 0.9857 |
| 0.0416 | 54.0 | 5184 | 0.0450 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9865 |
| 0.0403 | 55.0 | 5280 | 0.0456 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9316 | 0.9388 | 0.9352 | 0.9862 |
| 0.0424 | 56.0 | 5376 | 0.0458 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9195 | 0.8989 | 0.9091 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9289 | 0.9337 | 0.9313 | 0.9843 |
| 0.0391 | 57.0 | 5472 | 0.0450 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9111 | 0.9213 | 0.9162 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9868 |
| 0.039 | 58.0 | 5568 | 0.0462 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9857 |
| 0.0367 | 59.0 | 5664 | 0.0457 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.8939 | 0.8989 | 0.8964 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9148 | 0.9311 | 0.9229 | 0.9860 |
| 0.0396 | 60.0 | 5760 | 0.0450 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9221 | 0.9362 | 0.9291 | 0.9865 |
| 0.038 | 61.0 | 5856 | 0.0451 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9343 | 0.9439 | 0.9391 | 0.9876 |
| 0.0359 | 62.0 | 5952 | 0.0451 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9367 | 0.9439 | 0.9403 | 0.9870 |
| 0.0366 | 63.0 | 6048 | 0.0456 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9881 |
| 0.0346 | 64.0 | 6144 | 0.0452 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9157 | 0.9157 | 0.9157 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9876 |
| 0.0338 | 65.0 | 6240 | 0.0457 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9347 | 0.9490 | 0.9418 | 0.9881 |
| 0.0352 | 66.0 | 6336 | 0.0455 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9244 | 0.9362 | 0.9303 | 0.9873 |
| 0.0351 | 67.0 | 6432 | 0.0456 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9040 | 0.8989 | 0.9014 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9217 | 0.9311 | 0.9264 | 0.9857 |
| 0.0333 | 68.0 | 6528 | 0.0462 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9116 | 0.9270 | 0.9192 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9881 |
| 0.0356 | 69.0 | 6624 | 0.0452 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9091 | 0.8989 | 0.9040 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9289 | 0.9337 | 0.9313 | 0.9862 |
| 0.0336 | 70.0 | 6720 | 0.0455 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9266 | 0.9213 | 0.9239 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9879 |
| 0.0331 | 71.0 | 6816 | 0.0459 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9116 | 0.9270 | 0.9192 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9252 | 0.9464 | 0.9357 | 0.9876 |
| 0.0351 | 72.0 | 6912 | 0.0469 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.8989 | 0.8989 | 0.8989 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9194 | 0.9311 | 0.9252 | 0.9852 |
| 0.0333 | 73.0 | 7008 | 0.0466 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9876 |
| 0.0345 | 74.0 | 7104 | 0.0455 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9101 | 0.9101 | 0.9101 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9271 | 0.9413 | 0.9342 | 0.9873 |
| 0.033 | 75.0 | 7200 | 0.0458 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9318 | 0.9413 | 0.9365 | 0.9879 |
| 0.0334 | 76.0 | 7296 | 0.0455 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9205 | 0.9101 | 0.9153 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9881 |
| 0.0332 | 77.0 | 7392 | 0.0442 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9213 | 0.9213 | 0.9213 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9296 | 0.9439 | 0.9367 | 0.9884 |
| 0.0337 | 78.0 | 7488 | 0.0470 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9881 |
| 0.0334 | 79.0 | 7584 | 0.0465 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9873 |
| 0.0319 | 80.0 | 7680 | 0.0455 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9879 |
| 0.032 | 81.0 | 7776 | 0.0465 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9876 |
| 0.0328 | 82.0 | 7872 | 0.0450 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9106 | 0.9157 | 0.9132 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9884 |
| 0.032 | 83.0 | 7968 | 0.0449 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9106 | 0.9157 | 0.9132 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9881 |
| 0.0309 | 84.0 | 8064 | 0.0451 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9106 | 0.9157 | 0.9132 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9879 |
| 0.0315 | 85.0 | 8160 | 0.0455 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9205 | 0.9101 | 0.9153 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9879 |
| 0.0305 | 86.0 | 8256 | 0.0456 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9266 | 0.9213 | 0.9239 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9394 | 0.9490 | 0.9442 | 0.9879 |
| 0.0318 | 87.0 | 8352 | 0.0457 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9873 |
| 0.0317 | 88.0 | 8448 | 0.0459 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9209 | 0.9157 | 0.9183 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9369 | 0.9464 | 0.9416 | 0.9873 |
| 0.0319 | 89.0 | 8544 | 0.0463 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9261 | 0.9157 | 0.9209 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9392 | 0.9464 | 0.9428 | 0.9876 |
| 0.0311 | 90.0 | 8640 | 0.0465 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9870 |
| 0.0297 | 91.0 | 8736 | 0.0460 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9876 |
| 0.0306 | 92.0 | 8832 | 0.0462 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9876 |
| 0.0335 | 93.0 | 8928 | 0.0460 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.92 | 0.9045 | 0.9122 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9340 | 0.9388 | 0.9364 | 0.9870 |
| 0.0288 | 94.0 | 9024 | 0.0462 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9318 | 0.9213 | 0.9266 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9881 |
| 0.0296 | 95.0 | 9120 | 0.0459 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9881 |
| 0.0317 | 96.0 | 9216 | 0.0455 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9884 |
| 0.0298 | 97.0 | 9312 | 0.0457 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9884 |
| 0.0295 | 98.0 | 9408 | 0.0456 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9884 |
| 0.0303 | 99.0 | 9504 | 0.0458 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9884 |
| 0.0304 | 100.0 | 9600 | 0.0458 | 0.9022 | 0.9651 | 0.9326 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9843 | 0.9766 | 0.9804 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9884 |
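
The per-entity columns in this table are span-level metrics of the kind produced by seqeval (the "Number" columns are the support, i.e. how many gold entities of that type the evaluation set contains). The card does not name the metric implementation, so the following is only an illustrative sketch of how such numbers are computed:

```python
# Illustrative sketch of span-level NER scoring with seqeval (assumed,
# not confirmed, to be the library behind the table above).
from seqeval.metrics import classification_report

# Toy BIO-tagged sequences: the PER span matches; the LOC span is
# predicted as ORG, so it counts against both LOC recall and ORG precision.
y_true = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG", "O"]]

# Prints precision/recall/F1 and support per entity type, analogous to
# the Location/Organization/Person columns in the table above.
print(classification_report(y_true, y_pred))
```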

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2