
nerui-base-4

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0870
  • Location Precision: 0.9434
  • Location Recall: 0.9709
  • Location F1: 0.9569
  • Location Number: 103
  • Organization Precision: 0.9812
  • Organization Recall: 0.9181
  • Organization F1: 0.9486
  • Organization Number: 171
  • Person Precision: 0.9549
  • Person Recall: 0.9695
  • Person F1: 0.9621
  • Person Number: 131
  • Overall Precision: 0.9624
  • Overall Recall: 0.9481
  • Overall F1: 0.9552
  • Overall Accuracy: 0.9909
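
As a sanity check, the per-entity scores follow the standard span-level precision/recall/F1 definitions. The Location row above is consistent with 100 true positives, 6 false positives, and 3 false negatives (support 103 = TP + FN); these counts are a back-of-envelope reconstruction, not figures reported by the card:

```python
def prf1(tp, fp, fn):
    """Span-level precision, recall, and F1 from raw counts, rounded to 4 dp."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return round(precision, 4), round(recall, 4), round(f1, 4)

# Reconstructed counts for the Location entity type (support = 103)
print(prf1(tp=100, fp=6, fn=3))  # (0.9434, 0.9709, 0.9569)
```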

Model description

More information needed

Intended uses & limitations

More information needed
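
While usage is not yet documented, the metrics above imply a token-classification model with Location, Organization, and Person labels, whose raw output is per-token BIO tags. A minimal sketch of grouping BIO tags into entity spans (the `PER`/`LOC` tag names and the example sentence are illustrative assumptions; check the model's config for the exact label set):

```python
def bio_to_spans(tokens, tags):
    """Group per-token BIO tags (e.g. B-PER, I-PER, O) into (type, text) spans."""
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = {"type": tag[2:], "tokens": [tok]}
        elif tag.startswith("I-") and current and current["type"] == tag[2:]:
            current["tokens"].append(tok)
        else:
            # "O", or an I- tag that does not continue the current entity:
            # close any open span (a common simplification).
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(s["type"], " ".join(s["tokens"])) for s in spans]

print(bio_to_spans(
    ["Joko", "Widodo", "mengunjungi", "Jakarta"],
    ["B-PER", "I-PER", "O", "B-LOC"],
))  # [('PER', 'Joko Widodo'), ('LOC', 'Jakarta')]
```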

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
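
The Step column of the results table implies 96 optimizer steps per epoch, i.e. 9,600 steps over the 100 epochs. A minimal sketch of the linear schedule these hyperparameters describe, assuming no warmup steps (none are listed):

```python
def linear_lr(step, base_lr=5e-05, total_steps=9600):
    """Learning rate under a warmup-free linear decay from base_lr to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # 5e-05 at the start of training
print(linear_lr(4800))  # 2.5e-05 halfway through
print(linear_lr(9600))  # 0.0 at the final step
```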

Training results

Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy
0.2555 1.0 96 0.0709 0.8083 0.9417 0.8700 103 0.8352 0.8596 0.8473 171 0.9333 0.9618 0.9474 131 0.8585 0.9136 0.8852 0.9774
0.0584 2.0 192 0.0591 0.7951 0.9417 0.8622 103 0.8667 0.9123 0.8889 171 0.9697 0.9771 0.9734 131 0.8779 0.9407 0.9082 0.9796
0.0322 3.0 288 0.0348 0.9340 0.9612 0.9474 103 0.9176 0.9123 0.9150 171 0.9697 0.9771 0.9734 131 0.9387 0.9457 0.9422 0.9898
0.0184 4.0 384 0.0499 0.8839 0.9612 0.9209 103 0.8988 0.8830 0.8909 171 0.9769 0.9695 0.9732 131 0.9195 0.9309 0.9252 0.9862
0.0164 5.0 480 0.0517 0.8870 0.9903 0.9358 103 0.9545 0.8596 0.9046 171 0.9767 0.9618 0.9692 131 0.9422 0.9259 0.9340 0.9870
0.0092 6.0 576 0.0621 0.9314 0.9223 0.9268 103 0.9157 0.8889 0.9021 171 0.9767 0.9618 0.9692 131 0.9395 0.9210 0.9302 0.9865
0.0075 7.0 672 0.0698 0.8807 0.9320 0.9057 103 0.9034 0.9298 0.9164 171 0.9394 0.9466 0.9430 131 0.9089 0.9358 0.9221 0.9848
0.0052 8.0 768 0.0524 0.9333 0.9515 0.9423 103 0.9353 0.9298 0.9326 171 0.9690 0.9542 0.9615 131 0.9455 0.9432 0.9444 0.9912
0.0049 9.0 864 0.0636 0.9293 0.8932 0.9109 103 0.9053 0.8947 0.9000 171 0.9615 0.9542 0.9579 131 0.9296 0.9136 0.9215 0.9854
0.005 10.0 960 0.0718 0.8718 0.9903 0.9273 103 0.9281 0.9064 0.9172 171 0.9328 0.9542 0.9434 131 0.9139 0.9432 0.9283 0.9870
0.0049 11.0 1056 0.0600 0.9505 0.9320 0.9412 103 0.9401 0.9181 0.9290 171 0.9615 0.9542 0.9579 131 0.9497 0.9333 0.9415 0.9895
0.0042 12.0 1152 0.0689 0.9099 0.9806 0.9439 103 0.9571 0.9123 0.9341 171 0.9769 0.9695 0.9732 131 0.9505 0.9481 0.9493 0.9887
0.0038 13.0 1248 0.0786 0.9238 0.9417 0.9327 103 0.9571 0.9123 0.9341 171 0.9692 0.9618 0.9655 131 0.9523 0.9358 0.9440 0.9873
0.0035 14.0 1344 0.0781 0.9083 0.9612 0.9340 103 0.9573 0.9181 0.9373 171 0.9470 0.9542 0.9506 131 0.9407 0.9407 0.9407 0.9892
0.0047 15.0 1440 0.0707 0.9231 0.9320 0.9275 103 0.9341 0.9123 0.9231 171 0.9615 0.9542 0.9579 131 0.9401 0.9309 0.9355 0.9884
0.0047 16.0 1536 0.0851 0.8707 0.9806 0.9224 103 0.9419 0.8538 0.8957 171 0.9466 0.9466 0.9466 131 0.9229 0.9160 0.9195 0.9845
0.0024 17.0 1632 0.0803 0.9314 0.9223 0.9268 103 0.9298 0.9298 0.9298 171 0.9688 0.9466 0.9575 131 0.9426 0.9333 0.9380 0.9865
0.005 18.0 1728 0.0702 0.8783 0.9806 0.9266 103 0.9448 0.9006 0.9222 171 0.9462 0.9389 0.9425 131 0.9265 0.9333 0.9299 0.9881
0.0025 19.0 1824 0.0709 0.9167 0.9612 0.9384 103 0.9394 0.9064 0.9226 171 0.9185 0.9466 0.9323 131 0.9265 0.9333 0.9299 0.9876
0.0028 20.0 1920 0.0700 0.8839 0.9612 0.9209 103 0.9375 0.8772 0.9063 171 0.9690 0.9542 0.9615 131 0.9327 0.9235 0.9280 0.9876
0.0046 21.0 2016 0.0884 0.9320 0.9320 0.9320 103 0.9503 0.8947 0.9217 171 0.9624 0.9771 0.9697 131 0.9496 0.9309 0.9401 0.9859
0.0032 22.0 2112 0.0811 0.95 0.9223 0.9360 103 0.9360 0.9415 0.9388 171 0.9398 0.9542 0.9470 131 0.9407 0.9407 0.9407 0.9884
0.0021 23.0 2208 0.0857 0.9083 0.9612 0.9340 103 0.9444 0.8947 0.9189 171 0.9549 0.9695 0.9621 131 0.9381 0.9358 0.9370 0.9876
0.0015 24.0 2304 0.0834 0.8919 0.9612 0.9252 103 0.9557 0.8830 0.9179 171 0.9403 0.9618 0.9509 131 0.9330 0.9284 0.9307 0.9873
0.0024 25.0 2400 0.1184 0.8621 0.9709 0.9132 103 0.9861 0.8304 0.9016 171 0.9624 0.9771 0.9697 131 0.9415 0.9136 0.9273 0.9832
0.0028 26.0 2496 0.0843 0.8850 0.9709 0.9259 103 0.9490 0.8713 0.9085 171 0.9338 0.9695 0.9513 131 0.9261 0.9284 0.9273 0.9865
0.0026 27.0 2592 0.0965 0.8909 0.9515 0.9202 103 0.9673 0.8655 0.9136 171 0.9191 0.9542 0.9363 131 0.9298 0.9160 0.9229 0.9862
0.003 28.0 2688 0.0935 0.9307 0.9126 0.9216 103 0.9565 0.9006 0.9277 171 0.9474 0.9618 0.9545 131 0.9468 0.9235 0.935 0.9870
0.0039 29.0 2784 0.0754 0.8942 0.9029 0.8986 103 0.9455 0.9123 0.9286 171 0.9845 0.9695 0.9769 131 0.9447 0.9284 0.9365 0.9878
0.0035 30.0 2880 0.0852 0.9167 0.9612 0.9384 103 0.9383 0.8889 0.9129 171 0.9407 0.9695 0.9549 131 0.9333 0.9333 0.9333 0.9859
0.0025 31.0 2976 0.1153 0.8621 0.9709 0.9132 103 0.9799 0.8538 0.9125 171 0.9552 0.9771 0.9660 131 0.9373 0.9235 0.9303 0.9843
0.0025 32.0 3072 0.0899 0.9057 0.9320 0.9187 103 0.9268 0.8889 0.9075 171 0.9412 0.9771 0.9588 131 0.9261 0.9284 0.9273 0.9859
0.002 33.0 3168 0.0965 0.8584 0.9417 0.8981 103 0.9304 0.8596 0.8936 171 0.9549 0.9695 0.9621 131 0.9183 0.9160 0.9172 0.9848
0.002 34.0 3264 0.0981 0.8684 0.9612 0.9124 103 0.9735 0.8596 0.9130 171 0.9695 0.9695 0.9695 131 0.9419 0.9210 0.9313 0.9867
0.0013 35.0 3360 0.0809 0.9057 0.9320 0.9187 103 0.9277 0.9006 0.9139 171 0.9692 0.9618 0.9655 131 0.9353 0.9284 0.9318 0.9895
0.0007 36.0 3456 0.0882 0.9238 0.9417 0.9327 103 0.9451 0.9064 0.9254 171 0.9621 0.9695 0.9658 131 0.9451 0.9358 0.9404 0.9887
0.0015 37.0 3552 0.0853 0.9065 0.9417 0.9238 103 0.9441 0.8889 0.9157 171 0.9695 0.9695 0.9695 131 0.9424 0.9284 0.9353 0.9890
0.0004 38.0 3648 0.0811 0.9245 0.9515 0.9378 103 0.9398 0.9123 0.9258 171 0.9767 0.9618 0.9692 131 0.9476 0.9383 0.9429 0.9895
0.0005 39.0 3744 0.1016 0.9074 0.9515 0.9289 103 0.9677 0.8772 0.9202 171 0.9692 0.9618 0.9655 131 0.9517 0.9235 0.9373 0.9881
0.0013 40.0 3840 0.0848 0.9314 0.9223 0.9268 103 0.9290 0.9181 0.9235 171 0.9545 0.9618 0.9582 131 0.9380 0.9333 0.9356 0.9881
0.0003 41.0 3936 0.1044 0.9327 0.9417 0.9372 103 0.9565 0.9006 0.9277 171 0.9621 0.9695 0.9658 131 0.9521 0.9333 0.9426 0.9881
0.0007 42.0 4032 0.0875 0.9238 0.9417 0.9327 103 0.9625 0.9006 0.9305 171 0.9695 0.9695 0.9695 131 0.9545 0.9333 0.9438 0.9884
0.0009 43.0 4128 0.0934 0.9423 0.9515 0.9469 103 0.9571 0.9123 0.9341 171 0.9695 0.9695 0.9695 131 0.9573 0.9407 0.9489 0.9887
0.0011 44.0 4224 0.1034 0.9333 0.9515 0.9423 103 0.9625 0.9006 0.9305 171 0.9545 0.9618 0.9582 131 0.9521 0.9333 0.9426 0.9870
0.0023 45.0 4320 0.0915 0.9167 0.9612 0.9384 103 0.9509 0.9064 0.9281 171 0.9618 0.9618 0.9618 131 0.9453 0.9383 0.9418 0.9876
0.0008 46.0 4416 0.0958 0.9423 0.9515 0.9469 103 0.9627 0.9064 0.9337 171 0.9545 0.9618 0.9582 131 0.9547 0.9358 0.9451 0.9878
0.0017 47.0 4512 0.1030 0.9252 0.9612 0.9429 103 0.9684 0.8947 0.9301 171 0.9545 0.9618 0.9582 131 0.9521 0.9333 0.9426 0.9865
0.0011 48.0 4608 0.0954 0.9346 0.9709 0.9524 103 0.9625 0.9006 0.9305 171 0.9545 0.9618 0.9582 131 0.9524 0.9383 0.9453 0.9878
0.0004 49.0 4704 0.0885 0.9340 0.9612 0.9474 103 0.9458 0.9181 0.9318 171 0.9615 0.9542 0.9579 131 0.9478 0.9407 0.9442 0.9884
0.0006 50.0 4800 0.1008 0.9245 0.9515 0.9378 103 0.9571 0.9123 0.9341 171 0.9542 0.9542 0.9542 131 0.9475 0.9358 0.9416 0.9887
0.0006 51.0 4896 0.1018 0.9333 0.9515 0.9423 103 0.9571 0.9123 0.9341 171 0.9470 0.9542 0.9506 131 0.9475 0.9358 0.9416 0.9892
0.0005 52.0 4992 0.1026 0.9340 0.9612 0.9474 103 0.9811 0.9123 0.9455 171 0.9545 0.9618 0.9582 131 0.9597 0.9407 0.9501 0.9884
0.001 53.0 5088 0.1140 0.9252 0.9612 0.9429 103 0.9565 0.9006 0.9277 171 0.9398 0.9542 0.9470 131 0.9426 0.9333 0.9380 0.9881
0.0014 54.0 5184 0.0966 0.9340 0.9612 0.9474 103 0.9573 0.9181 0.9373 171 0.9338 0.9695 0.9513 131 0.9433 0.9457 0.9445 0.9878
0.0007 55.0 5280 0.1186 0.9252 0.9612 0.9429 103 0.9808 0.8947 0.9358 171 0.9545 0.9618 0.9582 131 0.9570 0.9333 0.9450 0.9867
0.0017 56.0 5376 0.0981 0.9266 0.9806 0.9528 103 0.9691 0.9181 0.9429 171 0.9474 0.9618 0.9545 131 0.9505 0.9481 0.9493 0.9890
0.0011 57.0 5472 0.0864 0.9434 0.9709 0.9569 103 0.9752 0.9181 0.9458 171 0.9470 0.9542 0.9506 131 0.9574 0.9432 0.9502 0.9898
0.0004 58.0 5568 0.1228 0.9009 0.9709 0.9346 103 0.9739 0.8713 0.9198 171 0.9545 0.9618 0.9582 131 0.9470 0.9259 0.9363 0.9851
0.0007 59.0 5664 0.0882 0.9346 0.9709 0.9524 103 0.9573 0.9181 0.9373 171 0.9545 0.9618 0.9582 131 0.9504 0.9457 0.9480 0.9898
0.0011 60.0 5760 0.0786 0.9346 0.9709 0.9524 103 0.9451 0.9064 0.9254 171 0.9478 0.9695 0.9585 131 0.9432 0.9432 0.9432 0.9903
0.0021 61.0 5856 0.0761 0.9434 0.9709 0.9569 103 0.9634 0.9240 0.9433 171 0.9478 0.9695 0.9585 131 0.9530 0.9506 0.9518 0.9906
0.0009 62.0 5952 0.0789 0.9346 0.9709 0.9524 103 0.9752 0.9181 0.9458 171 0.9403 0.9618 0.9509 131 0.9527 0.9457 0.9492 0.9903
0.0008 63.0 6048 0.0725 0.9346 0.9709 0.9524 103 0.9874 0.9181 0.9515 171 0.9549 0.9695 0.9621 131 0.9624 0.9481 0.9552 0.9912
0.0006 64.0 6144 0.0742 0.9346 0.9709 0.9524 103 0.9632 0.9181 0.9401 171 0.9552 0.9771 0.9660 131 0.9530 0.9506 0.9518 0.9909
0.0002 65.0 6240 0.0765 0.9346 0.9709 0.9524 103 0.9632 0.9181 0.9401 171 0.9552 0.9771 0.9660 131 0.9530 0.9506 0.9518 0.9909
0.0005 66.0 6336 0.0768 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9545 0.9618 0.9582 131 0.9574 0.9432 0.9502 0.9906
0.0003 67.0 6432 0.0800 0.9346 0.9709 0.9524 103 0.9811 0.9123 0.9455 171 0.9545 0.9618 0.9582 131 0.9598 0.9432 0.9514 0.9898
0.0002 68.0 6528 0.0818 0.9346 0.9709 0.9524 103 0.9811 0.9123 0.9455 171 0.9549 0.9695 0.9621 131 0.9599 0.9457 0.9527 0.9901
0.0002 69.0 6624 0.0801 0.9346 0.9709 0.9524 103 0.9755 0.9298 0.9521 171 0.9549 0.9695 0.9621 131 0.9578 0.9531 0.9554 0.9914
0.0003 70.0 6720 0.0813 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9549 0.9695 0.9621 131 0.9575 0.9457 0.9516 0.9906
0.0002 71.0 6816 0.0809 0.9434 0.9709 0.9569 103 0.9641 0.9415 0.9527 171 0.9549 0.9695 0.9621 131 0.9557 0.9580 0.9568 0.9920
0.0003 72.0 6912 0.0836 0.9346 0.9709 0.9524 103 0.9755 0.9298 0.9521 171 0.9549 0.9695 0.9621 131 0.9578 0.9531 0.9554 0.9914
0.0002 73.0 7008 0.0845 0.9346 0.9709 0.9524 103 0.9753 0.9240 0.9489 171 0.9549 0.9695 0.9621 131 0.9577 0.9506 0.9542 0.9912
0.0002 74.0 7104 0.0826 0.9346 0.9709 0.9524 103 0.9697 0.9357 0.9524 171 0.9549 0.9695 0.9621 131 0.9556 0.9556 0.9556 0.9914
0.0002 75.0 7200 0.0884 0.9346 0.9709 0.9524 103 0.9756 0.9357 0.9552 171 0.9478 0.9695 0.9585 131 0.9556 0.9556 0.9556 0.9914
0.0004 76.0 7296 0.0857 0.9346 0.9709 0.9524 103 0.9752 0.9181 0.9458 171 0.9549 0.9695 0.9621 131 0.9576 0.9481 0.9529 0.9909
0.0002 77.0 7392 0.0895 0.9346 0.9709 0.9524 103 0.9752 0.9181 0.9458 171 0.9478 0.9695 0.9585 131 0.9552 0.9481 0.9517 0.9906
0.0002 78.0 7488 0.0913 0.9259 0.9709 0.9479 103 0.9568 0.9064 0.9309 171 0.9478 0.9695 0.9585 131 0.9455 0.9432 0.9444 0.9901
0.0002 79.0 7584 0.0814 0.9434 0.9709 0.9569 103 0.9755 0.9298 0.9521 171 0.9549 0.9695 0.9621 131 0.9602 0.9531 0.9566 0.9917
0.0003 80.0 7680 0.0857 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9549 0.9695 0.9621 131 0.9575 0.9457 0.9516 0.9906
0.0006 81.0 7776 0.0870 0.9346 0.9709 0.9524 103 0.9811 0.9123 0.9455 171 0.9549 0.9695 0.9621 131 0.9599 0.9457 0.9527 0.9903
0.0002 82.0 7872 0.0980 0.9346 0.9709 0.9524 103 0.9811 0.9123 0.9455 171 0.9407 0.9695 0.9549 131 0.9551 0.9457 0.9504 0.9898
0.0003 83.0 7968 0.0886 0.9434 0.9709 0.9569 103 0.9874 0.9181 0.9515 171 0.9621 0.9695 0.9658 131 0.9673 0.9481 0.9576 0.9914
0.0002 84.0 8064 0.0884 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9549 0.9695 0.9621 131 0.9575 0.9457 0.9516 0.9906
0.0002 85.0 8160 0.0879 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9478 0.9695 0.9585 131 0.9551 0.9457 0.9504 0.9903
0.0003 86.0 8256 0.0875 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9407 0.9695 0.9549 131 0.9576 0.9481 0.9529 0.9903
0.0005 87.0 8352 0.0859 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0002 88.0 8448 0.0863 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0001 89.0 8544 0.0865 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9549 0.9695 0.9621 131 0.9624 0.9481 0.9552 0.9909
0.0006 90.0 8640 0.0859 0.9434 0.9709 0.9569 103 0.9691 0.9181 0.9429 171 0.9407 0.9695 0.9549 131 0.9529 0.9481 0.9505 0.9909
0.0004 91.0 8736 0.0870 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9407 0.9695 0.9549 131 0.9576 0.9481 0.9529 0.9903
0.0004 92.0 8832 0.0870 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0001 93.0 8928 0.0873 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0002 94.0 9024 0.0900 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9407 0.9695 0.9549 131 0.9527 0.9457 0.9492 0.9901
0.0001 95.0 9120 0.0899 0.9346 0.9709 0.9524 103 0.975 0.9123 0.9426 171 0.9478 0.9695 0.9585 131 0.9551 0.9457 0.9504 0.9903
0.0001 96.0 9216 0.0898 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0004 97.0 9312 0.0890 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9407 0.9695 0.9549 131 0.9576 0.9481 0.9529 0.9903
0.0004 98.0 9408 0.0886 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9478 0.9695 0.9585 131 0.96 0.9481 0.9540 0.9906
0.0001 99.0 9504 0.0872 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9549 0.9695 0.9621 131 0.9624 0.9481 0.9552 0.9909
0.0002 100.0 9600 0.0870 0.9434 0.9709 0.9569 103 0.9812 0.9181 0.9486 171 0.9549 0.9695 0.9621 131 0.9624 0.9481 0.9552 0.9909

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
Model size

  • 110M parameters (Safetensors, F32 tensors)