nerui-base-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set, where "Number" is the support, i.e. the count of gold entities of each type (a sketch of how such metrics are computed follows the list):

  • Loss: 0.1084
  • Location Precision: 0.89
  • Location Recall: 0.9468
  • Location F1: 0.9175
  • Location Number: 94
  • Organization Precision: 0.9387
  • Organization Recall: 0.9162
  • Organization F1: 0.9273
  • Organization Number: 167
  • Person Precision: 1.0
  • Person Recall: 0.9781
  • Person F1: 0.9889
  • Person Number: 137
  • Overall Precision: 0.9471
  • Overall Recall: 0.9447
  • Overall F1: 0.9459
  • Overall Accuracy: 0.9887
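
The per-entity layout of these metrics (precision, recall, F1, and support per label, plus overall accuracy) matches the output of the seqeval metric from the evaluate library. Below is a minimal sketch, assuming that metric was used; the tag sequences are purely illustrative:

```python
# A minimal sketch, assuming evaluation used the seqeval-based "seqeval"
# metric from the evaluate library (requires the seqeval package).
import evaluate

seqeval = evaluate.load("seqeval")

# Illustrative predicted and gold tag sequences (one sentence).
predictions = [["O", "B-PER", "I-PER", "O", "B-LOC"]]
references = [["O", "B-PER", "I-PER", "O", "B-ORG"]]

results = seqeval.compute(predictions=predictions, references=references)
# results holds per-entity dicts, e.g. results["PER"]["precision"] and
# results["PER"]["number"], plus "overall_precision", "overall_recall",
# "overall_f1", and "overall_accuracy" keys.
print(results)
```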

Model description

More information needed

Intended uses & limitations

More information needed
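
Since usage details are not documented, the following is a hypothetical sketch of how a token-classification checkpoint like this one is typically loaded; the example sentence is illustrative only:

```python
# A hypothetical usage sketch; actual intended uses and limitations of
# this model are not documented in the card.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-base-0",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

print(ner("Joko Widodo menghadiri rapat di Jakarta."))
# Output is a list of dicts with entity_group, score, word, start, end;
# the actual label names depend on the model's label map.
```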

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of an equivalent Trainer configuration follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
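
These values map directly onto the standard Hugging Face Trainer API. The sketch below reconstructs them under that assumption; the label set (BIO tags for LOC, ORG, PER), output directory, and evaluation strategy are inferred from the metrics above, and the training data itself is not documented:

```python
# A minimal sketch of an equivalent Trainer configuration, assuming the
# standard Hugging Face Trainer API was used.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

# Assumed BIO label scheme, inferred from the reported entity types.
labels = ["O", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-PER", "I-PER"]

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# Mirrors the listed hyperparameters. Trainer's default AdamW optimizer
# already uses betas=(0.9, 0.999) and epsilon=1e-8, matching the card.
args = TrainingArguments(
    output_dir="nerui-base-0",    # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    evaluation_strategy="epoch",  # assumed from the per-epoch table below
)

# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# would complete the setup once the (undocumented) dataset is tokenized
# and its labels aligned to word pieces.
```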

Training results

One row per epoch (Loc/Org/Per = entity types, All = overall; P = precision, R = recall, N = support, Acc = accuracy):

Train-Loss Epoch Step Val-Loss Loc-P Loc-R Loc-F1 Loc-N Org-P Org-R Org-F1 Org-N Per-P Per-R Per-F1 Per-N All-P All-R All-F1 All-Acc
0.2566 1.0 96 0.0455 0.9634 0.8404 0.8977 94 0.8333 0.9281 0.8782 167 0.9708 0.9708 0.9708 137 0.9062 0.9221 0.9141 0.9843
0.0617 2.0 192 0.0519 0.8381 0.9362 0.8844 94 0.8896 0.8683 0.8788 167 0.9926 0.9781 0.9853 137 0.9107 0.9221 0.9164 0.9834
0.0356 3.0 288 0.0534 0.9062 0.9255 0.9158 94 0.8211 0.9341 0.8739 167 1.0 0.9708 0.9852 137 0.8974 0.9447 0.9204 0.9840
0.0235 4.0 384 0.0525 0.8866 0.9149 0.9005 94 0.9006 0.9222 0.9112 167 1.0 0.9781 0.9889 137 0.9303 0.9397 0.9350 0.9856
0.0156 5.0 480 0.0623 0.9032 0.8936 0.8984 94 0.9333 0.9222 0.9277 167 0.9926 0.9781 0.9853 137 0.9466 0.9347 0.9406 0.9873
0.0101 6.0 576 0.0590 0.9043 0.9043 0.9043 94 0.8929 0.8982 0.8955 167 0.9926 0.9781 0.9853 137 0.9295 0.9271 0.9283 0.9859
0.0091 7.0 672 0.0955 0.8036 0.9574 0.8738 94 0.9211 0.8383 0.8777 167 0.9643 0.9854 0.9747 137 0.9035 0.9171 0.9102 0.9809
0.0084 8.0 768 0.0871 0.8365 0.9255 0.8788 94 0.9062 0.8683 0.8869 167 1.0 0.9781 0.9889 137 0.9196 0.9196 0.9196 0.9826
0.007 9.0 864 0.0629 0.9565 0.9362 0.9462 94 0.8895 0.9162 0.9027 167 1.0 0.9854 0.9926 137 0.9424 0.9447 0.9435 0.9881
0.0047 10.0 960 0.0564 0.9167 0.9362 0.9263 94 0.9512 0.9341 0.9426 167 1.0 0.9781 0.9889 137 0.9594 0.9497 0.9545 0.9901
0.0043 11.0 1056 0.0829 0.9158 0.9255 0.9206 94 0.8708 0.9281 0.8986 167 0.9926 0.9781 0.9853 137 0.9216 0.9447 0.9330 0.9856
0.0034 12.0 1152 0.0779 0.9247 0.9149 0.9198 94 0.8667 0.9341 0.8991 167 0.9926 0.9781 0.9853 137 0.9216 0.9447 0.9330 0.9865
0.0047 13.0 1248 0.0781 0.8922 0.9681 0.9286 94 0.95 0.9102 0.9297 167 0.9854 0.9854 0.9854 137 0.9474 0.9497 0.9486 0.9862
0.006 14.0 1344 0.0682 0.9271 0.9468 0.9368 94 0.9236 0.8683 0.8951 167 1.0 0.9781 0.9889 137 0.9509 0.9246 0.9376 0.9859
0.0031 15.0 1440 0.0759 0.9149 0.9149 0.9149 94 0.8814 0.9341 0.9070 167 0.9926 0.9781 0.9853 137 0.9261 0.9447 0.9353 0.9878
0.0049 16.0 1536 0.0801 0.9082 0.9468 0.9271 94 0.9107 0.9162 0.9134 167 0.9574 0.9854 0.9712 137 0.9263 0.9472 0.9366 0.9865
0.0036 17.0 1632 0.0933 0.9278 0.9574 0.9424 94 0.9333 0.9222 0.9277 167 0.9853 0.9781 0.9817 137 0.9497 0.9497 0.9497 0.9887
0.0033 18.0 1728 0.0828 0.9167 0.9362 0.9263 94 0.9167 0.9222 0.9194 167 0.9926 0.9781 0.9853 137 0.9424 0.9447 0.9435 0.9870
0.0031 19.0 1824 0.0819 0.9149 0.9149 0.9149 94 0.9102 0.9102 0.9102 167 0.9708 0.9708 0.9708 137 0.9322 0.9322 0.9322 0.9873
0.0025 20.0 1920 0.0871 0.8969 0.9255 0.9110 94 0.9321 0.9042 0.9179 167 0.9708 0.9708 0.9708 137 0.9369 0.9322 0.9345 0.9878
0.0023 21.0 2016 0.0813 0.89 0.9468 0.9175 94 0.9162 0.9162 0.9162 167 0.9706 0.9635 0.9670 137 0.9280 0.9397 0.9338 0.9873
0.0023 22.0 2112 0.0885 0.9158 0.9255 0.9206 94 0.8814 0.9341 0.9070 167 1.0 0.9635 0.9814 137 0.9282 0.9422 0.9352 0.9867
0.0018 23.0 2208 0.1209 0.8788 0.9255 0.9016 94 0.8947 0.9162 0.9053 167 0.9779 0.9708 0.9744 137 0.9187 0.9372 0.9279 0.9837
0.0036 24.0 2304 0.0841 0.9175 0.9468 0.9319 94 0.9029 0.9461 0.9240 167 0.9853 0.9781 0.9817 137 0.9338 0.9573 0.9454 0.9878
0.0034 25.0 2400 0.0860 0.9368 0.9468 0.9418 94 0.9186 0.9461 0.9322 167 0.9926 0.9781 0.9853 137 0.9478 0.9573 0.9525 0.9884
0.0029 26.0 2496 0.0684 0.9381 0.9681 0.9529 94 0.9176 0.9341 0.9258 167 0.9926 0.9781 0.9853 137 0.9478 0.9573 0.9525 0.9898
0.0031 27.0 2592 0.1158 0.9278 0.9574 0.9424 94 0.8933 0.9521 0.9217 167 0.9926 0.9781 0.9853 137 0.9341 0.9623 0.9480 0.9865
0.0045 28.0 2688 0.0860 0.9263 0.9362 0.9312 94 0.8963 0.8802 0.8882 167 0.9926 0.9781 0.9853 137 0.9365 0.9271 0.9318 0.9854
0.0018 29.0 2784 0.0869 0.9271 0.9468 0.9368 94 0.9290 0.9401 0.9345 167 0.9926 0.9781 0.9853 137 0.95 0.9548 0.9524 0.9884
0.0023 30.0 2880 0.1042 0.9184 0.9574 0.9375 94 0.9394 0.9281 0.9337 167 1.0 0.9781 0.9889 137 0.9547 0.9523 0.9535 0.9881
0.0028 31.0 2976 0.1003 0.9020 0.9787 0.9388 94 0.9118 0.9281 0.9199 167 0.9853 0.9781 0.9817 137 0.9338 0.9573 0.9454 0.9862
0.0015 32.0 3072 0.0802 0.91 0.9681 0.9381 94 0.9353 0.9521 0.9436 167 0.9853 0.9781 0.9817 137 0.9458 0.9648 0.9552 0.9890
0.0025 33.0 3168 0.0959 0.8667 0.9681 0.9146 94 0.9375 0.8982 0.9174 167 1.0 0.9781 0.9889 137 0.9398 0.9422 0.9410 0.9862
0.0014 34.0 3264 0.0970 0.9184 0.9574 0.9375 94 0.9286 0.9341 0.9313 167 1.0 0.9781 0.9889 137 0.95 0.9548 0.9524 0.9881
0.0017 35.0 3360 0.0790 0.9570 0.9468 0.9519 94 0.9123 0.9341 0.9231 167 0.9926 0.9781 0.9853 137 0.9499 0.9523 0.9511 0.9890
0.002 36.0 3456 0.0912 0.9010 0.9681 0.9333 94 0.9317 0.8982 0.9146 167 0.9853 0.9781 0.9817 137 0.9422 0.9422 0.9422 0.9870
0.0025 37.0 3552 0.1061 0.9271 0.9468 0.9368 94 0.9030 0.8922 0.8976 167 1.0 0.9781 0.9889 137 0.9418 0.9347 0.9382 0.9865
0.0028 38.0 3648 0.0982 0.9184 0.9574 0.9375 94 0.9085 0.8922 0.9003 167 1.0 0.9781 0.9889 137 0.9419 0.9372 0.9395 0.9870
0.0022 39.0 3744 0.1061 0.8969 0.9255 0.9110 94 0.8953 0.9222 0.9086 167 1.0 0.9781 0.9889 137 0.9305 0.9422 0.9363 0.9848
0.0018 40.0 3840 0.1077 0.8980 0.9362 0.9167 94 0.9202 0.8982 0.9091 167 1.0 0.9781 0.9889 137 0.9418 0.9347 0.9382 0.9862
0.002 41.0 3936 0.0923 0.8980 0.9362 0.9167 94 0.9325 0.9102 0.9212 167 1.0 0.9781 0.9889 137 0.9468 0.9397 0.9433 0.9870
0.003 42.0 4032 0.0899 0.9053 0.9149 0.9101 94 0.9112 0.9222 0.9167 167 0.9853 0.9781 0.9817 137 0.935 0.9397 0.9373 0.9862
0.0027 43.0 4128 0.0827 0.9355 0.9255 0.9305 94 0.9277 0.9222 0.9249 167 1.0 0.9781 0.9889 137 0.9542 0.9422 0.9482 0.9878
0.0015 44.0 4224 0.0798 0.9149 0.9149 0.9149 94 0.9102 0.9102 0.9102 167 1.0 0.9781 0.9889 137 0.9418 0.9347 0.9382 0.9878
0.0011 45.0 4320 0.0868 0.8958 0.9149 0.9053 94 0.9313 0.8922 0.9113 167 0.9853 0.9781 0.9817 137 0.9413 0.9271 0.9342 0.9881
0.0012 46.0 4416 0.0743 0.8922 0.9681 0.9286 94 0.9679 0.9042 0.9350 167 0.9852 0.9708 0.9779 137 0.9542 0.9422 0.9482 0.9903
0.0012 47.0 4512 0.0870 0.9072 0.9362 0.9215 94 0.9375 0.8982 0.9174 167 0.9853 0.9781 0.9817 137 0.9466 0.9347 0.9406 0.9884
0.0019 48.0 4608 0.0759 0.89 0.9468 0.9175 94 0.9308 0.8862 0.9080 167 0.9779 0.9708 0.9744 137 0.9367 0.9296 0.9332 0.9881
0.0015 49.0 4704 0.0810 0.9271 0.9468 0.9368 94 0.9176 0.9341 0.9258 167 1.0 0.9781 0.9889 137 0.9475 0.9523 0.9499 0.9895
0.0011 50.0 4800 0.0890 0.9082 0.9468 0.9271 94 0.9506 0.9222 0.9362 167 0.9853 0.9781 0.9817 137 0.9520 0.9472 0.9496 0.9890
0.0007 51.0 4896 0.0827 0.9167 0.9362 0.9263 94 0.9341 0.9341 0.9341 167 0.9853 0.9781 0.9817 137 0.9474 0.9497 0.9486 0.9895
0.001 52.0 4992 0.0873 0.8980 0.9362 0.9167 94 0.9281 0.9281 0.9281 167 0.9926 0.9781 0.9853 137 0.9425 0.9472 0.9449 0.9887
0.001 53.0 5088 0.0820 0.8980 0.9362 0.9167 94 0.9394 0.9281 0.9337 167 0.9852 0.9708 0.9779 137 0.9447 0.9447 0.9447 0.9890
0.0004 54.0 5184 0.0917 0.8911 0.9574 0.9231 94 0.9434 0.8982 0.9202 167 0.9853 0.9781 0.9817 137 0.9444 0.9397 0.9421 0.9867
0.0006 55.0 5280 0.1053 0.8980 0.9362 0.9167 94 0.9333 0.9222 0.9277 167 0.9926 0.9781 0.9853 137 0.9447 0.9447 0.9447 0.9884
0.001 56.0 5376 0.1040 0.8990 0.9468 0.9223 94 0.9333 0.9222 0.9277 167 0.9853 0.9781 0.9817 137 0.9425 0.9472 0.9449 0.9881
0.0005 57.0 5472 0.1042 0.8990 0.9468 0.9223 94 0.9337 0.9281 0.9309 167 0.9926 0.9781 0.9853 137 0.945 0.9497 0.9474 0.9884
0.0009 58.0 5568 0.1057 0.9082 0.9468 0.9271 94 0.9202 0.8982 0.9091 167 0.9853 0.9781 0.9817 137 0.9395 0.9372 0.9384 0.9876
0.001 59.0 5664 0.1034 0.8911 0.9574 0.9231 94 0.9277 0.9222 0.9249 167 1.0 0.9781 0.9889 137 0.9426 0.9497 0.9462 0.9873
0.0012 60.0 5760 0.0910 0.9072 0.9362 0.9215 94 0.9337 0.9281 0.9309 167 0.9779 0.9708 0.9744 137 0.9424 0.9447 0.9435 0.9887
0.0008 61.0 5856 0.0987 0.9247 0.9149 0.9198 94 0.9102 0.9102 0.9102 167 0.9779 0.9708 0.9744 137 0.9369 0.9322 0.9345 0.9862
0.0005 62.0 5952 0.1056 0.8889 0.9362 0.9119 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9470 0.9422 0.9446 0.9876
0.0006 63.0 6048 0.1050 0.8980 0.9362 0.9167 94 0.9268 0.9102 0.9184 167 0.9926 0.9781 0.9853 137 0.9421 0.9397 0.9409 0.9873
0.0013 64.0 6144 0.0956 0.9072 0.9362 0.9215 94 0.9329 0.9162 0.9245 167 1.0 0.9781 0.9889 137 0.9494 0.9422 0.9458 0.9884
0.0006 65.0 6240 0.1061 0.9082 0.9468 0.9271 94 0.9313 0.8922 0.9113 167 1.0 0.9781 0.9889 137 0.9490 0.9347 0.9418 0.9854
0.0008 66.0 6336 0.1032 0.8980 0.9362 0.9167 94 0.9325 0.9102 0.9212 167 0.9926 0.9781 0.9853 137 0.9444 0.9397 0.9421 0.9881
0.0004 67.0 6432 0.0961 0.8980 0.9362 0.9167 94 0.9273 0.9162 0.9217 167 1.0 0.9781 0.9889 137 0.9446 0.9422 0.9434 0.9890
0.0008 68.0 6528 0.0979 0.88 0.9362 0.9072 94 0.925 0.8862 0.9052 167 0.9926 0.9781 0.9853 137 0.9367 0.9296 0.9332 0.9870
0.0013 69.0 6624 0.1021 0.89 0.9468 0.9175 94 0.9162 0.9162 0.9162 167 1.0 0.9781 0.9889 137 0.9377 0.9447 0.9412 0.9870
0.0004 70.0 6720 0.0933 0.88 0.9362 0.9072 94 0.9264 0.9042 0.9152 167 1.0 0.9781 0.9889 137 0.9395 0.9372 0.9384 0.9881
0.001 71.0 6816 0.0892 0.8788 0.9255 0.9016 94 0.9264 0.9042 0.9152 167 0.9852 0.9708 0.9779 137 0.9345 0.9322 0.9333 0.9881
0.0006 72.0 6912 0.0966 0.9091 0.9574 0.9326 94 0.9509 0.9281 0.9394 167 0.9926 0.9781 0.9853 137 0.9547 0.9523 0.9535 0.9892
0.0006 73.0 7008 0.0997 0.8911 0.9574 0.9231 94 0.9441 0.9102 0.9268 167 1.0 0.9781 0.9889 137 0.9495 0.9447 0.9471 0.9884
0.0004 74.0 7104 0.1035 0.8824 0.9574 0.9184 94 0.9497 0.9042 0.9264 167 0.9926 0.9781 0.9853 137 0.9470 0.9422 0.9446 0.9881
0.0005 75.0 7200 0.1036 0.8788 0.9255 0.9016 94 0.9371 0.8922 0.9141 167 0.9852 0.9708 0.9779 137 0.9389 0.9271 0.9330 0.9870
0.0004 76.0 7296 0.0978 0.8788 0.9255 0.9016 94 0.9317 0.8982 0.9146 167 0.9638 0.9708 0.9673 137 0.9296 0.9296 0.9296 0.9867
0.0004 77.0 7392 0.0896 0.88 0.9362 0.9072 94 0.9273 0.9162 0.9217 167 0.9926 0.9781 0.9853 137 0.9375 0.9422 0.9398 0.9887
0.0007 78.0 7488 0.1034 0.8889 0.9362 0.9119 94 0.9308 0.8862 0.9080 167 1.0 0.9781 0.9889 137 0.9439 0.9296 0.9367 0.9878
0.0004 79.0 7584 0.1117 0.8812 0.9468 0.9128 94 0.9259 0.8982 0.9119 167 1.0 0.9781 0.9889 137 0.9395 0.9372 0.9384 0.9873
0.0006 80.0 7680 0.1053 0.8980 0.9362 0.9167 94 0.9017 0.9341 0.9176 167 1.0 0.9781 0.9889 137 0.9333 0.9497 0.9415 0.9873
0.0003 81.0 7776 0.1023 0.8980 0.9362 0.9167 94 0.9222 0.9222 0.9222 167 1.0 0.9781 0.9889 137 0.9424 0.9447 0.9435 0.9884
0.0005 82.0 7872 0.0998 0.8990 0.9468 0.9223 94 0.9281 0.9281 0.9281 167 1.0 0.9781 0.9889 137 0.945 0.9497 0.9474 0.9887
0.0004 83.0 7968 0.1031 0.8980 0.9362 0.9167 94 0.9222 0.9222 0.9222 167 1.0 0.9781 0.9889 137 0.9424 0.9447 0.9435 0.9884
0.0002 84.0 8064 0.1076 0.9072 0.9362 0.9215 94 0.9273 0.9162 0.9217 167 1.0 0.9781 0.9889 137 0.9470 0.9422 0.9446 0.9890
0.0008 85.0 8160 0.1031 0.9062 0.9255 0.9158 94 0.9273 0.9162 0.9217 167 0.9925 0.9708 0.9815 137 0.9443 0.9372 0.9407 0.9887
0.0003 86.0 8256 0.0967 0.9062 0.9255 0.9158 94 0.9383 0.9102 0.9240 167 0.9925 0.9708 0.9815 137 0.9490 0.9347 0.9418 0.9892
0.0005 87.0 8352 0.0978 0.8889 0.9362 0.9119 94 0.9317 0.8982 0.9146 167 1.0 0.9781 0.9889 137 0.9442 0.9347 0.9394 0.9884
0.0003 88.0 8448 0.1104 0.8889 0.9362 0.9119 94 0.9375 0.8982 0.9174 167 1.0 0.9781 0.9889 137 0.9466 0.9347 0.9406 0.9881
0.0005 89.0 8544 0.1069 0.89 0.9468 0.9175 94 0.9441 0.9102 0.9268 167 1.0 0.9781 0.9889 137 0.9494 0.9422 0.9458 0.9887
0.0003 90.0 8640 0.1071 0.89 0.9468 0.9175 94 0.9441 0.9102 0.9268 167 1.0 0.9781 0.9889 137 0.9494 0.9422 0.9458 0.9887
0.0005 91.0 8736 0.1068 0.89 0.9468 0.9175 94 0.9441 0.9102 0.9268 167 1.0 0.9781 0.9889 137 0.9494 0.9422 0.9458 0.9887
0.0004 92.0 8832 0.1078 0.89 0.9468 0.9175 94 0.9444 0.9162 0.9301 167 1.0 0.9781 0.9889 137 0.9495 0.9447 0.9471 0.9890
0.0003 93.0 8928 0.1079 0.89 0.9468 0.9175 94 0.9444 0.9162 0.9301 167 1.0 0.9781 0.9889 137 0.9495 0.9447 0.9471 0.9890
0.0004 94.0 9024 0.1082 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887
0.0003 95.0 9120 0.1080 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887
0.0003 96.0 9216 0.1082 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887
0.0002 97.0 9312 0.1080 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887
0.0003 98.0 9408 0.1080 0.89 0.9468 0.9175 94 0.9444 0.9162 0.9301 167 1.0 0.9781 0.9889 137 0.9495 0.9447 0.9471 0.9890
0.0003 99.0 9504 0.1085 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887
0.0002 100.0 9600 0.1084 0.89 0.9468 0.9175 94 0.9387 0.9162 0.9273 167 1.0 0.9781 0.9889 137 0.9471 0.9447 0.9459 0.9887

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2