nerui-lora-r8-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0395
  • Location Precision: 0.88
  • Location Recall: 0.9462
  • Location F1: 0.9119
  • Location Number: 93
  • Organization Precision: 0.9048
  • Organization Recall: 0.9157
  • Organization F1: 0.9102
  • Organization Number: 166
  • Person Precision: 0.9718
  • Person Recall: 0.9718
  • Person F1: 0.9718
  • Person Number: 142
  • Overall Precision: 0.9220
  • Overall Recall: 0.9426
  • Overall F1: 0.9322
  • Overall Accuracy: 0.9874
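
Judging by the model name, this checkpoint appears to be a LoRA adapter (rank 8) rather than a full set of fine-tuned weights, so the base model would be loaded first and the adapter applied on top. The snippet below is a minimal inference sketch, not an official usage example: loading via the peft library, the label count, and the IOB scheme over the location/organization/person types reported above are all assumptions.

```python
# Minimal inference sketch. Assumptions: the repo hosts a PEFT LoRA adapter
# for token classification, and the label set is an IOB scheme over the
# three entity types reported above (7 labels: O + B-/I- for LOC, ORG, PER).
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline
from peft import PeftModel

BASE = "indolem/indobert-base-uncased"
ADAPTER = "apwic/nerui-lora-r8-2"

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForTokenClassification.from_pretrained(BASE, num_labels=7)
model = PeftModel.from_pretrained(model, ADAPTER)
model = model.merge_and_unload()  # fold the LoRA weights into the base model

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge subword pieces into entity spans
)
print(ner("Joko Widodo mengunjungi kantor Google di Jakarta."))
```

Note that if the config lacks an `id2label` mapping, the pipeline falls back to generic `LABEL_*` names, so `model.config.id2label` may need to be set to the adapter's actual label mapping.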

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
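
For reference, a rough reconstruction of this setup with transformers and peft is sketched below. Only the hyperparameters listed above come from this card; the LoRA rank of 8 is inferred from the model name, the label count is assumed, the LoRA target modules are left at the library defaults, and the datasets are placeholders since the training data is not documented.

```python
# Training-setup sketch. Assumptions: LoRA rank 8 (from "lora-r8" in the
# model name), 7 labels, default LoRA target modules, placeholder datasets;
# the hyperparameters mirror the list above (Adam betas/epsilon are defaults).
from transformers import AutoModelForTokenClassification, Trainer, TrainingArguments
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased", num_labels=7)
model = get_peft_model(model, LoraConfig(r=8, task_type=TaskType.TOKEN_CLS))

args = TrainingArguments(
    output_dir="nerui-lora-r8-2",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the table reports per-epoch eval
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=None,  # placeholder: the training data is undocumented
    eval_dataset=None,   # placeholder
)
# trainer.train()
```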

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 1.1704 | 1.0 | 96 | 0.7085 | 0.0 | 0.0 | 0.0 | 93 | 0.0 | 0.0 | 0.0 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.0 | 0.0 | 0.0 | 0.8343 |
| 0.668 | 2.0 | 192 | 0.5723 | 0.0 | 0.0 | 0.0 | 93 | 0.5 | 0.0060 | 0.0119 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.3333 | 0.0025 | 0.0050 | 0.8348 |
| 0.5537 | 3.0 | 288 | 0.4494 | 0.0 | 0.0 | 0.0 | 93 | 0.4167 | 0.0602 | 0.1053 | 166 | 0.2353 | 0.0563 | 0.0909 | 142 | 0.3 | 0.0449 | 0.0781 | 0.8455 |
| 0.4382 | 4.0 | 384 | 0.3281 | 0.2727 | 0.0645 | 0.1043 | 93 | 0.3710 | 0.2771 | 0.3172 | 166 | 0.3882 | 0.4648 | 0.4231 | 142 | 0.3734 | 0.2943 | 0.3291 | 0.8883 |
| 0.32 | 5.0 | 480 | 0.2350 | 0.3857 | 0.2903 | 0.3313 | 93 | 0.5231 | 0.6145 | 0.5651 | 166 | 0.5886 | 0.7254 | 0.6498 | 142 | 0.5273 | 0.5786 | 0.5517 | 0.9292 |
| 0.2426 | 6.0 | 576 | 0.1839 | 0.5745 | 0.5806 | 0.5775 | 93 | 0.6158 | 0.7530 | 0.6775 | 166 | 0.7636 | 0.8873 | 0.8208 | 142 | 0.6602 | 0.7606 | 0.7068 | 0.9512 |
| 0.1962 | 7.0 | 672 | 0.1463 | 0.7188 | 0.7419 | 0.7302 | 93 | 0.6804 | 0.7952 | 0.7333 | 166 | 0.8903 | 0.9718 | 0.9293 | 142 | 0.7618 | 0.8454 | 0.8014 | 0.9619 |
| 0.1696 | 8.0 | 768 | 0.1200 | 0.7732 | 0.8065 | 0.7895 | 93 | 0.7312 | 0.8193 | 0.7727 | 166 | 0.9133 | 0.9648 | 0.9384 | 142 | 0.8037 | 0.8678 | 0.8345 | 0.9682 |
| 0.1508 | 9.0 | 864 | 0.1069 | 0.8 | 0.8602 | 0.8290 | 93 | 0.7473 | 0.8373 | 0.7898 | 166 | 0.9079 | 0.9718 | 0.9388 | 142 | 0.8151 | 0.8903 | 0.8510 | 0.9695 |
| 0.1359 | 10.0 | 960 | 0.0937 | 0.7980 | 0.8495 | 0.8229 | 93 | 0.7581 | 0.8494 | 0.8011 | 166 | 0.9195 | 0.9648 | 0.9416 | 142 | 0.8226 | 0.8903 | 0.8551 | 0.9712 |
| 0.126 | 11.0 | 1056 | 0.0873 | 0.7843 | 0.8602 | 0.8205 | 93 | 0.7772 | 0.8614 | 0.8171 | 166 | 0.9133 | 0.9648 | 0.9384 | 142 | 0.8257 | 0.8978 | 0.8602 | 0.9726 |
| 0.1191 | 12.0 | 1152 | 0.0826 | 0.7885 | 0.8817 | 0.8325 | 93 | 0.7861 | 0.8855 | 0.8329 | 166 | 0.9195 | 0.9648 | 0.9416 | 142 | 0.8318 | 0.9127 | 0.8704 | 0.9739 |
| 0.1126 | 13.0 | 1248 | 0.0742 | 0.8235 | 0.9032 | 0.8615 | 93 | 0.8167 | 0.8855 | 0.8497 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8578 | 0.9177 | 0.8867 | 0.9770 |
| 0.1061 | 14.0 | 1344 | 0.0707 | 0.85 | 0.9140 | 0.8808 | 93 | 0.8439 | 0.8795 | 0.8614 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8762 | 0.9177 | 0.8965 | 0.9789 |
| 0.1003 | 15.0 | 1440 | 0.0703 | 0.86 | 0.9247 | 0.8912 | 93 | 0.8278 | 0.8976 | 0.8613 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8753 | 0.9277 | 0.9007 | 0.9783 |
| 0.1008 | 16.0 | 1536 | 0.0686 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8287 | 0.9036 | 0.8646 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8698 | 0.9327 | 0.9001 | 0.9778 |
| 0.0957 | 17.0 | 1632 | 0.0617 | 0.86 | 0.9247 | 0.8912 | 93 | 0.8613 | 0.8976 | 0.8791 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8921 | 0.9277 | 0.9095 | 0.9802 |
| 0.0923 | 18.0 | 1728 | 0.0594 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8713 | 0.8976 | 0.8843 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9007 | 0.9277 | 0.9140 | 0.9819 |
| 0.0894 | 19.0 | 1824 | 0.0591 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8497 | 0.8855 | 0.8673 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8833 | 0.9252 | 0.9038 | 0.9800 |
| 0.0852 | 20.0 | 1920 | 0.0565 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.8690 | 0.8795 | 0.8743 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8873 | 0.9227 | 0.9046 | 0.9813 |
| 0.0857 | 21.0 | 2016 | 0.0591 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8514 | 0.8976 | 0.8739 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8818 | 0.9302 | 0.9053 | 0.9816 |
| 0.0817 | 22.0 | 2112 | 0.0585 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8506 | 0.8916 | 0.8706 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8774 | 0.9277 | 0.9018 | 0.9808 |
| 0.0792 | 23.0 | 2208 | 0.0544 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8675 | 0.8675 | 0.8675 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8929 | 0.9152 | 0.9039 | 0.9811 |
| 0.0788 | 24.0 | 2304 | 0.0548 | 0.8269 | 0.9247 | 0.8731 | 93 | 0.8675 | 0.8675 | 0.8675 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8886 | 0.9152 | 0.9017 | 0.9811 |
| 0.0772 | 25.0 | 2400 | 0.0541 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.875 | 0.8855 | 0.8802 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.8918 | 0.9252 | 0.9082 | 0.9816 |
| 0.0755 | 26.0 | 2496 | 0.0507 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8772 | 0.9036 | 0.8902 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9835 |
| 0.0717 | 27.0 | 2592 | 0.0506 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9841 |
| 0.0725 | 28.0 | 2688 | 0.0518 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8765 | 0.8976 | 0.8869 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8988 | 0.9302 | 0.9142 | 0.9833 |
| 0.0713 | 29.0 | 2784 | 0.0505 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8817 | 0.8976 | 0.8896 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.8986 | 0.9277 | 0.9129 | 0.9833 |
| 0.0671 | 30.0 | 2880 | 0.0477 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8889 | 0.9157 | 0.9021 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9126 | 0.9377 | 0.9250 | 0.9846 |
| 0.0666 | 31.0 | 2976 | 0.0480 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9027 | 0.9252 | 0.9138 | 0.9838 |
| 0.0638 | 32.0 | 3072 | 0.0482 | 0.8515 | 0.9247 | 0.8866 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9098 | 0.9302 | 0.9199 | 0.9844 |
| 0.0647 | 33.0 | 3168 | 0.0482 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9029 | 0.9277 | 0.9151 | 0.9835 |
| 0.0642 | 34.0 | 3264 | 0.0486 | 0.8431 | 0.9247 | 0.8821 | 93 | 0.8779 | 0.9096 | 0.8935 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8993 | 0.9352 | 0.9169 | 0.9833 |
| 0.0603 | 35.0 | 3360 | 0.0463 | 0.8515 | 0.9247 | 0.8866 | 93 | 0.8929 | 0.9036 | 0.8982 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9100 | 0.9327 | 0.9212 | 0.9852 |
| 0.0627 | 36.0 | 3456 | 0.0483 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9846 |
| 0.0606 | 37.0 | 3552 | 0.0461 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9855 |
| 0.0602 | 38.0 | 3648 | 0.0457 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.8953 | 0.9277 | 0.9112 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9857 |
| 0.058 | 39.0 | 3744 | 0.0452 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.8902 | 0.9277 | 0.9086 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9175 | 0.9426 | 0.9299 | 0.9860 |
| 0.0579 | 40.0 | 3840 | 0.0443 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9863 |
| 0.0551 | 41.0 | 3936 | 0.0439 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.8960 | 0.9337 | 0.9145 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9868 |
| 0.0568 | 42.0 | 4032 | 0.0435 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9866 |
| 0.0557 | 43.0 | 4128 | 0.0440 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9860 |
| 0.0582 | 44.0 | 4224 | 0.0446 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9142 | 0.9302 | 0.9221 | 0.9844 |
| 0.0548 | 45.0 | 4320 | 0.0424 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 |
| 0.0533 | 46.0 | 4416 | 0.0424 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9866 |
| 0.0516 | 47.0 | 4512 | 0.0428 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8864 | 0.9398 | 0.9123 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9091 | 0.9476 | 0.9280 | 0.9860 |
| 0.0501 | 48.0 | 4608 | 0.0430 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9863 |
| 0.053 | 49.0 | 4704 | 0.0433 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9866 |
| 0.0483 | 50.0 | 4800 | 0.0416 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9871 |
| 0.0505 | 51.0 | 4896 | 0.0418 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9866 |
| 0.05 | 52.0 | 4992 | 0.0403 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9879 |
| 0.0493 | 53.0 | 5088 | 0.0422 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9860 |
| 0.0487 | 54.0 | 5184 | 0.0408 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9877 |
| 0.0485 | 55.0 | 5280 | 0.0402 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
| 0.0491 | 56.0 | 5376 | 0.0432 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.8960 | 0.9337 | 0.9145 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9201 | 0.9476 | 0.9337 | 0.9863 |
| 0.0495 | 57.0 | 5472 | 0.0409 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 |
| 0.0495 | 58.0 | 5568 | 0.0425 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 |
| 0.0462 | 59.0 | 5664 | 0.0412 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9871 |
| 0.048 | 60.0 | 5760 | 0.0409 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9868 |
| 0.048 | 61.0 | 5856 | 0.0396 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9879 |
| 0.0461 | 62.0 | 5952 | 0.0403 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9871 |
| 0.0459 | 63.0 | 6048 | 0.0405 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0461 | 64.0 | 6144 | 0.0394 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8882 | 0.9096 | 0.8988 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9874 |
| 0.0431 | 65.0 | 6240 | 0.0408 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9874 |
| 0.0448 | 66.0 | 6336 | 0.0396 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9877 |
| 0.044 | 67.0 | 6432 | 0.0403 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9871 |
| 0.0439 | 68.0 | 6528 | 0.0404 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9874 |
| 0.0451 | 69.0 | 6624 | 0.0416 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9868 |
| 0.0429 | 70.0 | 6720 | 0.0403 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9877 |
| 0.0447 | 71.0 | 6816 | 0.0402 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9167 | 0.9277 | 0.9222 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9877 |
| 0.0437 | 72.0 | 6912 | 0.0398 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9871 |
| 0.041 | 73.0 | 7008 | 0.0399 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9874 |
| 0.0425 | 74.0 | 7104 | 0.0406 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0426 | 75.0 | 7200 | 0.0395 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0398 | 76.0 | 7296 | 0.0402 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9874 |
| 0.0407 | 77.0 | 7392 | 0.0392 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9874 |
| 0.0411 | 78.0 | 7488 | 0.0394 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9868 |
| 0.0417 | 79.0 | 7584 | 0.0395 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0412 | 80.0 | 7680 | 0.0396 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9877 |
| 0.0431 | 81.0 | 7776 | 0.0399 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9150 | 0.9401 | 0.9274 | 0.9871 |
| 0.042 | 82.0 | 7872 | 0.0401 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0412 | 83.0 | 7968 | 0.0403 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0413 | 84.0 | 8064 | 0.0409 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 |
| 0.0405 | 85.0 | 8160 | 0.0397 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 |
| 0.0405 | 86.0 | 8256 | 0.0397 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 |
| 0.0401 | 87.0 | 8352 | 0.0398 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9877 |
| 0.041 | 88.0 | 8448 | 0.0398 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9879 |
| 0.0397 | 89.0 | 8544 | 0.0396 | 0.87 | 0.9355 | 0.9016 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9877 |
| 0.0398 | 90.0 | 8640 | 0.0396 | 0.87 | 0.9355 | 0.9016 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9879 |
| 0.039 | 91.0 | 8736 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9877 |
| 0.0385 | 92.0 | 8832 | 0.0398 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0385 | 93.0 | 8928 | 0.0398 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |
| 0.0398 | 94.0 | 9024 | 0.0397 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0382 | 95.0 | 9120 | 0.0396 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |
| 0.0408 | 96.0 | 9216 | 0.0394 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |
| 0.0372 | 97.0 | 9312 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0392 | 98.0 | 9408 | 0.0395 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9871 |
| 0.0393 | 99.0 | 9504 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |
| 0.0399 | 100.0 | 9600 | 0.0395 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
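
To reproduce the reported results, it may help to match these versions; a quick check of the local environment:

```python
# Print installed versions to compare against the list above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expect 4.39.3
print("PyTorch:", torch.__version__)              # expect 2.3.0+cu121
print("Datasets:", datasets.__version__)          # expect 2.19.1
print("Tokenizers:", tokenizers.__version__)      # expect 0.15.2
```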
