nerui-lora-r8-4

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set (entity-level scoring is illustrated after the list):

  • Loss: 0.0437
  • Location Precision: 0.8739
  • Location Recall: 0.9417
  • Location F1: 0.9065
  • Location Number: 103
  • Organization Precision: 0.9152
  • Organization Recall: 0.8830
  • Organization F1: 0.8988
  • Organization Number: 171
  • Person Precision: 0.9695
  • Person Recall: 0.9695
  • Person F1: 0.9695
  • Person Number: 131
  • Overall Precision: 0.9214
  • Overall Recall: 0.9259
  • Overall F1: 0.9236
  • Overall Accuracy: 0.9870
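
These are entity-level metrics of the kind produced by seqeval: a predicted entity counts as correct only when both its span and its type match the reference exactly. The toy example below (made-up tag sequences, not data from this model) shows how such per-entity and overall figures are derived.

```python
# Illustration of entity-level NER scoring with seqeval (toy data, not
# this model's evaluation set).
from seqeval.metrics import classification_report, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "O", "O"]]  # ORG entity missed

# PER and LOC are exact span+type matches; the missed ORG lowers recall,
# which is why entity F1 can sit well below token-level accuracy.
print(classification_report(y_true, y_pred, digits=4))
print("overall F1:", f1_score(y_true, y_pred))
```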

Model description

More information needed

Intended uses & limitations

More information needed
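
No official usage snippet is provided. As a minimal sketch, the model name suggests this repository holds a PEFT LoRA adapter (rank 8) for token classification on top of indolem/indobert-base-uncased; the snippet below loads it under that assumption. The label count and the example sentence are illustrative guesses, not values confirmed by this card.

```python
# Minimal inference sketch. Assumptions: this repo is a PEFT LoRA adapter
# for token classification; num_labels=7 guesses a BIO scheme over
# LOC/ORG/PER plus "O".
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

base_id = "indolem/indobert-base-uncased"
adapter_id = "apwic/nerui-lora-r8-4"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForTokenClassification.from_pretrained(base_id, num_labels=7)
model = PeftModel.from_pretrained(base, adapter_id).eval()

text = "Joko Widodo mengunjungi kantor Pertamina di Jakarta."  # illustrative
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    pred_ids = model(**inputs).logits.argmax(dim=-1)[0].tolist()

# Map predicted ids back to labels via the adapter's config if it ships one.
for token, pid in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), pred_ids):
    print(token, pid)
```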

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments equivalent is sketched after the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
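
These settings map directly onto transformers' TrainingArguments. The sketch below reconstructs them; only the values listed above come from this card, while everything else (the output directory, the rank-8 LoRA configuration implied by the model name, its target modules and alpha) is an assumption.

```python
# Hedged reconstruction of the reported training setup. Values not listed
# on the card are marked as assumptions.
from transformers import TrainingArguments
from peft import LoraConfig

training_args = TrainingArguments(
    output_dir="nerui-lora-r8-4",      # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)

# The "r8" in the model name suggests a rank-8 LoRA adapter; alpha and
# target_modules below are guesses, not reported values.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,                      # assumed
    target_modules=["query", "value"],  # assumed
    task_type="TOKEN_CLS",
)
```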

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 1.1566 | 1.0 | 96 | 0.6952 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8373 |
| 0.6676 | 2.0 | 192 | 0.5653 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8376 |
| 0.5559 | 3.0 | 288 | 0.4487 | 0.0 | 0.0 | 0.0 | 103 | 0.375 | 0.0351 | 0.0642 | 171 | 0.2188 | 0.0534 | 0.0859 | 131 | 0.26 | 0.0321 | 0.0571 | 0.8456 |
| 0.4455 | 4.0 | 384 | 0.3389 | 0.2083 | 0.0485 | 0.0787 | 103 | 0.3509 | 0.2339 | 0.2807 | 171 | 0.3816 | 0.4427 | 0.4099 | 131 | 0.3552 | 0.2543 | 0.2964 | 0.8818 |
| 0.3416 | 5.0 | 480 | 0.2583 | 0.3971 | 0.2621 | 0.3158 | 103 | 0.4923 | 0.5614 | 0.5246 | 171 | 0.5176 | 0.6718 | 0.5847 | 131 | 0.4873 | 0.5210 | 0.5036 | 0.9207 |
| 0.2637 | 6.0 | 576 | 0.2006 | 0.6316 | 0.5825 | 0.6061 | 103 | 0.6490 | 0.7895 | 0.7124 | 171 | 0.7124 | 0.8321 | 0.7676 | 131 | 0.6667 | 0.7506 | 0.7062 | 0.9489 |
| 0.2115 | 7.0 | 672 | 0.1649 | 0.7273 | 0.6990 | 0.7129 | 103 | 0.6946 | 0.8246 | 0.7540 | 171 | 0.8542 | 0.9389 | 0.8945 | 131 | 0.7534 | 0.8296 | 0.7897 | 0.9586 |
| 0.1785 | 8.0 | 768 | 0.1343 | 0.8316 | 0.7670 | 0.7980 | 103 | 0.7461 | 0.8421 | 0.7912 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8154 | 0.8617 | 0.8379 | 0.9652 |
| 0.1541 | 9.0 | 864 | 0.1175 | 0.8384 | 0.8058 | 0.8218 | 103 | 0.7737 | 0.8596 | 0.8144 | 171 | 0.8936 | 0.9618 | 0.9265 | 131 | 0.8279 | 0.8790 | 0.8527 | 0.9682 |
| 0.1387 | 10.0 | 960 | 0.1095 | 0.8235 | 0.8155 | 0.8195 | 103 | 0.7853 | 0.8772 | 0.8287 | 171 | 0.8944 | 0.9695 | 0.9304 | 131 | 0.8299 | 0.8914 | 0.8595 | 0.9696 |
| 0.1275 | 11.0 | 1056 | 0.0995 | 0.85 | 0.8252 | 0.8374 | 103 | 0.7937 | 0.8772 | 0.8333 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8415 | 0.8914 | 0.8657 | 0.9710 |
| 0.1212 | 12.0 | 1152 | 0.0935 | 0.8641 | 0.8641 | 0.8641 | 103 | 0.7917 | 0.8889 | 0.8375 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8437 | 0.9062 | 0.8738 | 0.9724 |
| 0.1164 | 13.0 | 1248 | 0.0875 | 0.8627 | 0.8544 | 0.8585 | 103 | 0.8010 | 0.8947 | 0.8453 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.8476 | 0.9062 | 0.8759 | 0.9724 |
| 0.1105 | 14.0 | 1344 | 0.0820 | 0.8922 | 0.8835 | 0.8878 | 103 | 0.8466 | 0.8713 | 0.8588 | 171 | 0.9265 | 0.9618 | 0.9438 | 131 | 0.8841 | 0.9037 | 0.8938 | 0.9768 |
| 0.1063 | 15.0 | 1440 | 0.0793 | 0.9175 | 0.8641 | 0.89 | 103 | 0.7908 | 0.9064 | 0.8447 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8565 | 0.9136 | 0.8841 | 0.9751 |
| 0.1018 | 16.0 | 1536 | 0.0783 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8010 | 0.9181 | 0.8556 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8523 | 0.9259 | 0.8876 | 0.9749 |
| 0.0986 | 17.0 | 1632 | 0.0725 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8280 | 0.9006 | 0.8627 | 171 | 0.9407 | 0.9695 | 0.9549 | 131 | 0.8839 | 0.9210 | 0.9021 | 0.9779 |
| 0.093 | 18.0 | 1728 | 0.0693 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8432 | 0.9123 | 0.8764 | 171 | 0.9333 | 0.9618 | 0.9474 | 131 | 0.8860 | 0.9210 | 0.9031 | 0.9779 |
| 0.0897 | 19.0 | 1824 | 0.0699 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8470 | 0.9064 | 0.8757 | 171 | 0.9270 | 0.9695 | 0.9478 | 131 | 0.88 | 0.9235 | 0.9012 | 0.9782 |
| 0.0876 | 20.0 | 1920 | 0.0679 | 0.8846 | 0.8932 | 0.8889 | 103 | 0.8201 | 0.9064 | 0.8611 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8634 | 0.9210 | 0.8913 | 0.9765 |
| 0.0846 | 21.0 | 2016 | 0.0654 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8378 | 0.9064 | 0.8708 | 171 | 0.9197 | 0.9618 | 0.9403 | 131 | 0.8715 | 0.9210 | 0.8956 | 0.9785 |
| 0.0843 | 22.0 | 2112 | 0.0664 | 0.8932 | 0.8932 | 0.8932 | 103 | 0.8325 | 0.9298 | 0.8785 | 171 | 0.9197 | 0.9618 | 0.9403 | 131 | 0.8747 | 0.9309 | 0.9019 | 0.9787 |
| 0.0823 | 23.0 | 2208 | 0.0611 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8492 | 0.8889 | 0.8686 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.8897 | 0.9160 | 0.9027 | 0.9801 |
| 0.0808 | 24.0 | 2304 | 0.0627 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8415 | 0.9006 | 0.8701 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.8794 | 0.9185 | 0.8986 | 0.9798 |
| 0.0809 | 25.0 | 2400 | 0.0598 | 0.875 | 0.8835 | 0.8792 | 103 | 0.8424 | 0.9064 | 0.8732 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8836 | 0.9185 | 0.9007 | 0.9807 |
| 0.078 | 26.0 | 2496 | 0.0581 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8441 | 0.9181 | 0.8796 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8884 | 0.9235 | 0.9056 | 0.9818 |
| 0.0774 | 27.0 | 2592 | 0.0582 | 0.92 | 0.8932 | 0.9064 | 103 | 0.8503 | 0.9298 | 0.8883 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8955 | 0.9309 | 0.9128 | 0.9812 |
| 0.0732 | 28.0 | 2688 | 0.0623 | 0.9020 | 0.8932 | 0.8976 | 103 | 0.8659 | 0.9064 | 0.8857 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9815 |
| 0.0746 | 29.0 | 2784 | 0.0553 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8827 | 0.9240 | 0.9029 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9150 | 0.9309 | 0.9229 | 0.9829 |
| 0.0695 | 30.0 | 2880 | 0.0536 | 0.9109 | 0.8932 | 0.9020 | 103 | 0.8827 | 0.9240 | 0.9029 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9104 | 0.9284 | 0.9193 | 0.9832 |
| 0.0691 | 31.0 | 2976 | 0.0533 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8807 | 0.9064 | 0.8934 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9826 |
| 0.0665 | 32.0 | 3072 | 0.0518 | 0.8835 | 0.8835 | 0.8835 | 103 | 0.8652 | 0.9006 | 0.8825 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8961 | 0.9160 | 0.9060 | 0.9823 |
| 0.0649 | 33.0 | 3168 | 0.0527 | 0.8288 | 0.8932 | 0.8598 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8968 | 0.9012 | 0.8990 | 0.9809 |
| 0.0645 | 34.0 | 3264 | 0.0506 | 0.8835 | 0.8835 | 0.8835 | 103 | 0.8966 | 0.9123 | 0.9043 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9098 | 0.9210 | 0.9153 | 0.9843 |
| 0.063 | 35.0 | 3360 | 0.0515 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8889 | 0.8889 | 0.8889 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8978 | 0.9111 | 0.9044 | 0.9826 |
| 0.0637 | 36.0 | 3456 | 0.0508 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8830 | 0.8830 | 0.8830 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8954 | 0.9086 | 0.9020 | 0.9818 |
| 0.0614 | 37.0 | 3552 | 0.0495 | 0.9010 | 0.8835 | 0.8922 | 103 | 0.8729 | 0.9240 | 0.8977 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9829 |
| 0.0599 | 38.0 | 3648 | 0.0495 | 0.8585 | 0.8835 | 0.8708 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.9039 | 0.9062 | 0.9051 | 0.9820 |
| 0.06 | 39.0 | 3744 | 0.0495 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8728 | 0.8830 | 0.8779 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.8981 | 0.9136 | 0.9058 | 0.9820 |
| 0.0576 | 40.0 | 3840 | 0.0480 | 0.8667 | 0.8835 | 0.8750 | 103 | 0.8994 | 0.8889 | 0.8941 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9113 | 0.9136 | 0.9125 | 0.9837 |
| 0.0597 | 41.0 | 3936 | 0.0485 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.875 | 0.9006 | 0.8876 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9010 | 0.9210 | 0.9109 | 0.9829 |
| 0.0581 | 42.0 | 4032 | 0.0473 | 0.8598 | 0.8932 | 0.8762 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.8983 | 0.9160 | 0.9071 | 0.9829 |
| 0.0597 | 43.0 | 4128 | 0.0479 | 0.8679 | 0.8932 | 0.8804 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9027 | 0.9160 | 0.9093 | 0.9826 |
| 0.0568 | 44.0 | 4224 | 0.0481 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9066 | 0.9111 | 0.9089 | 0.9826 |
| 0.0561 | 45.0 | 4320 | 0.0470 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.8876 | 0.8772 | 0.8824 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9044 | 0.9111 | 0.9077 | 0.9834 |
| 0.0552 | 46.0 | 4416 | 0.0478 | 0.8519 | 0.8932 | 0.8720 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9111 | 0.9111 | 0.9111 | 0.9837 |
| 0.0562 | 47.0 | 4512 | 0.0461 | 0.8762 | 0.8932 | 0.8846 | 103 | 0.8644 | 0.8947 | 0.8793 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9007 | 0.9185 | 0.9095 | 0.9840 |
| 0.0533 | 48.0 | 4608 | 0.0474 | 0.8545 | 0.9126 | 0.8826 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9136 | 0.9136 | 0.9136 | 0.9837 |
| 0.0522 | 49.0 | 4704 | 0.0461 | 0.8468 | 0.9126 | 0.8785 | 103 | 0.8772 | 0.8772 | 0.8772 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.8983 | 0.9160 | 0.9071 | 0.9843 |
| 0.052 | 50.0 | 4800 | 0.0464 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8793 | 0.8947 | 0.8870 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9014 | 0.9259 | 0.9135 | 0.9840 |
| 0.054 | 51.0 | 4896 | 0.0467 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.9030 | 0.8713 | 0.8869 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9854 |
| 0.0525 | 52.0 | 4992 | 0.0460 | 0.8829 | 0.9515 | 0.9159 | 103 | 0.8671 | 0.8772 | 0.8721 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9840 |
| 0.0501 | 53.0 | 5088 | 0.0466 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9120 | 0.9210 | 0.9165 | 0.9848 |
| 0.0498 | 54.0 | 5184 | 0.0449 | 0.8716 | 0.9223 | 0.8962 | 103 | 0.8779 | 0.8830 | 0.8805 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9053 | 0.9210 | 0.9131 | 0.9840 |
| 0.0504 | 55.0 | 5280 | 0.0456 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9144 | 0.9235 | 0.9189 | 0.9854 |
| 0.0486 | 56.0 | 5376 | 0.0453 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8736 | 0.8889 | 0.8812 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9036 | 0.9259 | 0.9146 | 0.9845 |
| 0.0497 | 57.0 | 5472 | 0.0457 | 0.8509 | 0.9417 | 0.8940 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9078 | 0.9235 | 0.9155 | 0.9845 |
| 0.0487 | 58.0 | 5568 | 0.0460 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9167 | 0.9235 | 0.9200 | 0.9854 |
| 0.0473 | 59.0 | 5664 | 0.0456 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9071 | 0.9160 | 0.9115 | 0.9851 |
| 0.0463 | 60.0 | 5760 | 0.0454 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9091 | 0.9136 | 0.9113 | 0.9854 |
| 0.0486 | 61.0 | 5856 | 0.0456 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9856 |
| 0.0484 | 62.0 | 5952 | 0.0465 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.8970 | 0.8655 | 0.8810 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9093 | 0.9160 | 0.9127 | 0.9851 |
| 0.0461 | 63.0 | 6048 | 0.0451 | 0.875 | 0.9515 | 0.9116 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9148 | 0.9284 | 0.9216 | 0.9856 |
| 0.0455 | 64.0 | 6144 | 0.0451 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8976 | 0.8713 | 0.8843 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9854 |
| 0.0472 | 65.0 | 6240 | 0.0453 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9030 | 0.8713 | 0.8869 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9165 | 0.9210 | 0.9187 | 0.9859 |
| 0.0453 | 66.0 | 6336 | 0.0451 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9865 |
| 0.045 | 67.0 | 6432 | 0.0450 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9187 | 0.9210 | 0.9199 | 0.9856 |
| 0.0466 | 68.0 | 6528 | 0.0440 | 0.8909 | 0.9515 | 0.9202 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9867 |
| 0.046 | 69.0 | 6624 | 0.0446 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9163 | 0.9185 | 0.9174 | 0.9854 |
| 0.0436 | 70.0 | 6720 | 0.0440 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9165 | 0.9210 | 0.9187 | 0.9862 |
| 0.0445 | 71.0 | 6816 | 0.0446 | 0.8673 | 0.9515 | 0.9074 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9126 | 0.9284 | 0.9204 | 0.9862 |
| 0.0437 | 72.0 | 6912 | 0.0459 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9024 | 0.8655 | 0.8836 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9160 | 0.9160 | 0.9160 | 0.9854 |
| 0.0434 | 73.0 | 7008 | 0.0444 | 0.875 | 0.9515 | 0.9116 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9862 |
| 0.0441 | 74.0 | 7104 | 0.0445 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9862 |
| 0.0439 | 75.0 | 7200 | 0.0446 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9862 |
| 0.042 | 76.0 | 7296 | 0.0447 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.8982 | 0.8772 | 0.8876 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9122 | 0.9235 | 0.9178 | 0.9859 |
| 0.0428 | 77.0 | 7392 | 0.0449 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9163 | 0.9185 | 0.9174 | 0.9865 |
| 0.0435 | 78.0 | 7488 | 0.0444 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9091 | 0.8772 | 0.8929 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9867 |
| 0.0416 | 79.0 | 7584 | 0.0439 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9102 | 0.8889 | 0.8994 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9171 | 0.9284 | 0.9227 | 0.9862 |
| 0.0414 | 80.0 | 7680 | 0.0436 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9096 | 0.8830 | 0.8961 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9867 |
| 0.043 | 81.0 | 7776 | 0.0437 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9870 |
| 0.0433 | 82.0 | 7872 | 0.0434 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9238 | 0.9284 | 0.9261 | 0.9873 |
| 0.0428 | 83.0 | 7968 | 0.0439 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9867 |
| 0.0418 | 84.0 | 8064 | 0.0435 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9157 | 0.8889 | 0.9021 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9867 |
| 0.0416 | 85.0 | 8160 | 0.0435 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9870 |
| 0.0413 | 86.0 | 8256 | 0.0439 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9238 | 0.9284 | 0.9261 | 0.9873 |
| 0.0423 | 87.0 | 8352 | 0.0440 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9185 | 0.9185 | 0.9185 | 0.9865 |
| 0.0409 | 88.0 | 8448 | 0.0439 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9235 | 0.9235 | 0.9235 | 0.9870 |
| 0.0419 | 89.0 | 8544 | 0.0437 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9212 | 0.8889 | 0.9048 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9216 | 0.9284 | 0.9250 | 0.9870 |
| 0.0424 | 90.0 | 8640 | 0.0438 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9191 | 0.9259 | 0.9225 | 0.9867 |
| 0.0419 | 91.0 | 8736 | 0.0439 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 |
| 0.0427 | 92.0 | 8832 | 0.0443 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 |
| 0.0397 | 93.0 | 8928 | 0.0438 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9867 |
| 0.0414 | 94.0 | 9024 | 0.0437 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 |
| 0.0401 | 95.0 | 9120 | 0.0438 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9236 | 0.9259 | 0.9248 | 0.9873 |
| 0.0415 | 96.0 | 9216 | 0.0439 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.9146 | 0.8772 | 0.8955 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9210 | 0.9210 | 0.9210 | 0.9867 |
| 0.0404 | 97.0 | 9312 | 0.0437 | 0.8818 | 0.9417 | 0.9108 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9236 | 0.9259 | 0.9248 | 0.9873 |
| 0.0418 | 98.0 | 9408 | 0.0438 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 |
| 0.0388 | 99.0 | 9504 | 0.0437 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 |
| 0.0397 | 100.0 | 9600 | 0.0437 | 0.8739 | 0.9417 | 0.9065 | 103 | 0.9152 | 0.8830 | 0.8988 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9214 | 0.9259 | 0.9236 | 0.9870 |

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2