---
license: mit
base_model: microsoft/Multilingual-MiniLM-L12-H384
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: ner-coin-v2
    results: []
---

# ner-coin-v2

This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.0182
- Precision: 0.9837
- Recall: 0.9947
- F1: 0.9892
- Accuracy: 0.9971
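
Since the card does not include a usage snippet, here is a minimal inference sketch using the `transformers` pipeline API. The Hub id `thanhdath/ner-coin-v2` is inferred from this card's name, and the example sentence is an assumption; the model's actual tag set is not documented here.

```python
# Minimal inference sketch. The repo id below is an assumption based on
# this card's name; the entity labels returned depend on the
# (undocumented) training tag set.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="thanhdath/ner-coin-v2",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(ner("Bitcoin and ETH rallied after the exchange listing."))
```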

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
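
As a reference, here is a minimal sketch mapping the values above onto `transformers.TrainingArguments`. The output directory is a placeholder, and the model/data wiring is omitted because the training set is not documented in this card.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters reported above. output_dir is
# a placeholder; the Adam settings match both the card and the library
# defaults.
args = TrainingArguments(
    output_dir="ner-coin-v2",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,     # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,  # epsilon=1e-08
)
```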

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
| No log | 1.0 | 51 | 0.2140 | 0.7507 | 0.8582 | 0.8008 | 0.9790 |
| No log | 2.0 | 102 | 0.1401 | 0.9631 | 0.9790 | 0.9710 | 0.9936 |
| No log | 3.0 | 153 | 0.1071 | 0.9710 | 0.9790 | 0.9750 | 0.9945 |
| No log | 4.0 | 204 | 0.0852 | 0.9676 | 0.9857 | 0.9766 | 0.9948 |
| No log | 5.0 | 255 | 0.0712 | 0.9741 | 0.9880 | 0.9810 | 0.9953 |
| No log | 6.0 | 306 | 0.0586 | 0.9842 | 0.9820 | 0.9831 | 0.9962 |
| No log | 7.0 | 357 | 0.0514 | 0.9799 | 0.9865 | 0.9832 | 0.9960 |
| No log | 8.0 | 408 | 0.0472 | 0.9778 | 0.9895 | 0.9836 | 0.9957 |
| No log | 9.0 | 459 | 0.0428 | 0.9749 | 0.9895 | 0.9821 | 0.9956 |
| 0.1133 | 10.0 | 510 | 0.0393 | 0.9864 | 0.9782 | 0.9823 | 0.9956 |
| 0.1133 | 11.0 | 561 | 0.0342 | 0.9828 | 0.9857 | 0.9843 | 0.9959 |
| 0.1133 | 12.0 | 612 | 0.0307 | 0.9806 | 0.9880 | 0.9843 | 0.9964 |
| 0.1133 | 13.0 | 663 | 0.0302 | 0.9748 | 0.9880 | 0.9814 | 0.9956 |
| 0.1133 | 14.0 | 714 | 0.0295 | 0.9677 | 0.9895 | 0.9785 | 0.9948 |
| 0.1133 | 15.0 | 765 | 0.0256 | 0.9828 | 0.9872 | 0.9850 | 0.9964 |
| 0.1133 | 16.0 | 816 | 0.0295 | 0.9601 | 0.9932 | 0.9764 | 0.9939 |
| 0.1133 | 17.0 | 867 | 0.0252 | 0.9784 | 0.9865 | 0.9824 | 0.9958 |
| 0.1133 | 18.0 | 918 | 0.0289 | 0.9819 | 0.9775 | 0.9797 | 0.9948 |
| 0.1133 | 19.0 | 969 | 0.0248 | 0.9798 | 0.9842 | 0.9820 | 0.9954 |
| 0.0217 | 20.0 | 1020 | 0.0254 | 0.9741 | 0.9880 | 0.9810 | 0.9950 |
| 0.0217 | 21.0 | 1071 | 0.0219 | 0.9749 | 0.9902 | 0.9825 | 0.9956 |
| 0.0217 | 22.0 | 1122 | 0.0240 | 0.9770 | 0.9887 | 0.9828 | 0.9955 |
| 0.0217 | 23.0 | 1173 | 0.0226 | 0.9807 | 0.9887 | 0.9847 | 0.9958 |
| 0.0217 | 24.0 | 1224 | 0.0209 | 0.9756 | 0.9910 | 0.9833 | 0.9957 |
| 0.0217 | 25.0 | 1275 | 0.0203 | 0.9822 | 0.9917 | 0.9869 | 0.9963 |
| 0.0217 | 26.0 | 1326 | 0.0231 | 0.9727 | 0.9902 | 0.9814 | 0.9950 |
| 0.0217 | 27.0 | 1377 | 0.0204 | 0.9778 | 0.9895 | 0.9836 | 0.9958 |
| 0.0217 | 28.0 | 1428 | 0.0196 | 0.9771 | 0.9917 | 0.9844 | 0.9962 |
| 0.0217 | 29.0 | 1479 | 0.0206 | 0.9757 | 0.9932 | 0.9844 | 0.9957 |
| 0.0097 | 30.0 | 1530 | 0.0217 | 0.9757 | 0.9955 | 0.9855 | 0.9959 |
| 0.0097 | 31.0 | 1581 | 0.0192 | 0.9843 | 0.9872 | 0.9858 | 0.9962 |
| 0.0097 | 32.0 | 1632 | 0.0189 | 0.9844 | 0.9910 | 0.9877 | 0.9964 |
| 0.0097 | 33.0 | 1683 | 0.0174 | 0.9844 | 0.9925 | 0.9884 | 0.9966 |
| 0.0097 | 34.0 | 1734 | 0.0183 | 0.9836 | 0.9910 | 0.9873 | 0.9966 |
| 0.0097 | 35.0 | 1785 | 0.0189 | 0.9785 | 0.9917 | 0.9851 | 0.9964 |
| 0.0097 | 36.0 | 1836 | 0.0202 | 0.9757 | 0.9940 | 0.9848 | 0.9960 |
| 0.0097 | 37.0 | 1887 | 0.0203 | 0.9770 | 0.9880 | 0.9825 | 0.9957 |
| 0.0097 | 38.0 | 1938 | 0.0189 | 0.9778 | 0.9932 | 0.9855 | 0.9962 |
| 0.0097 | 39.0 | 1989 | 0.0169 | 0.9836 | 0.9895 | 0.9865 | 0.9966 |
| 0.0055 | 40.0 | 2040 | 0.0183 | 0.9778 | 0.9917 | 0.9847 | 0.9961 |
| 0.0055 | 41.0 | 2091 | 0.0159 | 0.9866 | 0.9910 | 0.9888 | 0.9968 |
| 0.0055 | 42.0 | 2142 | 0.0175 | 0.9778 | 0.9917 | 0.9847 | 0.9962 |
| 0.0055 | 43.0 | 2193 | 0.0153 | 0.9829 | 0.9940 | 0.9884 | 0.9969 |
| 0.0055 | 44.0 | 2244 | 0.0170 | 0.9778 | 0.9925 | 0.9851 | 0.9963 |
| 0.0055 | 45.0 | 2295 | 0.0184 | 0.9750 | 0.9940 | 0.9844 | 0.9962 |
| 0.0055 | 46.0 | 2346 | 0.0172 | 0.9786 | 0.9940 | 0.9862 | 0.9964 |
| 0.0055 | 47.0 | 2397 | 0.0174 | 0.9779 | 0.9947 | 0.9862 | 0.9965 |
| 0.0055 | 48.0 | 2448 | 0.0169 | 0.9778 | 0.9910 | 0.9844 | 0.9962 |
| 0.0055 | 49.0 | 2499 | 0.0193 | 0.9701 | 0.9962 | 0.9830 | 0.9958 |
| 0.0035 | 50.0 | 2550 | 0.0163 | 0.9792 | 0.9910 | 0.9851 | 0.9963 |
| 0.0035 | 51.0 | 2601 | 0.0173 | 0.9771 | 0.9925 | 0.9847 | 0.9960 |
| 0.0035 | 52.0 | 2652 | 0.0164 | 0.9829 | 0.9932 | 0.9881 | 0.9966 |
| 0.0035 | 53.0 | 2703 | 0.0177 | 0.9757 | 0.9955 | 0.9855 | 0.9961 |
| 0.0035 | 54.0 | 2754 | 0.0164 | 0.9815 | 0.9932 | 0.9873 | 0.9965 |
| 0.0035 | 55.0 | 2805 | 0.0171 | 0.9793 | 0.9947 | 0.9870 | 0.9966 |
| 0.0035 | 56.0 | 2856 | 0.0175 | 0.98 | 0.9925 | 0.9862 | 0.9966 |
| 0.0035 | 57.0 | 2907 | 0.0167 | 0.9801 | 0.9955 | 0.9877 | 0.9966 |
| 0.0035 | 58.0 | 2958 | 0.0168 | 0.9880 | 0.9887 | 0.9884 | 0.9966 |
| 0.0025 | 59.0 | 3009 | 0.0174 | 0.9858 | 0.9917 | 0.9888 | 0.9969 |
| 0.0025 | 60.0 | 3060 | 0.0153 | 0.9837 | 0.9940 | 0.9888 | 0.9970 |
| 0.0025 | 61.0 | 3111 | 0.0165 | 0.9829 | 0.9932 | 0.9881 | 0.9968 |
| 0.0025 | 62.0 | 3162 | 0.0150 | 0.9881 | 0.9925 | 0.9903 | 0.9971 |
| 0.0025 | 63.0 | 3213 | 0.0156 | 0.9851 | 0.9947 | 0.9899 | 0.9972 |
| 0.0025 | 64.0 | 3264 | 0.0147 | 0.9873 | 0.9940 | 0.9907 | 0.9974 |
| 0.0025 | 65.0 | 3315 | 0.0169 | 0.9815 | 0.9947 | 0.9881 | 0.9967 |
| 0.0025 | 66.0 | 3366 | 0.0186 | 0.9786 | 0.9962 | 0.9874 | 0.9964 |
| 0.0025 | 67.0 | 3417 | 0.0171 | 0.9815 | 0.9940 | 0.9877 | 0.9967 |
| 0.0025 | 68.0 | 3468 | 0.0164 | 0.9822 | 0.9932 | 0.9877 | 0.9966 |
| 0.0021 | 69.0 | 3519 | 0.0161 | 0.9829 | 0.9932 | 0.9881 | 0.9968 |
| 0.0021 | 70.0 | 3570 | 0.0156 | 0.9858 | 0.9925 | 0.9892 | 0.9970 |
| 0.0021 | 71.0 | 3621 | 0.0163 | 0.9815 | 0.9947 | 0.9881 | 0.9967 |
| 0.0021 | 72.0 | 3672 | 0.0166 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0021 | 73.0 | 3723 | 0.0161 | 0.9866 | 0.9925 | 0.9895 | 0.9970 |
| 0.0021 | 74.0 | 3774 | 0.0165 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0021 | 75.0 | 3825 | 0.0165 | 0.9859 | 0.9947 | 0.9903 | 0.9972 |
| 0.0021 | 76.0 | 3876 | 0.0170 | 0.9830 | 0.9947 | 0.9888 | 0.9969 |
| 0.0021 | 77.0 | 3927 | 0.0171 | 0.9844 | 0.9947 | 0.9896 | 0.9971 |
| 0.0021 | 78.0 | 3978 | 0.0179 | 0.9815 | 0.9947 | 0.9881 | 0.9967 |
| 0.0016 | 79.0 | 4029 | 0.0170 | 0.9851 | 0.9947 | 0.9899 | 0.9971 |
| 0.0016 | 80.0 | 4080 | 0.0170 | 0.9851 | 0.9947 | 0.9899 | 0.9971 |
| 0.0016 | 81.0 | 4131 | 0.0186 | 0.9779 | 0.9955 | 0.9866 | 0.9963 |
| 0.0016 | 82.0 | 4182 | 0.0179 | 0.9822 | 0.9947 | 0.9884 | 0.9968 |
| 0.0016 | 83.0 | 4233 | 0.0177 | 0.9822 | 0.9947 | 0.9884 | 0.9968 |
| 0.0016 | 84.0 | 4284 | 0.0177 | 0.9822 | 0.9947 | 0.9884 | 0.9968 |
| 0.0016 | 85.0 | 4335 | 0.0176 | 0.9830 | 0.9947 | 0.9888 | 0.9969 |
| 0.0016 | 86.0 | 4386 | 0.0182 | 0.9822 | 0.9947 | 0.9884 | 0.9968 |
| 0.0016 | 87.0 | 4437 | 0.0173 | 0.9851 | 0.9947 | 0.9899 | 0.9971 |
| 0.0016 | 88.0 | 4488 | 0.0179 | 0.9808 | 0.9947 | 0.9877 | 0.9966 |
| 0.0015 | 89.0 | 4539 | 0.0176 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0015 | 90.0 | 4590 | 0.0181 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0015 | 91.0 | 4641 | 0.0183 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0015 | 92.0 | 4692 | 0.0183 | 0.9844 | 0.9947 | 0.9896 | 0.9971 |
| 0.0015 | 93.0 | 4743 | 0.0188 | 0.9837 | 0.9947 | 0.9892 | 0.9969 |
| 0.0015 | 94.0 | 4794 | 0.0189 | 0.9837 | 0.9947 | 0.9892 | 0.9969 |
| 0.0015 | 95.0 | 4845 | 0.0186 | 0.9837 | 0.9947 | 0.9892 | 0.9969 |
| 0.0015 | 96.0 | 4896 | 0.0180 | 0.9837 | 0.9947 | 0.9892 | 0.9971 |
| 0.0015 | 97.0 | 4947 | 0.0181 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0015 | 98.0 | 4998 | 0.0182 | 0.9837 | 0.9947 | 0.9892 | 0.9970 |
| 0.0013 | 99.0 | 5049 | 0.0182 | 0.9837 | 0.9947 | 0.9892 | 0.9971 |
| 0.0013 | 100.0 | 5100 | 0.0182 | 0.9837 | 0.9947 | 0.9892 | 0.9971 |
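
The precision, recall, F1, and accuracy columns above are the standard entity-level metrics for token classification. As an illustration only (the card does not show its evaluation code, and the `COIN` tag below is an assumption inferred from the model name), such numbers are commonly computed with the `evaluate` wrapper around `seqeval`:

```python
import evaluate

# Hypothetical gold and predicted tag sequences in IOB2 format; "COIN"
# is an assumed entity type, not documented in this card.
references = [["O", "B-COIN", "I-COIN", "O"]]
predictions = [["O", "B-COIN", "O", "O"]]

seqeval = evaluate.load("seqeval")
results = seqeval.compute(predictions=predictions, references=references)
# results includes overall_precision, overall_recall, overall_f1, and
# overall_accuracy, matching the columns in the table above.
print(results)
```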

### Framework versions

- Transformers 4.40.2
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1