ner-coin

This model is a fine-tuned version of microsoft/Multilingual-MiniLM-L12-H384 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):

  • Loss: 0.0336
  • Precision: 0.9730
  • Recall: 0.9789
  • F1: 0.9759
  • Accuracy: 0.9937
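
The precision, recall, F1, and accuracy above are the usual entity-level metrics for token classification. The evaluation script is not included on this card; the snippet below is a minimal sketch of how such metrics are typically computed with the seqeval metric. The label names used here (B-COIN/I-COIN) are hypothetical, since the actual label set is undocumented.

```python
# Minimal sketch: entity-level precision/recall/F1/accuracy as typically
# computed for NER fine-tunes via the `evaluate` library's seqeval metric.
# The label names below are hypothetical -- the card does not document its
# label set or evaluation script.
import evaluate  # requires the `evaluate` and `seqeval` packages

seqeval = evaluate.load("seqeval")

predictions = [["B-COIN", "O", "O", "B-COIN", "I-COIN"]]
references = [["B-COIN", "O", "O", "B-COIN", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```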

Model description

More information needed

Intended uses & limitations

More information needed
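
Usage is not documented, but a fine-tuned token-classification checkpoint like this one can usually be loaded through the generic transformers pipeline. The sketch below assumes the Hub repo id thanhdath/ner-coin shown on this page; the entity types it returns depend on the undocumented training data, and the input sentence is only illustrative.

```python
# Hedged usage sketch: load the checkpoint with the generic
# token-classification pipeline. The repo id is taken from this page;
# the labels the model emits are not documented on the card.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="thanhdath/ner-coin",
    aggregation_strategy="simple",  # merge word pieces into whole-entity spans
)

print(ner("Bitcoin and Ethereum rallied on Tuesday."))
```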

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged reproduction sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
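
These values map directly onto Hugging Face TrainingArguments; the Adam betas and epsilon listed above are already its defaults. Below is a minimal, hedged sketch of how this fine-tune could be reproduced. The label count, dataset preparation, and single-device batch-size mapping are assumptions, since the card does not document the training data.

```python
# Minimal sketch of mapping the listed hyperparameters onto the Trainer API.
# num_labels and the dataset are placeholders (assumptions); the card does
# not document its label set or training data.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "microsoft/Multilingual-MiniLM-L12-H384"
num_labels = 9  # assumption: replace with the size of the actual NER label set

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=num_labels)

training_args = TrainingArguments(
    output_dir="ner-coin",
    learning_rate=5e-05,
    per_device_train_batch_size=64,  # assumes a single device, matching train_batch_size: 64
    per_device_eval_batch_size=64,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",  # the results table reports validation metrics once per epoch
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
)

# A Trainer would then be built with a tokenized NER dataset and
# DataCollatorForTokenClassification, followed by trainer.train().
```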

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 26 | 0.2470 | 0.7608 | 0.8733 | 0.8132 | 0.9776 |
| No log | 2.0 | 52 | 0.1881 | 0.7803 | 0.8733 | 0.8242 | 0.9817 |
| No log | 3.0 | 78 | 0.1495 | 0.9629 | 0.9789 | 0.9708 | 0.9929 |
| No log | 4.0 | 104 | 0.1277 | 0.9744 | 0.9759 | 0.9751 | 0.9942 |
| No log | 5.0 | 130 | 0.1107 | 0.9745 | 0.9804 | 0.9774 | 0.9949 |
| No log | 6.0 | 156 | 0.1004 | 0.9688 | 0.9819 | 0.9753 | 0.9940 |
| No log | 7.0 | 182 | 0.0895 | 0.9701 | 0.9804 | 0.9752 | 0.9942 |
| No log | 8.0 | 208 | 0.0808 | 0.9745 | 0.9789 | 0.9767 | 0.9949 |
| No log | 9.0 | 234 | 0.0731 | 0.9744 | 0.9774 | 0.9759 | 0.9947 |
| No log | 10.0 | 260 | 0.0720 | 0.9731 | 0.9819 | 0.9775 | 0.9939 |
| No log | 11.0 | 286 | 0.0662 | 0.9731 | 0.9819 | 0.9775 | 0.9940 |
| No log | 12.0 | 312 | 0.0589 | 0.9804 | 0.9804 | 0.9804 | 0.9955 |
| No log | 13.0 | 338 | 0.0573 | 0.9746 | 0.9819 | 0.9782 | 0.9945 |
| No log | 14.0 | 364 | 0.0520 | 0.9789 | 0.9789 | 0.9789 | 0.9945 |
| No log | 15.0 | 390 | 0.0507 | 0.9803 | 0.9774 | 0.9789 | 0.9947 |
| No log | 16.0 | 416 | 0.0475 | 0.9804 | 0.9789 | 0.9796 | 0.9949 |
| No log | 17.0 | 442 | 0.0461 | 0.9731 | 0.9804 | 0.9767 | 0.9944 |
| No log | 18.0 | 468 | 0.0435 | 0.9773 | 0.9744 | 0.9758 | 0.9947 |
| No log | 19.0 | 494 | 0.0400 | 0.9760 | 0.9819 | 0.9789 | 0.9952 |
| 0.1028 | 20.0 | 520 | 0.0390 | 0.9834 | 0.9819 | 0.9826 | 0.9960 |
| 0.1028 | 21.0 | 546 | 0.0386 | 0.9716 | 0.9804 | 0.9760 | 0.9945 |
| 0.1028 | 22.0 | 572 | 0.0373 | 0.9688 | 0.9834 | 0.9760 | 0.9942 |
| 0.1028 | 23.0 | 598 | 0.0355 | 0.9789 | 0.9804 | 0.9797 | 0.9950 |
| 0.1028 | 24.0 | 624 | 0.0381 | 0.9617 | 0.9834 | 0.9724 | 0.9924 |
| 0.1028 | 25.0 | 650 | 0.0328 | 0.9775 | 0.9819 | 0.9797 | 0.9950 |
| 0.1028 | 26.0 | 676 | 0.0329 | 0.9789 | 0.9804 | 0.9797 | 0.9952 |
| 0.1028 | 27.0 | 702 | 0.0357 | 0.9789 | 0.9804 | 0.9797 | 0.9950 |
| 0.1028 | 28.0 | 728 | 0.0357 | 0.9688 | 0.9849 | 0.9768 | 0.9940 |
| 0.1028 | 29.0 | 754 | 0.0382 | 0.9632 | 0.9864 | 0.9747 | 0.9921 |
| 0.1028 | 30.0 | 780 | 0.0303 | 0.9789 | 0.9819 | 0.9804 | 0.9953 |
| 0.1028 | 31.0 | 806 | 0.0289 | 0.9819 | 0.9819 | 0.9819 | 0.9957 |
| 0.1028 | 32.0 | 832 | 0.0296 | 0.9790 | 0.9834 | 0.9812 | 0.9955 |
| 0.1028 | 33.0 | 858 | 0.0290 | 0.9848 | 0.9789 | 0.9818 | 0.9953 |
| 0.1028 | 34.0 | 884 | 0.0301 | 0.9789 | 0.9819 | 0.9804 | 0.9953 |
| 0.1028 | 35.0 | 910 | 0.0294 | 0.9702 | 0.9834 | 0.9768 | 0.9944 |
| 0.1028 | 36.0 | 936 | 0.0347 | 0.9717 | 0.9834 | 0.9775 | 0.9936 |
| 0.1028 | 37.0 | 962 | 0.0303 | 0.9746 | 0.9819 | 0.9782 | 0.9939 |
| 0.1028 | 38.0 | 988 | 0.0344 | 0.9645 | 0.9849 | 0.9746 | 0.9923 |
| 0.0209 | 39.0 | 1014 | 0.0300 | 0.9717 | 0.9834 | 0.9775 | 0.9937 |
| 0.0209 | 40.0 | 1040 | 0.0288 | 0.9789 | 0.9819 | 0.9804 | 0.9950 |
| 0.0209 | 41.0 | 1066 | 0.0289 | 0.9804 | 0.9819 | 0.9812 | 0.9952 |
| 0.0209 | 42.0 | 1092 | 0.0296 | 0.9716 | 0.9804 | 0.9760 | 0.9939 |
| 0.0209 | 43.0 | 1118 | 0.0319 | 0.9659 | 0.9834 | 0.9746 | 0.9928 |
| 0.0209 | 44.0 | 1144 | 0.0269 | 0.9848 | 0.9759 | 0.9803 | 0.9950 |
| 0.0209 | 45.0 | 1170 | 0.0259 | 0.9804 | 0.9804 | 0.9804 | 0.9950 |
| 0.0209 | 46.0 | 1196 | 0.0306 | 0.9716 | 0.9819 | 0.9767 | 0.9939 |
| 0.0209 | 47.0 | 1222 | 0.0354 | 0.9658 | 0.9789 | 0.9723 | 0.9918 |
| 0.0209 | 48.0 | 1248 | 0.0280 | 0.9746 | 0.9819 | 0.9782 | 0.9937 |
| 0.0209 | 49.0 | 1274 | 0.0266 | 0.9833 | 0.9774 | 0.9803 | 0.9955 |
| 0.0209 | 50.0 | 1300 | 0.0287 | 0.9760 | 0.9819 | 0.9789 | 0.9945 |
| 0.0209 | 51.0 | 1326 | 0.0280 | 0.9818 | 0.9759 | 0.9788 | 0.9950 |
| 0.0209 | 52.0 | 1352 | 0.0316 | 0.9787 | 0.9713 | 0.9750 | 0.9937 |
| 0.0209 | 53.0 | 1378 | 0.0302 | 0.9744 | 0.9774 | 0.9759 | 0.9936 |
| 0.0209 | 54.0 | 1404 | 0.0309 | 0.9744 | 0.9759 | 0.9751 | 0.9932 |
| 0.0209 | 55.0 | 1430 | 0.0298 | 0.9818 | 0.9759 | 0.9788 | 0.9947 |
| 0.0209 | 56.0 | 1456 | 0.0291 | 0.9729 | 0.9744 | 0.9736 | 0.9931 |
| 0.0209 | 57.0 | 1482 | 0.0287 | 0.9773 | 0.9759 | 0.9766 | 0.9937 |
| 0.0099 | 58.0 | 1508 | 0.0349 | 0.9687 | 0.9789 | 0.9737 | 0.9921 |
| 0.0099 | 59.0 | 1534 | 0.0295 | 0.9745 | 0.9789 | 0.9767 | 0.9936 |
| 0.0099 | 60.0 | 1560 | 0.0306 | 0.9759 | 0.9789 | 0.9774 | 0.9936 |
| 0.0099 | 61.0 | 1586 | 0.0298 | 0.9775 | 0.9819 | 0.9797 | 0.9944 |
| 0.0099 | 62.0 | 1612 | 0.0296 | 0.9746 | 0.9819 | 0.9782 | 0.9944 |
| 0.0099 | 63.0 | 1638 | 0.0282 | 0.9760 | 0.9804 | 0.9782 | 0.9944 |
| 0.0099 | 64.0 | 1664 | 0.0290 | 0.9804 | 0.9804 | 0.9804 | 0.9949 |
| 0.0099 | 65.0 | 1690 | 0.0290 | 0.9745 | 0.9789 | 0.9767 | 0.9937 |
| 0.0099 | 66.0 | 1716 | 0.0277 | 0.9774 | 0.9789 | 0.9781 | 0.9944 |
| 0.0099 | 67.0 | 1742 | 0.0303 | 0.9745 | 0.9804 | 0.9774 | 0.9942 |
| 0.0099 | 68.0 | 1768 | 0.0283 | 0.9773 | 0.9759 | 0.9766 | 0.9945 |
| 0.0099 | 69.0 | 1794 | 0.0301 | 0.9759 | 0.9774 | 0.9766 | 0.9940 |
| 0.0099 | 70.0 | 1820 | 0.0304 | 0.9745 | 0.9789 | 0.9767 | 0.9940 |
| 0.0099 | 71.0 | 1846 | 0.0290 | 0.9789 | 0.9774 | 0.9781 | 0.9944 |
| 0.0099 | 72.0 | 1872 | 0.0346 | 0.9658 | 0.9789 | 0.9723 | 0.9926 |
| 0.0099 | 73.0 | 1898 | 0.0327 | 0.9687 | 0.9789 | 0.9737 | 0.9932 |
| 0.0099 | 74.0 | 1924 | 0.0315 | 0.9759 | 0.9789 | 0.9774 | 0.9940 |
| 0.0099 | 75.0 | 1950 | 0.0305 | 0.9774 | 0.9774 | 0.9774 | 0.9940 |
| 0.0099 | 76.0 | 1976 | 0.0304 | 0.9759 | 0.9789 | 0.9774 | 0.9942 |
| 0.0059 | 77.0 | 2002 | 0.0306 | 0.9716 | 0.9789 | 0.9752 | 0.9936 |
| 0.0059 | 78.0 | 2028 | 0.0304 | 0.9789 | 0.9789 | 0.9789 | 0.9944 |
| 0.0059 | 79.0 | 2054 | 0.0322 | 0.9687 | 0.9789 | 0.9737 | 0.9932 |
| 0.0059 | 80.0 | 2080 | 0.0323 | 0.9730 | 0.9789 | 0.9759 | 0.9936 |
| 0.0059 | 81.0 | 2106 | 0.0314 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0059 | 82.0 | 2132 | 0.0315 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0059 | 83.0 | 2158 | 0.0310 | 0.9731 | 0.9804 | 0.9767 | 0.9939 |
| 0.0059 | 84.0 | 2184 | 0.0318 | 0.9701 | 0.9804 | 0.9752 | 0.9936 |
| 0.0059 | 85.0 | 2210 | 0.0317 | 0.9745 | 0.9789 | 0.9767 | 0.9939 |
| 0.0059 | 86.0 | 2236 | 0.0316 | 0.9745 | 0.9789 | 0.9767 | 0.9939 |
| 0.0059 | 87.0 | 2262 | 0.0318 | 0.9745 | 0.9789 | 0.9767 | 0.9939 |
| 0.0059 | 88.0 | 2288 | 0.0324 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0059 | 89.0 | 2314 | 0.0320 | 0.9745 | 0.9789 | 0.9767 | 0.9939 |
| 0.0059 | 90.0 | 2340 | 0.0336 | 0.9701 | 0.9789 | 0.9745 | 0.9934 |
| 0.0059 | 91.0 | 2366 | 0.0335 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0059 | 92.0 | 2392 | 0.0332 | 0.9745 | 0.9789 | 0.9767 | 0.9939 |
| 0.0059 | 93.0 | 2418 | 0.0334 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0059 | 94.0 | 2444 | 0.0335 | 0.9716 | 0.9789 | 0.9752 | 0.9936 |
| 0.0059 | 95.0 | 2470 | 0.0342 | 0.9701 | 0.9789 | 0.9745 | 0.9931 |
| 0.0059 | 96.0 | 2496 | 0.0338 | 0.9716 | 0.9789 | 0.9752 | 0.9936 |
| 0.0045 | 97.0 | 2522 | 0.0338 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0045 | 98.0 | 2548 | 0.0338 | 0.9716 | 0.9789 | 0.9752 | 0.9936 |
| 0.0045 | 99.0 | 2574 | 0.0336 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |
| 0.0045 | 100.0 | 2600 | 0.0336 | 0.9730 | 0.9789 | 0.9759 | 0.9937 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.14.5
  • Tokenizers 0.19.1