chinese-lert-small-ner

This model is a fine-tuned version of hfl/chinese-lert-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0515
  • Overall Precision: 0.9025
  • Overall Recall: 0.9137
  • Overall F1: 0.9081
  • Overall Accuracy: 0.9888
  • Ucm: 0.8877
  • Lcm: 0.8832
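Since the card gives no usage section, here is a minimal inference sketch. It assumes the model uses a standard token-classification head and is published under the repo id shown on this page (jetaudio/chinese-lert-small-ner); the example sentence is a placeholder, and the label set depends on the (unknown) training dataset.

```python
from transformers import pipeline

# Hypothetical usage sketch: the repo id is taken from this page; the
# entity labels returned depend on how the training data was annotated.
ner = pipeline(
    "token-classification",
    model="jetaudio/chinese-lert-small-ner",
    aggregation_strategy="simple",  # merge wordpiece tokens into entity spans
)

# Placeholder sentence: "Li Hua works at Huawei in Beijing."
for ent in ner("李华在北京的华为工作。"):
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```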

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 512
  • eval_batch_size: 512
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 20
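The hyperparameters above can be sketched as a `TrainingArguments` configuration. This is a reconstruction, not the exact training script: `output_dir` is a placeholder, and the batch sizes are assumed to be per device on a single device.

```python
from transformers import TrainingArguments

# Sketch reconstructing the hyperparameters listed above.
args = TrainingArguments(
    output_dir="chinese-lert-small-ner",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=512,      # assumed single-device
    per_device_eval_batch_size=512,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=10,
    num_train_epochs=20,
)
```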

Training results

Training Loss Epoch Step Validation Loss Overall Precision Overall Recall Overall F1 Overall Accuracy Ucm Lcm
0.8943 0.08 10 0.7196 0.0 0.0 0.0 0.8993 0.4668 0.4668
0.5957 0.16 20 0.5575 0.0 0.0 0.0 0.8993 0.4668 0.4668
0.6037 0.24 30 0.5471 0.0 0.0 0.0 0.8993 0.4668 0.4668
0.5671 0.32 40 0.5040 0.0 0.0 0.0 0.8993 0.4668 0.4668
0.4556 0.4 50 0.4299 0.0 0.0 0.0 0.8993 0.4668 0.4668
0.4169 0.48 60 0.3465 0.0700 0.0497 0.0581 0.9048 0.4707 0.4571
0.3508 0.56 70 0.3040 0.0657 0.0563 0.0606 0.9077 0.4490 0.4393
0.3154 0.64 80 0.2654 0.1816 0.2953 0.2249 0.9282 0.4327 0.4309
0.2339 0.72 90 0.2073 0.3894 0.5246 0.4470 0.9515 0.5806 0.5670
0.1968 0.8 100 0.1686 0.5405 0.6360 0.5844 0.9598 0.6226 0.6072
0.1735 0.88 110 0.1382 0.6907 0.7131 0.7017 0.9694 0.7473 0.7231
0.1437 0.96 120 0.1212 0.7216 0.7537 0.7373 0.9714 0.7657 0.7428
0.1291 1.04 130 0.1025 0.7340 0.7621 0.7478 0.9741 0.7844 0.7639
0.1257 1.12 140 0.0909 0.7417 0.7807 0.7607 0.9767 0.7808 0.7657
0.0992 1.2 150 0.0811 0.7482 0.7876 0.7674 0.9792 0.7838 0.7720
0.1277 1.28 160 0.0751 0.7677 0.7996 0.7833 0.9807 0.7929 0.7823
0.1016 1.36 170 0.0713 0.7950 0.8131 0.8039 0.9818 0.8068 0.7968
0.0999 1.44 180 0.0697 0.8171 0.8009 0.8089 0.9814 0.8152 0.8062
0.0945 1.52 190 0.0632 0.8157 0.8471 0.8311 0.9839 0.8137 0.8007
0.1008 1.6 200 0.0667 0.8013 0.8557 0.8276 0.9817 0.8098 0.7938
0.0812 1.68 210 0.0611 0.8335 0.8595 0.8463 0.9835 0.8370 0.8234
0.0958 1.76 220 0.0578 0.8436 0.8593 0.8514 0.9849 0.8427 0.8279
0.0791 1.84 230 0.0575 0.8362 0.8634 0.8496 0.9844 0.8457 0.8255
0.0675 1.92 240 0.0535 0.8493 0.8690 0.8590 0.9861 0.8638 0.8454
0.0687 2.0 250 0.0522 0.8620 0.8732 0.8676 0.9868 0.8650 0.8530
0.0608 2.08 260 0.0514 0.8662 0.8737 0.8699 0.9868 0.8711 0.8536
0.0682 2.16 270 0.0524 0.8516 0.8724 0.8619 0.9859 0.8617 0.8457
0.0668 2.24 280 0.0544 0.8630 0.8700 0.8665 0.9859 0.8632 0.8481
0.0605 2.32 290 0.0507 0.8629 0.8737 0.8683 0.9860 0.8632 0.8496
0.0566 2.4 300 0.0502 0.8710 0.8762 0.8736 0.9872 0.8771 0.8566
0.0621 2.48 310 0.0493 0.8725 0.8737 0.8731 0.9872 0.8732 0.8578
0.0613 2.56 320 0.0495 0.8646 0.8698 0.8672 0.9866 0.8687 0.8518
0.056 2.64 330 0.0492 0.8629 0.8867 0.8746 0.9865 0.8678 0.8533
0.0615 2.72 340 0.0463 0.8623 0.8812 0.8716 0.9873 0.8741 0.8599
0.0562 2.8 350 0.0466 0.8621 0.8901 0.8759 0.9871 0.8671 0.8557
0.0624 2.88 360 0.0469 0.8671 0.8884 0.8776 0.9874 0.8750 0.8629
0.0516 2.96 370 0.0476 0.8712 0.8818 0.8764 0.9872 0.8723 0.8611
0.0588 3.04 380 0.0466 0.8674 0.8936 0.8803 0.9872 0.8753 0.8629
0.0532 3.12 390 0.0490 0.8699 0.8919 0.8807 0.9863 0.8705 0.8584
0.0551 3.2 400 0.0463 0.8828 0.8901 0.8864 0.9878 0.8835 0.8723
0.0606 3.28 410 0.0468 0.8756 0.8874 0.8814 0.9871 0.8768 0.8650
0.0521 3.36 420 0.0480 0.8632 0.8929 0.8778 0.9869 0.8726 0.8545
0.0456 3.44 430 0.0470 0.8796 0.8668 0.8732 0.9870 0.8747 0.8590
0.057 3.52 440 0.0459 0.8712 0.8904 0.8807 0.9878 0.8816 0.8653
0.0533 3.6 450 0.0455 0.8749 0.8940 0.8843 0.9874 0.8744 0.8581
0.0493 3.68 460 0.0452 0.8679 0.9021 0.8847 0.9870 0.8717 0.8593
0.0564 3.76 470 0.0463 0.8824 0.8820 0.8822 0.9871 0.8798 0.8629
0.0509 3.84 480 0.0432 0.8802 0.9002 0.8901 0.9879 0.8838 0.8729
0.0455 3.92 490 0.0432 0.8724 0.8934 0.8828 0.9880 0.8825 0.8687
0.0494 4.0 500 0.0446 0.8701 0.8996 0.8846 0.9872 0.8786 0.8611
0.0469 4.08 510 0.0445 0.8805 0.8981 0.8892 0.9876 0.8819 0.8668
0.0456 4.16 520 0.0433 0.8746 0.9032 0.8887 0.9879 0.8762 0.8678
0.0431 4.24 530 0.0460 0.8827 0.8929 0.8878 0.9875 0.8795 0.8678
0.0371 4.32 540 0.0442 0.8727 0.8981 0.8852 0.9877 0.8801 0.8675
0.0409 4.4 550 0.0455 0.8744 0.8934 0.8838 0.9872 0.8810 0.8638
0.0476 4.48 560 0.0446 0.8675 0.9002 0.8836 0.9871 0.8789 0.8635
0.038 4.56 570 0.0446 0.8692 0.9124 0.8903 0.9876 0.8783 0.8693
0.0483 4.64 580 0.0438 0.8658 0.9120 0.8883 0.9875 0.8732 0.8626
0.0484 4.72 590 0.0435 0.8908 0.9051 0.8979 0.9885 0.8862 0.8759
0.0392 4.8 600 0.0453 0.8813 0.9045 0.8927 0.9875 0.8825 0.8723
0.0416 4.88 610 0.0451 0.8865 0.8914 0.8890 0.9873 0.8807 0.8702
0.0417 4.96 620 0.0459 0.8835 0.8784 0.8809 0.9869 0.8693 0.8620
0.0284 5.04 630 0.0456 0.8812 0.8910 0.8861 0.9872 0.8765 0.8687
0.0355 5.12 640 0.0459 0.8802 0.9096 0.8947 0.9876 0.8819 0.8720
0.0433 5.2 650 0.0440 0.8795 0.9017 0.8905 0.9877 0.8822 0.8702
0.032 5.28 660 0.0445 0.8876 0.9047 0.8961 0.9880 0.8825 0.8711
0.0382 5.36 670 0.0441 0.8815 0.9002 0.8908 0.9879 0.8819 0.8696
0.0326 5.44 680 0.0445 0.8776 0.9047 0.8910 0.9877 0.8832 0.8711
0.0391 5.52 690 0.0430 0.8896 0.9009 0.8952 0.9883 0.8883 0.8771
0.0377 5.6 700 0.0429 0.8934 0.9024 0.8978 0.9886 0.8907 0.8825
0.0417 5.68 710 0.0453 0.8800 0.9135 0.8964 0.9877 0.8807 0.8711
0.0402 5.76 720 0.0427 0.8937 0.9109 0.9022 0.9889 0.8919 0.8819
0.0351 5.84 730 0.0450 0.8837 0.8951 0.8894 0.9876 0.8798 0.8714
0.0295 5.92 740 0.0430 0.8945 0.8983 0.8964 0.9882 0.8835 0.8762
0.0379 6.0 750 0.0438 0.8876 0.9062 0.8968 0.9882 0.8874 0.8789
0.0314 6.08 760 0.0448 0.8831 0.9094 0.8961 0.9878 0.8822 0.8726
0.026 6.16 770 0.0452 0.8805 0.9154 0.8976 0.9882 0.8841 0.8750
0.0277 6.24 780 0.0443 0.8874 0.8996 0.8934 0.9881 0.8819 0.8741
0.0329 6.32 790 0.0459 0.8987 0.9024 0.9005 0.9883 0.8868 0.8795
0.0334 6.4 800 0.0446 0.8939 0.9088 0.9013 0.9883 0.8886 0.8783
0.0263 6.48 810 0.0474 0.8857 0.9073 0.8963 0.9875 0.8871 0.8753
0.0415 6.56 820 0.0432 0.8854 0.9116 0.8983 0.9886 0.8883 0.8786
0.0322 6.64 830 0.0424 0.8871 0.9101 0.8984 0.9887 0.8868 0.8786
0.0317 6.72 840 0.0444 0.8964 0.9064 0.9014 0.9883 0.8868 0.8801
0.0342 6.8 850 0.0457 0.8927 0.9011 0.8968 0.9873 0.8798 0.8744
0.0296 6.88 860 0.0451 0.8921 0.9069 0.8994 0.9882 0.8859 0.8789
0.0353 6.96 870 0.0442 0.8870 0.9096 0.8982 0.9880 0.8829 0.8747
0.027 7.04 880 0.0465 0.8880 0.9099 0.8988 0.9877 0.8871 0.8774
0.0265 7.12 890 0.0488 0.8835 0.9060 0.8946 0.9872 0.8816 0.8747
0.0272 7.2 900 0.0474 0.8909 0.8974 0.8942 0.9873 0.8813 0.8720
0.0272 7.28 910 0.0474 0.8822 0.9060 0.8939 0.9876 0.8822 0.8738
0.0309 7.36 920 0.0448 0.8909 0.9075 0.8991 0.9884 0.8816 0.8756
0.0295 7.44 930 0.0442 0.8934 0.9049 0.8991 0.9885 0.8877 0.8795
0.032 7.52 940 0.0473 0.8856 0.9154 0.9003 0.9876 0.8829 0.8750
0.0244 7.6 950 0.0440 0.8949 0.9154 0.9050 0.9884 0.8859 0.8789
0.0231 7.68 960 0.0451 0.8995 0.9088 0.9041 0.9885 0.8895 0.8810
0.0285 7.76 970 0.0455 0.8978 0.9049 0.9014 0.9883 0.8871 0.8792
0.0245 7.84 980 0.0451 0.8872 0.9099 0.8984 0.9882 0.8853 0.8768
0.0299 7.92 990 0.0460 0.8861 0.9060 0.8959 0.9879 0.8841 0.8744
0.0245 8.0 1000 0.0452 0.8886 0.9105 0.8994 0.9880 0.8816 0.8741
0.021 8.08 1010 0.0464 0.8929 0.9051 0.8990 0.9880 0.8841 0.8738
0.0297 8.16 1020 0.0460 0.8882 0.9165 0.9021 0.9883 0.8825 0.8750
0.0254 8.24 1030 0.0455 0.8947 0.9120 0.9033 0.9889 0.8910 0.8838
0.0233 8.32 1040 0.0473 0.8949 0.9011 0.8980 0.9884 0.8883 0.8792
0.0243 8.4 1050 0.0458 0.9009 0.9015 0.9012 0.9887 0.8895 0.8825
0.028 8.48 1060 0.0445 0.8981 0.9139 0.9060 0.9891 0.8901 0.8838
0.027 8.56 1070 0.0446 0.9045 0.9090 0.9068 0.9893 0.8937 0.8865
0.0196 8.64 1080 0.0441 0.9027 0.9099 0.9063 0.9891 0.8916 0.8838
0.0352 8.72 1090 0.0447 0.8980 0.9143 0.9061 0.9891 0.8955 0.8874
0.0275 8.8 1100 0.0446 0.9011 0.9128 0.9069 0.9892 0.8943 0.8868
0.0233 8.88 1110 0.0461 0.8999 0.9051 0.9025 0.9885 0.8822 0.8765
0.0292 8.96 1120 0.0461 0.8960 0.9131 0.9044 0.9885 0.8868 0.8801
0.0144 9.04 1130 0.0456 0.8991 0.9141 0.9066 0.9886 0.8886 0.8804
0.024 9.12 1140 0.0471 0.9030 0.9113 0.9072 0.9891 0.8946 0.8868
0.0239 9.2 1150 0.0468 0.8941 0.9056 0.8998 0.9886 0.8883 0.8789
0.0282 9.28 1160 0.0483 0.8995 0.9062 0.9028 0.9887 0.8931 0.8822
0.0189 9.36 1170 0.0468 0.8997 0.8974 0.8986 0.9885 0.8880 0.8807
0.0232 9.44 1180 0.0449 0.9041 0.9128 0.9085 0.9891 0.8955 0.8889
0.0242 9.52 1190 0.0477 0.9082 0.9113 0.9098 0.9890 0.8986 0.8898
0.0245 9.6 1200 0.0456 0.9002 0.9058 0.9030 0.9885 0.8892 0.8804
0.0231 9.68 1210 0.0463 0.8967 0.9122 0.9044 0.9885 0.8886 0.8810
0.0231 9.76 1220 0.0445 0.8963 0.9199 0.9080 0.9888 0.8910 0.8850
0.0212 9.84 1230 0.0455 0.8931 0.9122 0.9025 0.9887 0.8898 0.8816
0.0217 9.92 1240 0.0483 0.8990 0.9056 0.9023 0.9881 0.8886 0.8816
0.0221 10.0 1250 0.0490 0.8960 0.9060 0.9010 0.9882 0.8898 0.8810
0.0236 10.08 1260 0.0498 0.8953 0.9028 0.8990 0.9882 0.8835 0.8753
0.0229 10.16 1270 0.0483 0.8900 0.9148 0.9022 0.9886 0.8856 0.8780
0.0224 10.24 1280 0.0478 0.8990 0.9116 0.9053 0.9888 0.8883 0.8795
0.0163 10.32 1290 0.0488 0.9023 0.9073 0.9048 0.9886 0.8865 0.8792
0.0213 10.4 1300 0.0474 0.8986 0.9073 0.9029 0.9888 0.8838 0.8774
0.0206 10.48 1310 0.0515 0.8910 0.9069 0.8989 0.9878 0.8822 0.8723
0.0212 10.56 1320 0.0484 0.8923 0.9150 0.9035 0.9885 0.8865 0.8798
0.023 10.64 1330 0.0487 0.8934 0.9131 0.9031 0.9885 0.8868 0.8783
0.0221 10.72 1340 0.0472 0.9016 0.9141 0.9078 0.9888 0.8901 0.8847
0.017 10.8 1350 0.0471 0.8937 0.9167 0.9051 0.9885 0.8874 0.8780
0.0246 10.88 1360 0.0470 0.9004 0.9017 0.9010 0.9885 0.8856 0.8795
0.0186 10.96 1370 0.0472 0.9006 0.9094 0.9050 0.9884 0.8892 0.8819
0.018 11.04 1380 0.0485 0.8961 0.9176 0.9067 0.9882 0.8859 0.8786
0.0205 11.12 1390 0.0509 0.8968 0.9079 0.9023 0.9879 0.8847 0.8753
0.0157 11.2 1400 0.0491 0.8996 0.9058 0.9027 0.9883 0.8835 0.8771
0.0202 11.28 1410 0.0481 0.8923 0.9193 0.9056 0.9886 0.8868 0.8807
0.016 11.36 1420 0.0484 0.8984 0.9150 0.9066 0.9887 0.8883 0.8804
0.0219 11.44 1430 0.0474 0.8996 0.9191 0.9092 0.9889 0.8880 0.8813
0.0193 11.52 1440 0.0484 0.8951 0.9096 0.9023 0.9886 0.8853 0.8774
0.0191 11.6 1450 0.0462 0.9006 0.9062 0.9034 0.9888 0.8822 0.8774
0.0187 11.68 1460 0.0470 0.9002 0.9113 0.9057 0.9888 0.8907 0.8829
0.0152 11.76 1470 0.0492 0.9021 0.9096 0.9059 0.9885 0.8862 0.8804
0.0182 11.84 1480 0.0508 0.8947 0.9135 0.9040 0.9881 0.8862 0.8768
0.0277 11.92 1490 0.0481 0.8985 0.9171 0.9077 0.9886 0.8895 0.8838
0.0193 12.0 1500 0.0484 0.8993 0.9086 0.9039 0.9887 0.8874 0.8813
0.0146 12.08 1510 0.0477 0.9038 0.9116 0.9077 0.9891 0.8916 0.8856
0.0173 12.16 1520 0.0494 0.9001 0.9126 0.9063 0.9887 0.8871 0.8813
0.0134 12.24 1530 0.0483 0.9033 0.9165 0.9099 0.9889 0.8892 0.8841
0.0193 12.32 1540 0.0482 0.9019 0.9171 0.9094 0.9890 0.8904 0.8853
0.0219 12.4 1550 0.0487 0.9006 0.9118 0.9062 0.9886 0.8892 0.8813
0.0145 12.48 1560 0.0482 0.9057 0.9131 0.9094 0.9889 0.8904 0.8841
0.0179 12.56 1570 0.0492 0.9006 0.9135 0.9070 0.9888 0.8883 0.8804
0.0167 12.64 1580 0.0490 0.9013 0.9171 0.9091 0.9888 0.8895 0.8819
0.0203 12.72 1590 0.0483 0.9055 0.9126 0.9090 0.9891 0.8940 0.8865
0.0196 12.8 1600 0.0489 0.8997 0.9120 0.9058 0.9888 0.8889 0.8832
0.0197 12.88 1610 0.0494 0.8982 0.9141 0.9061 0.9887 0.8886 0.8810
0.0132 12.96 1620 0.0489 0.9081 0.9094 0.9087 0.9887 0.8886 0.8822
0.0202 13.04 1630 0.0495 0.9021 0.9173 0.9097 0.9887 0.8877 0.8813
0.0113 13.12 1640 0.0487 0.9033 0.9180 0.9106 0.9889 0.8937 0.8871
0.0167 13.2 1650 0.0505 0.9028 0.9128 0.9078 0.9887 0.8895 0.8822
0.0141 13.28 1660 0.0509 0.9004 0.9135 0.9069 0.9886 0.8883 0.8822
0.0209 13.36 1670 0.0501 0.8960 0.9135 0.9047 0.9883 0.8871 0.8804
0.0163 13.44 1680 0.0501 0.9015 0.9099 0.9057 0.9886 0.8901 0.8832
0.0177 13.52 1690 0.0504 0.9028 0.9167 0.9097 0.9888 0.8895 0.8829
0.0211 13.6 1700 0.0511 0.8987 0.9120 0.9053 0.9882 0.8847 0.8783
0.0255 13.68 1710 0.0494 0.8955 0.9152 0.9052 0.9884 0.8844 0.8801
0.0179 13.76 1720 0.0504 0.9036 0.9077 0.9057 0.9887 0.8910 0.8847
0.0122 13.84 1730 0.0494 0.8976 0.9143 0.9059 0.9891 0.8907 0.8850
0.0119 13.92 1740 0.0497 0.8961 0.9139 0.9049 0.9886 0.8859 0.8804
0.0126 14.0 1750 0.0495 0.9024 0.9143 0.9083 0.9886 0.8883 0.8822
0.0155 14.08 1760 0.0500 0.9055 0.9131 0.9093 0.9888 0.8919 0.8856
0.0175 14.16 1770 0.0506 0.9003 0.9131 0.9067 0.9885 0.8898 0.8832
0.0178 14.24 1780 0.0507 0.9007 0.9128 0.9067 0.9887 0.8856 0.8795
0.0176 14.32 1790 0.0521 0.9005 0.9092 0.9048 0.9885 0.8871 0.8807
0.0209 14.4 1800 0.0520 0.8971 0.9131 0.9050 0.9883 0.8868 0.8810
0.0121 14.48 1810 0.0503 0.9014 0.9101 0.9057 0.9886 0.8865 0.8816
0.0164 14.56 1820 0.0505 0.9043 0.9182 0.9112 0.9890 0.8913 0.8853
0.0162 14.64 1830 0.0503 0.9035 0.9137 0.9085 0.9889 0.8898 0.8844
0.0113 14.72 1840 0.0500 0.9023 0.9156 0.9089 0.9888 0.8877 0.8822
0.0155 14.8 1850 0.0509 0.8998 0.9133 0.9065 0.9887 0.8886 0.8832
0.0136 14.88 1860 0.0511 0.9005 0.9124 0.9064 0.9888 0.8889 0.8832
0.0131 14.96 1870 0.0501 0.9048 0.9141 0.9095 0.9890 0.8898 0.8847
0.013 15.04 1880 0.0506 0.9017 0.9154 0.9085 0.9889 0.8889 0.8832
0.0171 15.12 1890 0.0502 0.9032 0.9154 0.9093 0.9889 0.8901 0.8859
0.0142 15.2 1900 0.0499 0.9011 0.9135 0.9073 0.9888 0.8892 0.8847
0.0151 15.28 1910 0.0507 0.9021 0.9137 0.9079 0.9889 0.8889 0.8825
0.0169 15.36 1920 0.0510 0.9057 0.9109 0.9083 0.9890 0.8907 0.8847
0.0154 15.44 1930 0.0509 0.9017 0.9137 0.9077 0.9891 0.8916 0.8850
0.0104 15.52 1940 0.0506 0.9021 0.9154 0.9087 0.9890 0.8898 0.8850
0.0135 15.6 1950 0.0505 0.9057 0.9111 0.9084 0.9890 0.8907 0.8865
0.0145 15.68 1960 0.0511 0.9022 0.9141 0.9081 0.9890 0.8883 0.8841
0.0178 15.76 1970 0.0507 0.9029 0.9137 0.9083 0.9891 0.8907 0.8862
0.0125 15.84 1980 0.0510 0.9018 0.9105 0.9061 0.9888 0.8895 0.8841
0.0134 15.92 1990 0.0516 0.8994 0.9173 0.9083 0.9888 0.8871 0.8813
0.0163 16.0 2000 0.0509 0.9025 0.9101 0.9063 0.9888 0.8865 0.8816
0.0135 16.08 2010 0.0507 0.9018 0.9128 0.9073 0.9889 0.8874 0.8825
0.0102 16.16 2020 0.0510 0.9032 0.9116 0.9074 0.9888 0.8883 0.8829
0.0103 16.24 2030 0.0515 0.9047 0.9152 0.9099 0.9888 0.8892 0.8844
0.0194 16.32 2040 0.0517 0.9059 0.9137 0.9098 0.9888 0.8895 0.8850
0.0119 16.4 2050 0.0515 0.9043 0.9148 0.9095 0.9888 0.8880 0.8832
0.015 16.48 2060 0.0518 0.8993 0.9141 0.9067 0.9886 0.8853 0.8801
0.0118 16.56 2070 0.0514 0.9015 0.9116 0.9065 0.9887 0.8868 0.8813
0.0107 16.64 2080 0.0516 0.8981 0.9137 0.9058 0.9887 0.8865 0.8807
0.0142 16.72 2090 0.0521 0.8980 0.9124 0.9052 0.9886 0.8865 0.8795
0.0189 16.8 2100 0.0514 0.9043 0.9126 0.9085 0.9888 0.8877 0.8816
0.0143 16.88 2110 0.0510 0.9036 0.9131 0.9083 0.9889 0.8871 0.8825
0.0107 16.96 2120 0.0514 0.8978 0.9161 0.9068 0.9888 0.8862 0.8813
0.0137 17.04 2130 0.0511 0.9039 0.9124 0.9081 0.9887 0.8889 0.8829
0.0115 17.12 2140 0.0513 0.9031 0.9099 0.9065 0.9887 0.8892 0.8829
0.0113 17.2 2150 0.0518 0.9004 0.9133 0.9068 0.9887 0.8880 0.8810
0.0151 17.28 2160 0.0517 0.9025 0.9135 0.9079 0.9888 0.8886 0.8819
0.0192 17.36 2170 0.0515 0.9043 0.9141 0.9092 0.9889 0.8883 0.8829
0.0104 17.44 2180 0.0513 0.9027 0.9141 0.9084 0.9888 0.8877 0.8822
0.0122 17.52 2190 0.0512 0.9032 0.9152 0.9092 0.9889 0.8871 0.8822
0.0142 17.6 2200 0.0510 0.9046 0.9141 0.9094 0.9889 0.8874 0.8829
0.0145 17.68 2210 0.0511 0.9042 0.9139 0.9091 0.9888 0.8871 0.8829
0.0129 17.76 2220 0.0512 0.9025 0.9137 0.9081 0.9888 0.8883 0.8838
0.0115 17.84 2230 0.0513 0.9044 0.9141 0.9093 0.9888 0.8889 0.8841
0.0119 17.92 2240 0.0516 0.9014 0.9158 0.9086 0.9889 0.8877 0.8822
0.0162 18.0 2250 0.0515 0.9033 0.9143 0.9088 0.9888 0.8895 0.8841
0.0091 18.08 2260 0.0515 0.9026 0.9131 0.9078 0.9888 0.8883 0.8829
0.0115 18.16 2270 0.0515 0.9027 0.9137 0.9082 0.9888 0.8883 0.8832
0.0121 18.24 2280 0.0514 0.9022 0.9146 0.9083 0.9888 0.8874 0.8829
0.012 18.32 2290 0.0512 0.9028 0.9131 0.9079 0.9888 0.8868 0.8825
0.0188 18.4 2300 0.0512 0.9021 0.9113 0.9067 0.9887 0.8859 0.8816
0.0132 18.48 2310 0.0513 0.9021 0.9131 0.9075 0.9888 0.8871 0.8825
0.0157 18.56 2320 0.0514 0.9016 0.9141 0.9078 0.9888 0.8877 0.8825
0.0146 18.64 2330 0.0515 0.9018 0.9143 0.9080 0.9888 0.8886 0.8835
0.0122 18.72 2340 0.0515 0.9016 0.9148 0.9082 0.9888 0.8889 0.8838
0.0135 18.8 2350 0.0516 0.9014 0.9146 0.9080 0.9888 0.8886 0.8835
0.0166 18.88 2360 0.0516 0.9016 0.9139 0.9077 0.9888 0.8886 0.8838
0.0117 18.96 2370 0.0516 0.9012 0.9141 0.9076 0.9888 0.8883 0.8835
0.0107 19.04 2380 0.0516 0.9024 0.9143 0.9083 0.9888 0.8886 0.8841
0.0108 19.12 2390 0.0515 0.9026 0.9143 0.9084 0.9888 0.8883 0.8838
0.0108 19.2 2400 0.0515 0.9023 0.9141 0.9082 0.9888 0.8880 0.8835
0.0116 19.28 2410 0.0515 0.9033 0.9139 0.9086 0.9888 0.8880 0.8835
0.013 19.36 2420 0.0515 0.9029 0.9137 0.9083 0.9888 0.8880 0.8835
0.0127 19.44 2430 0.0515 0.9027 0.9137 0.9082 0.9888 0.8880 0.8835
0.0119 19.52 2440 0.0515 0.9025 0.9137 0.9081 0.9888 0.8877 0.8832
0.0148 19.6 2450 0.0515 0.9023 0.9137 0.9080 0.9888 0.8877 0.8832
0.0151 19.68 2460 0.0515 0.9023 0.9137 0.9080 0.9888 0.8877 0.8832
0.0095 19.76 2470 0.0515 0.9023 0.9137 0.9080 0.9888 0.8877 0.8832
0.0132 19.84 2480 0.0515 0.9023 0.9137 0.9080 0.9888 0.8877 0.8832
0.0117 19.92 2490 0.0515 0.9025 0.9137 0.9081 0.9888 0.8877 0.8832
0.0126 20.0 2500 0.0515 0.9025 0.9137 0.9081 0.9888 0.8877 0.8832

Framework versions

  • Transformers 4.50.0.dev0
  • Pytorch 2.6.0+cu126
  • Datasets 3.4.1
  • Tokenizers 0.21.1
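The framework versions above can be pinned when reproducing this environment. A sketch, with one caveat: Transformers 4.50.0.dev0 is a development build and is not on PyPI, so it must be installed from source.

```shell
# Pin the released packages listed above.
pip install "torch==2.6.0" "datasets==3.4.1" "tokenizers==0.21.1"
# Transformers 4.50.0.dev0 is a dev build; install from the main branch.
pip install "git+https://github.com/huggingface/transformers"
```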
Model size: 15M params (F32, safetensors)