---
base_model: ProsusAI/finbert
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: finBert_SA_20e
    results: []
---

# finBert_SA_20e

This model is a fine-tuned version of [ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3343
- Accuracy: 0.8882
- F1: 0.8878
- Precision: 0.8875
- Recall: 0.8889
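
The model can be loaded with the standard `transformers` text-classification pipeline. The snippet below is a minimal sketch; the repository id `vinh120203/finBert_SA_20e` is assumed from the model name and may need to be adjusted, and the label names depend on the fine-tuning setup.

```python
from transformers import pipeline

# Repository id is assumed from the model name; adjust if the model lives elsewhere.
classifier = pipeline("text-classification", model="vinh120203/finBert_SA_20e")

print(classifier("The company reported record quarterly earnings and raised its full-year guidance."))
# Output has the shape [{'label': ..., 'score': ...}]; label names come from the fine-tuned config.
```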

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
- mixed_precision_training: Native AMP
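
These settings map directly onto the Hugging Face `TrainingArguments` consumed by `Trainer`. The sketch below is illustrative only: the model, tokenizer, and dataset setup are omitted, the output directory is a placeholder, and the 50-step evaluation interval is inferred from the results table rather than documented.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters above; this is not the original training script.
training_args = TrainingArguments(
    output_dir="finBert_SA_20e",       # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,                          # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=50,                      # inferred from the 50-step evaluation interval in the results table
)
```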

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.2897 | 0.1323 | 50 | 0.5582 | 0.7846 | 0.7828 | 0.7861 | 0.7858 |
| 0.5094 | 0.2646 | 100 | 0.4590 | 0.8304 | 0.8249 | 0.8329 | 0.8315 |
| 0.4218 | 0.3968 | 150 | 0.4386 | 0.8467 | 0.8441 | 0.8473 | 0.8476 |
| 0.3986 | 0.5291 | 200 | 0.3985 | 0.8508 | 0.8487 | 0.8505 | 0.8516 |
| 0.3866 | 0.6614 | 250 | 0.3847 | 0.8553 | 0.8563 | 0.8584 | 0.8554 |
| 0.391 | 0.7937 | 300 | 0.3528 | 0.8707 | 0.8703 | 0.8699 | 0.8713 |
| 0.3739 | 0.9259 | 350 | 0.3515 | 0.8697 | 0.8691 | 0.8688 | 0.8702 |
| 0.3499 | 1.0582 | 400 | 0.3539 | 0.8656 | 0.8670 | 0.8696 | 0.8659 |
| 0.3211 | 1.1905 | 450 | 0.3659 | 0.8728 | 0.8719 | 0.8718 | 0.8734 |
| 0.312 | 1.3228 | 500 | 0.3514 | 0.8726 | 0.8721 | 0.8722 | 0.8730 |
| 0.3001 | 1.4550 | 550 | 0.3325 | 0.8738 | 0.8749 | 0.8765 | 0.8741 |
| 0.2968 | 1.5873 | 600 | 0.3355 | 0.8756 | 0.8751 | 0.8752 | 0.8761 |
| 0.2893 | 1.7196 | 650 | 0.3639 | 0.8718 | 0.8700 | 0.8717 | 0.8725 |
| 0.3084 | 1.8519 | 700 | 0.3285 | 0.8817 | 0.8812 | 0.8810 | 0.8822 |
| 0.2838 | 1.9841 | 750 | 0.3390 | 0.8833 | 0.8833 | 0.8832 | 0.8838 |
| 0.21 | 2.1164 | 800 | 0.3772 | 0.8809 | 0.8808 | 0.8810 | 0.8815 |
| 0.2195 | 2.2487 | 850 | 0.3524 | 0.8820 | 0.8808 | 0.8814 | 0.8827 |
| 0.2109 | 2.3810 | 900 | 0.3530 | 0.8788 | 0.8776 | 0.8785 | 0.8794 |
| 0.2131 | 2.5132 | 950 | 0.3497 | 0.8838 | 0.8843 | 0.8846 | 0.8842 |
| 0.2028 | 2.6455 | 1000 | 0.3466 | 0.8876 | 0.8872 | 0.8870 | 0.8881 |
| 0.225 | 2.7778 | 1050 | 0.3544 | 0.8786 | 0.8804 | 0.8853 | 0.8786 |
| 0.212 | 2.9101 | 1100 | 0.3528 | 0.8850 | 0.8860 | 0.8878 | 0.8851 |
| 0.1993 | 3.0423 | 1150 | 0.3674 | 0.8896 | 0.8906 | 0.8923 | 0.8898 |
| 0.1349 | 3.1746 | 1200 | 0.3870 | 0.8904 | 0.8906 | 0.8906 | 0.8908 |
| 0.1371 | 3.3069 | 1250 | 0.3886 | 0.8876 | 0.8865 | 0.8871 | 0.8882 |
| 0.1587 | 3.4392 | 1300 | 0.3873 | 0.8857 | 0.8843 | 0.8856 | 0.8865 |
| 0.1406 | 3.5714 | 1350 | 0.4001 | 0.8833 | 0.8839 | 0.8852 | 0.8834 |
| 0.1884 | 3.7037 | 1400 | 0.3576 | 0.8861 | 0.8867 | 0.8875 | 0.8864 |
| 0.1539 | 3.8360 | 1450 | 0.3761 | 0.8927 | 0.8919 | 0.8922 | 0.8933 |
| 0.1426 | 3.9683 | 1500 | 0.3774 | 0.8902 | 0.8908 | 0.8917 | 0.8905 |
| 0.106 | 4.1005 | 1550 | 0.4692 | 0.8823 | 0.8836 | 0.8867 | 0.8825 |
| 0.0955 | 4.2328 | 1600 | 0.4351 | 0.8891 | 0.8891 | 0.8890 | 0.8894 |
| 0.1062 | 4.3651 | 1650 | 0.4300 | 0.8896 | 0.8889 | 0.8891 | 0.8901 |
| 0.1204 | 4.4974 | 1700 | 0.4308 | 0.8872 | 0.8862 | 0.8865 | 0.8878 |
| 0.0932 | 4.6296 | 1750 | 0.4403 | 0.8902 | 0.8905 | 0.8905 | 0.8906 |
| 0.1163 | 4.7619 | 1800 | 0.4199 | 0.8961 | 0.8954 | 0.8957 | 0.8966 |
| 0.1225 | 4.8942 | 1850 | 0.4211 | 0.8855 | 0.8844 | 0.8851 | 0.8861 |
| 0.1087 | 5.0265 | 1900 | 0.4421 | 0.8927 | 0.8929 | 0.8928 | 0.8931 |
| 0.0693 | 5.1587 | 1950 | 0.5357 | 0.8842 | 0.8847 | 0.8857 | 0.8843 |
| 0.0727 | 5.2910 | 2000 | 0.5088 | 0.8864 | 0.8862 | 0.8867 | 0.8867 |
| 0.0874 | 5.4233 | 2050 | 0.4516 | 0.8940 | 0.8943 | 0.8943 | 0.8943 |
| 0.0707 | 5.5556 | 2100 | 0.4983 | 0.8934 | 0.8935 | 0.8933 | 0.8938 |
| 0.0745 | 5.6878 | 2150 | 0.4946 | 0.8901 | 0.8903 | 0.8902 | 0.8905 |
| 0.0706 | 5.8201 | 2200 | 0.5088 | 0.8964 | 0.8958 | 0.8960 | 0.8970 |
| 0.0935 | 5.9524 | 2250 | 0.4664 | 0.8923 | 0.8917 | 0.8917 | 0.8927 |
| 0.0649 | 6.0847 | 2300 | 0.5200 | 0.8896 | 0.8899 | 0.8902 | 0.8897 |
| 0.0528 | 6.2169 | 2350 | 0.5412 | 0.8937 | 0.8941 | 0.8944 | 0.8939 |
| 0.0638 | 6.3492 | 2400 | 0.5140 | 0.8968 | 0.8962 | 0.8964 | 0.8973 |
| 0.0639 | 6.4815 | 2450 | 0.5087 | 0.8960 | 0.8957 | 0.8955 | 0.8964 |
| 0.0597 | 6.6138 | 2500 | 0.5272 | 0.8958 | 0.8951 | 0.8952 | 0.8963 |
| 0.0507 | 6.7460 | 2550 | 0.5685 | 0.8958 | 0.8950 | 0.8958 | 0.8965 |
| 0.062 | 6.8783 | 2600 | 0.5272 | 0.8944 | 0.8942 | 0.8940 | 0.8949 |
| 0.0604 | 7.0106 | 2650 | 0.5083 | 0.8988 | 0.8990 | 0.8989 | 0.8991 |
| 0.044 | 7.1429 | 2700 | 0.5663 | 0.8951 | 0.8959 | 0.8973 | 0.8953 |
| 0.0471 | 7.2751 | 2750 | 0.5610 | 0.8963 | 0.8964 | 0.8963 | 0.8967 |
| 0.0526 | 7.4074 | 2800 | 0.5725 | 0.8930 | 0.8937 | 0.8944 | 0.8932 |
| 0.0487 | 7.5397 | 2850 | 0.5943 | 0.8982 | 0.8981 | 0.8982 | 0.8986 |
| 0.0548 | 7.6720 | 2900 | 0.5556 | 0.9001 | 0.9003 | 0.9004 | 0.9004 |
| 0.0461 | 7.8042 | 2950 | 0.5452 | 0.9007 | 0.9004 | 0.9003 | 0.9011 |
| 0.041 | 7.9365 | 3000 | 0.5505 | 0.8978 | 0.8971 | 0.8974 | 0.8984 |
| 0.0388 | 8.0688 | 3050 | 0.6078 | 0.8981 | 0.8971 | 0.8980 | 0.8987 |
| 0.0318 | 8.2011 | 3100 | 0.6324 | 0.8947 | 0.8950 | 0.8953 | 0.8949 |
| 0.033 | 8.3333 | 3150 | 0.6211 | 0.8953 | 0.8956 | 0.8957 | 0.8956 |
| 0.0459 | 8.4656 | 3200 | 0.6161 | 0.8988 | 0.8990 | 0.8992 | 0.8992 |
| 0.0462 | 8.5979 | 3250 | 0.5925 | 0.8953 | 0.8954 | 0.8952 | 0.8956 |
| 0.0321 | 8.7302 | 3300 | 0.6416 | 0.8920 | 0.8914 | 0.8916 | 0.8925 |
| 0.0452 | 8.8624 | 3350 | 0.5777 | 0.8968 | 0.8967 | 0.8966 | 0.8972 |
| 0.0468 | 8.9947 | 3400 | 0.5743 | 0.8959 | 0.8964 | 0.8970 | 0.8962 |
| 0.0294 | 9.1270 | 3450 | 0.5977 | 0.8996 | 0.8992 | 0.8991 | 0.9000 |
| 0.0373 | 9.2593 | 3500 | 0.6051 | 0.8935 | 0.8944 | 0.8959 | 0.8937 |
| 0.035 | 9.3915 | 3550 | 0.6218 | 0.8986 | 0.8983 | 0.8982 | 0.8990 |
| 0.0304 | 9.5238 | 3600 | 0.6784 | 0.8926 | 0.8931 | 0.8938 | 0.8927 |
| 0.0464 | 9.6561 | 3650 | 0.6534 | 0.8968 | 0.8954 | 0.8973 | 0.8974 |
| 0.031 | 9.7884 | 3700 | 0.5966 | 0.8987 | 0.8986 | 0.8984 | 0.8990 |
| 0.0252 | 9.9206 | 3750 | 0.6065 | 0.8991 | 0.8988 | 0.8986 | 0.8994 |
| 0.0385 | 10.0529 | 3800 | 0.6120 | 0.8953 | 0.8945 | 0.8948 | 0.8958 |
| 0.0141 | 10.1852 | 3850 | 0.6305 | 0.8967 | 0.8971 | 0.8975 | 0.8969 |
| 0.0325 | 10.3175 | 3900 | 0.6163 | 0.9006 | 0.9002 | 0.9001 | 0.9010 |
| 0.0188 | 10.4497 | 3950 | 0.6286 | 0.9005 | 0.9002 | 0.9000 | 0.9009 |
| 0.0153 | 10.5820 | 4000 | 0.6769 | 0.8985 | 0.8988 | 0.8989 | 0.8987 |
| 0.03 | 10.7143 | 4050 | 0.6473 | 0.8970 | 0.8969 | 0.8969 | 0.8973 |
| 0.0286 | 10.8466 | 4100 | 0.6644 | 0.8991 | 0.8993 | 0.8994 | 0.8993 |
| 0.0311 | 10.9788 | 4150 | 0.6566 | 0.8989 | 0.8992 | 0.8994 | 0.8992 |
| 0.024 | 11.1111 | 4200 | 0.6562 | 0.9007 | 0.9011 | 0.9016 | 0.9010 |
| 0.0241 | 11.2434 | 4250 | 0.6290 | 0.9007 | 0.9008 | 0.9007 | 0.9011 |
| 0.0094 | 11.3757 | 4300 | 0.6739 | 0.9019 | 0.9015 | 0.9015 | 0.9023 |
| 0.0184 | 11.5079 | 4350 | 0.6819 | 0.8992 | 0.8994 | 0.8994 | 0.8995 |
| 0.017 | 11.6402 | 4400 | 0.6907 | 0.9037 | 0.9034 | 0.9033 | 0.9041 |
| 0.0275 | 11.7725 | 4450 | 0.6652 | 0.8983 | 0.8985 | 0.8984 | 0.8986 |
| 0.0138 | 11.9048 | 4500 | 0.6829 | 0.9013 | 0.9009 | 0.9008 | 0.9017 |
| 0.0173 | 12.0370 | 4550 | 0.6910 | 0.9016 | 0.9014 | 0.9012 | 0.9019 |
| 0.0129 | 12.1693 | 4600 | 0.7063 | 0.9018 | 0.9018 | 0.9017 | 0.9022 |
| 0.0173 | 12.3016 | 4650 | 0.7244 | 0.9015 | 0.9011 | 0.9011 | 0.9019 |
| 0.0223 | 12.4339 | 4700 | 0.7097 | 0.9013 | 0.9012 | 0.9010 | 0.9017 |
| 0.0179 | 12.5661 | 4750 | 0.7458 | 0.8967 | 0.8964 | 0.8963 | 0.8970 |
| 0.0162 | 12.6984 | 4800 | 0.7249 | 0.8987 | 0.8989 | 0.8988 | 0.8990 |
| 0.0144 | 12.8307 | 4850 | 0.7354 | 0.8990 | 0.8990 | 0.8987 | 0.8993 |
| 0.0189 | 12.9630 | 4900 | 0.7119 | 0.8999 | 0.8996 | 0.8994 | 0.9003 |
| 0.0097 | 13.0952 | 4950 | 0.7425 | 0.9012 | 0.9011 | 0.9009 | 0.9016 |
| 0.0122 | 13.2275 | 5000 | 0.7447 | 0.8991 | 0.8990 | 0.8990 | 0.8995 |
| 0.0171 | 13.3598 | 5050 | 0.7508 | 0.8980 | 0.8983 | 0.8986 | 0.8981 |
| 0.013 | 13.4921 | 5100 | 0.7380 | 0.9015 | 0.9016 | 0.9014 | 0.9017 |
| 0.0141 | 13.6243 | 5150 | 0.7380 | 0.9025 | 0.9026 | 0.9024 | 0.9028 |
| 0.0092 | 13.7566 | 5200 | 0.7636 | 0.8987 | 0.8992 | 0.8998 | 0.8989 |
| 0.0151 | 13.8889 | 5250 | 0.7474 | 0.9004 | 0.9008 | 0.9009 | 0.9006 |
| 0.0064 | 14.0212 | 5300 | 0.7812 | 0.8989 | 0.8992 | 0.8993 | 0.8992 |
| 0.0112 | 14.1534 | 5350 | 0.7392 | 0.9025 | 0.9022 | 0.9020 | 0.9029 |
| 0.008 | 14.2857 | 5400 | 0.7737 | 0.9035 | 0.9029 | 0.9034 | 0.9041 |
| 0.0042 | 14.4180 | 5450 | 0.7880 | 0.9012 | 0.9013 | 0.9012 | 0.9015 |
| 0.0084 | 14.5503 | 5500 | 0.7928 | 0.9025 | 0.9025 | 0.9023 | 0.9028 |
| 0.0054 | 14.6825 | 5550 | 0.8009 | 0.8990 | 0.8993 | 0.8993 | 0.8992 |
| 0.0099 | 14.8148 | 5600 | 0.7738 | 0.9020 | 0.9019 | 0.9017 | 0.9023 |
| 0.0087 | 14.9471 | 5650 | 0.8047 | 0.9023 | 0.9019 | 0.9021 | 0.9028 |
| 0.0136 | 15.0794 | 5700 | 0.7985 | 0.9018 | 0.9020 | 0.9020 | 0.9021 |
| 0.0048 | 15.2116 | 5750 | 0.8070 | 0.9027 | 0.9029 | 0.9030 | 0.9030 |
| 0.0083 | 15.3439 | 5800 | 0.8263 | 0.9025 | 0.9022 | 0.9026 | 0.9030 |
| 0.0038 | 15.4762 | 5850 | 0.8046 | 0.9040 | 0.9037 | 0.9036 | 0.9044 |
| 0.0098 | 15.6085 | 5900 | 0.7831 | 0.9028 | 0.9027 | 0.9025 | 0.9031 |
| 0.0107 | 15.7407 | 5950 | 0.7760 | 0.9034 | 0.9033 | 0.9031 | 0.9038 |
| 0.0099 | 15.8730 | 6000 | 0.8014 | 0.9015 | 0.9016 | 0.9016 | 0.9018 |
| 0.0087 | 16.0053 | 6050 | 0.7972 | 0.9022 | 0.9019 | 0.9018 | 0.9026 |
| 0.0034 | 16.1376 | 6100 | 0.8133 | 0.9001 | 0.9004 | 0.9006 | 0.9003 |
| 0.0063 | 16.2698 | 6150 | 0.7995 | 0.9028 | 0.9028 | 0.9026 | 0.9031 |
| 0.004 | 16.4021 | 6200 | 0.8010 | 0.9035 | 0.9033 | 0.9031 | 0.9038 |
| 0.0091 | 16.5344 | 6250 | 0.7946 | 0.9026 | 0.9023 | 0.9021 | 0.9029 |
| 0.0068 | 16.6667 | 6300 | 0.8044 | 0.9031 | 0.9032 | 0.9031 | 0.9035 |
| 0.0048 | 16.7989 | 6350 | 0.8205 | 0.9041 | 0.9038 | 0.9037 | 0.9045 |
| 0.0093 | 16.9312 | 6400 | 0.8196 | 0.9021 | 0.9020 | 0.9018 | 0.9024 |
| 0.0061 | 17.0635 | 6450 | 0.8313 | 0.9007 | 0.9009 | 0.9008 | 0.9010 |
| 0.0058 | 17.1958 | 6500 | 0.8315 | 0.9001 | 0.9003 | 0.9002 | 0.9005 |
| 0.0025 | 17.3280 | 6550 | 0.8407 | 0.9009 | 0.9010 | 0.9009 | 0.9012 |
| 0.0059 | 17.4603 | 6600 | 0.8447 | 0.8989 | 0.8992 | 0.8992 | 0.8992 |
| 0.0038 | 17.5926 | 6650 | 0.8379 | 0.9029 | 0.9027 | 0.9025 | 0.9033 |
| 0.0044 | 17.7249 | 6700 | 0.8374 | 0.9016 | 0.9017 | 0.9016 | 0.9019 |
| 0.0077 | 17.8571 | 6750 | 0.8314 | 0.9033 | 0.9029 | 0.9028 | 0.9037 |
| 0.0039 | 17.9894 | 6800 | 0.8312 | 0.9017 | 0.9013 | 0.9012 | 0.9021 |
| 0.0051 | 18.1217 | 6850 | 0.8277 | 0.9024 | 0.9021 | 0.9019 | 0.9028 |
| 0.0053 | 18.2540 | 6900 | 0.8340 | 0.9021 | 0.9021 | 0.9019 | 0.9024 |
| 0.0015 | 18.3862 | 6950 | 0.8395 | 0.9018 | 0.9018 | 0.9017 | 0.9021 |
| 0.0038 | 18.5185 | 7000 | 0.8436 | 0.9021 | 0.9022 | 0.9020 | 0.9025 |
| 0.0044 | 18.6508 | 7050 | 0.8463 | 0.9025 | 0.9023 | 0.9021 | 0.9028 |
| 0.0051 | 18.7831 | 7100 | 0.8470 | 0.9021 | 0.9020 | 0.9018 | 0.9025 |
| 0.0035 | 18.9153 | 7150 | 0.8476 | 0.9027 | 0.9027 | 0.9025 | 0.9031 |
| 0.0028 | 19.0476 | 7200 | 0.8485 | 0.9028 | 0.9027 | 0.9025 | 0.9032 |
| 0.0022 | 19.1799 | 7250 | 0.8495 | 0.9027 | 0.9027 | 0.9025 | 0.9031 |
| 0.0069 | 19.3122 | 7300 | 0.8527 | 0.9017 | 0.9019 | 0.9018 | 0.9020 |
| 0.0058 | 19.4444 | 7350 | 0.8535 | 0.9013 | 0.9015 | 0.9013 | 0.9016 |
| 0.0032 | 19.5767 | 7400 | 0.8536 | 0.9022 | 0.9022 | 0.9020 | 0.9025 |
| 0.004 | 19.7090 | 7450 | 0.8525 | 0.9023 | 0.9023 | 0.9021 | 0.9026 |
| 0.0039 | 19.8413 | 7500 | 0.8521 | 0.9021 | 0.9021 | 0.9019 | 0.9025 |
| 0.0017 | 19.9735 | 7550 | 0.8526 | 0.9024 | 0.9023 | 0.9021 | 0.9027 |
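
The accuracy, F1, precision, and recall columns are the kind of values produced by a `compute_metrics` callback passed to `Trainer`. The sketch below only illustrates how such metrics are typically computed; the weighted averaging is an assumption, not a documented setting of this run.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Illustrative metric function; the averaging strategy is assumed, not documented."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"  # assumption: weighted averaging
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```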

### Framework versions

- Transformers 4.44.0
- Pytorch 2.2.1+cu121
- Tokenizers 0.19.1