text-message-analyzer-finetuned

This model is a fine-tuned version of cardiffnlp/twitter-roberta-base-sentiment-latest on the daily_dialog dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5913
  • Accuracy: 0.762
  • F1: 0.7650
  • Precision: 0.7706
  • Recall: 0.762
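Note that Recall equals Accuracy (0.762) in these results, which is what you get when recall is computed with support-weighted averaging (e.g. the `average="weighted"` option common in evaluation scripts): the support-weighted mean of per-class recalls algebraically collapses to overall accuracy. A pure-Python sketch with toy labels (not the actual evaluation data) shows the identity:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_recall(y_true, y_pred):
    # Support-weighted recall: sum over classes of (support/N) * (TP/support),
    # which cancels to (total correct)/N, i.e. plain accuracy.
    support = Counter(y_true)
    n = len(y_true)
    total = 0.0
    for cls, sup in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        total += (sup / n) * (tp / sup)
    return total

y_true = [0, 0, 1, 1, 2, 2, 2, 1]
y_pred = [0, 1, 1, 1, 2, 0, 2, 2]
assert abs(weighted_recall(y_true, y_pred) - accuracy(y_true, y_pred)) < 1e-12
```

Weighted F1 and precision do not collapse this way, which is why those two columns differ from accuracy.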

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 15
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
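These hyperparameters map onto a `transformers` `TrainingArguments` configuration roughly like the sketch below. The `output_dir`, the 5-step evaluation cadence, and the 500-step logging interval are assumptions inferred from the results table (evaluations appear every 5 steps; training loss shows "No log" until step 500); Adam's betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

# Configuration sketch only; output_dir and the eval/logging cadence are
# assumptions, not confirmed settings from the original training run.
training_args = TrainingArguments(
    output_dir="text-message-analyzer-finetuned",  # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=15,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # assumption: matches the 5-step eval cadence
    eval_steps=5,
    logging_steps=500,            # assumption: "No log" rows end at step 500
)
```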

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log | 0.01 | 5 | 0.8666 | 0.589 | 0.5903 | 0.6104 | 0.589 |
| No log | 0.01 | 10 | 0.7596 | 0.661 | 0.6590 | 0.6603 | 0.661 |
| No log | 0.02 | 15 | 1.1783 | 0.521 | 0.5244 | 0.7242 | 0.521 |
| No log | 0.02 | 20 | 0.8909 | 0.615 | 0.6318 | 0.6910 | 0.615 |
| No log | 0.03 | 25 | 0.7995 | 0.666 | 0.6743 | 0.6918 | 0.666 |
| No log | 0.03 | 30 | 0.7699 | 0.65 | 0.6585 | 0.6935 | 0.65 |
| No log | 0.04 | 35 | 0.7344 | 0.662 | 0.6691 | 0.6857 | 0.662 |
| No log | 0.04 | 40 | 0.7326 | 0.654 | 0.6675 | 0.7036 | 0.654 |
| No log | 0.05 | 45 | 0.9608 | 0.603 | 0.5705 | 0.7211 | 0.603 |
| No log | 0.05 | 50 | 0.8593 | 0.628 | 0.6338 | 0.7262 | 0.628 |
| No log | 0.06 | 55 | 0.8635 | 0.626 | 0.6066 | 0.7400 | 0.626 |
| No log | 0.07 | 60 | 0.7101 | 0.682 | 0.6782 | 0.6911 | 0.682 |
| No log | 0.07 | 65 | 0.7569 | 0.67 | 0.6780 | 0.7067 | 0.67 |
| No log | 0.08 | 70 | 0.7694 | 0.653 | 0.6608 | 0.7271 | 0.653 |
| No log | 0.08 | 75 | 0.6941 | 0.691 | 0.6925 | 0.7202 | 0.691 |
| No log | 0.09 | 80 | 0.8646 | 0.606 | 0.6168 | 0.7450 | 0.606 |
| No log | 0.09 | 85 | 0.6853 | 0.677 | 0.6895 | 0.7369 | 0.677 |
| No log | 0.1 | 90 | 0.6410 | 0.727 | 0.7264 | 0.7272 | 0.727 |
| No log | 0.1 | 95 | 0.7059 | 0.693 | 0.7020 | 0.7410 | 0.693 |
| No log | 0.11 | 100 | 0.7398 | 0.665 | 0.6734 | 0.7441 | 0.665 |
| No log | 0.11 | 105 | 0.7205 | 0.683 | 0.6884 | 0.7243 | 0.683 |
| No log | 0.12 | 110 | 0.7492 | 0.661 | 0.6741 | 0.7410 | 0.661 |
| No log | 0.12 | 115 | 0.7273 | 0.676 | 0.6932 | 0.7388 | 0.676 |
| No log | 0.13 | 120 | 0.6670 | 0.678 | 0.6853 | 0.7079 | 0.678 |
| No log | 0.14 | 125 | 0.7238 | 0.663 | 0.6707 | 0.7348 | 0.663 |
| No log | 0.14 | 130 | 0.7109 | 0.68 | 0.6948 | 0.7333 | 0.68 |
| No log | 0.15 | 135 | 0.6813 | 0.685 | 0.6832 | 0.7324 | 0.685 |
| No log | 0.15 | 140 | 0.6859 | 0.692 | 0.7002 | 0.7304 | 0.692 |
| No log | 0.16 | 145 | 0.7968 | 0.622 | 0.6231 | 0.7268 | 0.622 |
| No log | 0.16 | 150 | 0.6754 | 0.695 | 0.7022 | 0.7212 | 0.695 |
| No log | 0.17 | 155 | 0.6520 | 0.698 | 0.6981 | 0.7296 | 0.698 |
| No log | 0.17 | 160 | 0.6198 | 0.726 | 0.7282 | 0.7334 | 0.726 |
| No log | 0.18 | 165 | 0.6745 | 0.703 | 0.6974 | 0.7346 | 0.703 |
| No log | 0.18 | 170 | 0.6724 | 0.707 | 0.7182 | 0.7486 | 0.707 |
| No log | 0.19 | 175 | 0.7787 | 0.636 | 0.6392 | 0.7409 | 0.636 |
| No log | 0.2 | 180 | 0.7098 | 0.667 | 0.6663 | 0.7338 | 0.667 |
| No log | 0.2 | 185 | 0.6340 | 0.728 | 0.7290 | 0.7340 | 0.728 |
| No log | 0.21 | 190 | 0.6561 | 0.698 | 0.7023 | 0.7229 | 0.698 |
| No log | 0.21 | 195 | 0.6790 | 0.678 | 0.6804 | 0.7318 | 0.678 |
| No log | 0.22 | 200 | 0.7213 | 0.654 | 0.6497 | 0.7337 | 0.654 |
| No log | 0.22 | 205 | 0.7410 | 0.652 | 0.6609 | 0.7242 | 0.652 |
| No log | 0.23 | 210 | 0.6848 | 0.703 | 0.7084 | 0.7332 | 0.703 |
| No log | 0.23 | 215 | 0.6946 | 0.689 | 0.6796 | 0.7291 | 0.689 |
| No log | 0.24 | 220 | 0.7092 | 0.674 | 0.6870 | 0.7311 | 0.674 |
| No log | 0.24 | 225 | 0.6285 | 0.705 | 0.7085 | 0.7295 | 0.705 |
| No log | 0.25 | 230 | 0.6449 | 0.696 | 0.6990 | 0.7166 | 0.696 |
| No log | 0.25 | 235 | 0.7303 | 0.671 | 0.6694 | 0.7366 | 0.671 |
| No log | 0.26 | 240 | 0.7583 | 0.67 | 0.6822 | 0.7399 | 0.67 |
| No log | 0.27 | 245 | 0.7154 | 0.678 | 0.6866 | 0.7443 | 0.678 |
| No log | 0.27 | 250 | 0.7337 | 0.686 | 0.6852 | 0.7369 | 0.686 |
| No log | 0.28 | 255 | 0.6486 | 0.711 | 0.7136 | 0.7362 | 0.711 |
| No log | 0.28 | 260 | 0.6231 | 0.736 | 0.7350 | 0.7410 | 0.736 |
| No log | 0.29 | 265 | 0.6963 | 0.709 | 0.7211 | 0.7532 | 0.709 |
| No log | 0.29 | 270 | 0.6847 | 0.693 | 0.7028 | 0.7403 | 0.693 |
| No log | 0.3 | 275 | 0.6581 | 0.696 | 0.6969 | 0.7464 | 0.696 |
| No log | 0.3 | 280 | 0.6182 | 0.702 | 0.7061 | 0.7187 | 0.702 |
| No log | 0.31 | 285 | 0.6653 | 0.682 | 0.6898 | 0.7144 | 0.682 |
| No log | 0.31 | 290 | 0.6917 | 0.699 | 0.7091 | 0.7372 | 0.699 |
| No log | 0.32 | 295 | 0.6722 | 0.704 | 0.7067 | 0.7285 | 0.704 |
| No log | 0.33 | 300 | 0.6582 | 0.703 | 0.7073 | 0.7238 | 0.703 |
| No log | 0.33 | 305 | 0.6568 | 0.687 | 0.6934 | 0.7146 | 0.687 |
| No log | 0.34 | 310 | 0.6912 | 0.665 | 0.6605 | 0.7292 | 0.665 |
| No log | 0.34 | 315 | 0.6223 | 0.71 | 0.7119 | 0.7311 | 0.71 |
| No log | 0.35 | 320 | 0.6409 | 0.714 | 0.7146 | 0.7244 | 0.714 |
| No log | 0.35 | 325 | 0.7169 | 0.689 | 0.7023 | 0.7385 | 0.689 |
| No log | 0.36 | 330 | 0.7887 | 0.649 | 0.6580 | 0.7435 | 0.649 |
| No log | 0.36 | 335 | 0.6594 | 0.694 | 0.6987 | 0.7111 | 0.694 |
| No log | 0.37 | 340 | 0.6559 | 0.713 | 0.7121 | 0.7137 | 0.713 |
| No log | 0.37 | 345 | 0.6490 | 0.686 | 0.6927 | 0.7076 | 0.686 |
| No log | 0.38 | 350 | 0.6964 | 0.67 | 0.6837 | 0.7424 | 0.67 |
| No log | 0.39 | 355 | 0.7011 | 0.669 | 0.6873 | 0.7460 | 0.669 |
| No log | 0.39 | 360 | 0.6987 | 0.668 | 0.6875 | 0.7409 | 0.668 |
| No log | 0.4 | 365 | 0.6375 | 0.696 | 0.7057 | 0.7340 | 0.696 |
| No log | 0.4 | 370 | 0.6365 | 0.695 | 0.6972 | 0.7270 | 0.695 |
| No log | 0.41 | 375 | 0.6212 | 0.712 | 0.7190 | 0.7488 | 0.712 |
| No log | 0.41 | 380 | 0.7102 | 0.667 | 0.6770 | 0.7532 | 0.667 |
| No log | 0.42 | 385 | 0.7385 | 0.66 | 0.6616 | 0.7498 | 0.66 |
| No log | 0.42 | 390 | 0.6221 | 0.723 | 0.7276 | 0.7533 | 0.723 |
| No log | 0.43 | 395 | 0.6174 | 0.74 | 0.7469 | 0.7651 | 0.74 |
| No log | 0.43 | 400 | 0.6092 | 0.748 | 0.7538 | 0.7644 | 0.748 |
| No log | 0.44 | 405 | 0.5978 | 0.737 | 0.7412 | 0.7483 | 0.737 |
| No log | 0.44 | 410 | 0.6645 | 0.697 | 0.6964 | 0.7402 | 0.697 |
| No log | 0.45 | 415 | 0.7153 | 0.67 | 0.6654 | 0.7372 | 0.67 |
| No log | 0.46 | 420 | 0.6236 | 0.728 | 0.7343 | 0.7560 | 0.728 |
| No log | 0.46 | 425 | 0.7162 | 0.682 | 0.6915 | 0.7441 | 0.682 |
| No log | 0.47 | 430 | 0.6658 | 0.712 | 0.7228 | 0.7530 | 0.712 |
| No log | 0.47 | 435 | 0.6350 | 0.725 | 0.7326 | 0.7535 | 0.725 |
| No log | 0.48 | 440 | 0.5977 | 0.725 | 0.7293 | 0.7378 | 0.725 |
| No log | 0.48 | 445 | 0.5900 | 0.722 | 0.7246 | 0.7312 | 0.722 |
| No log | 0.49 | 450 | 0.5993 | 0.716 | 0.7198 | 0.7327 | 0.716 |
| No log | 0.49 | 455 | 0.6322 | 0.711 | 0.7189 | 0.7450 | 0.711 |
| No log | 0.5 | 460 | 0.7598 | 0.668 | 0.6824 | 0.7507 | 0.668 |
| No log | 0.5 | 465 | 0.7033 | 0.7 | 0.7133 | 0.7620 | 0.7 |
| No log | 0.51 | 470 | 0.6343 | 0.726 | 0.7348 | 0.7525 | 0.726 |
| No log | 0.52 | 475 | 0.6080 | 0.729 | 0.7352 | 0.7507 | 0.729 |
| No log | 0.52 | 480 | 0.5939 | 0.741 | 0.7455 | 0.7539 | 0.741 |
| No log | 0.53 | 485 | 0.6038 | 0.739 | 0.7448 | 0.7560 | 0.739 |
| No log | 0.53 | 490 | 0.6240 | 0.734 | 0.7386 | 0.7566 | 0.734 |
| No log | 0.54 | 495 | 0.6442 | 0.724 | 0.7323 | 0.7560 | 0.724 |
| 0.7055 | 0.54 | 500 | 0.7067 | 0.71 | 0.7237 | 0.7583 | 0.71 |
| 0.7055 | 0.55 | 505 | 0.7353 | 0.704 | 0.7133 | 0.7484 | 0.704 |
| 0.7055 | 0.55 | 510 | 0.6534 | 0.733 | 0.7377 | 0.7475 | 0.733 |
| 0.7055 | 0.56 | 515 | 0.7046 | 0.729 | 0.7315 | 0.7533 | 0.729 |
| 0.7055 | 0.56 | 520 | 0.7140 | 0.711 | 0.7130 | 0.7487 | 0.711 |
| 0.7055 | 0.57 | 525 | 0.6423 | 0.716 | 0.7193 | 0.7443 | 0.716 |
| 0.7055 | 0.57 | 530 | 0.6074 | 0.733 | 0.7377 | 0.7481 | 0.733 |
| 0.7055 | 0.58 | 535 | 0.6066 | 0.735 | 0.7405 | 0.7513 | 0.735 |
| 0.7055 | 0.59 | 540 | 0.5945 | 0.732 | 0.7374 | 0.7486 | 0.732 |
| 0.7055 | 0.59 | 545 | 0.6231 | 0.705 | 0.7112 | 0.7439 | 0.705 |
| 0.7055 | 0.6 | 550 | 0.6108 | 0.737 | 0.7460 | 0.7660 | 0.737 |
| 0.7055 | 0.6 | 555 | 0.5846 | 0.754 | 0.7572 | 0.7675 | 0.754 |
| 0.7055 | 0.61 | 560 | 0.5965 | 0.748 | 0.7496 | 0.7640 | 0.748 |
| 0.7055 | 0.61 | 565 | 0.5849 | 0.753 | 0.7577 | 0.7687 | 0.753 |
| 0.7055 | 0.62 | 570 | 0.6037 | 0.723 | 0.7269 | 0.7514 | 0.723 |
| 0.7055 | 0.62 | 575 | 0.5773 | 0.742 | 0.7455 | 0.7598 | 0.742 |
| 0.7055 | 0.63 | 580 | 0.5661 | 0.751 | 0.7545 | 0.7607 | 0.751 |
| 0.7055 | 0.63 | 585 | 0.5717 | 0.752 | 0.7555 | 0.7626 | 0.752 |
| 0.7055 | 0.64 | 590 | 0.5905 | 0.762 | 0.7674 | 0.7808 | 0.762 |
| 0.7055 | 0.65 | 595 | 0.5876 | 0.759 | 0.7649 | 0.7773 | 0.759 |
| 0.7055 | 0.65 | 600 | 0.5651 | 0.77 | 0.7717 | 0.7741 | 0.77 |
| 0.7055 | 0.66 | 605 | 0.5791 | 0.748 | 0.7465 | 0.7502 | 0.748 |
| 0.7055 | 0.66 | 610 | 0.6135 | 0.721 | 0.7210 | 0.7434 | 0.721 |
| 0.7055 | 0.67 | 615 | 0.6268 | 0.723 | 0.7242 | 0.7523 | 0.723 |
| 0.7055 | 0.67 | 620 | 0.6211 | 0.71 | 0.7106 | 0.7449 | 0.71 |
| 0.7055 | 0.68 | 625 | 0.5829 | 0.757 | 0.7607 | 0.7742 | 0.757 |
| 0.7055 | 0.68 | 630 | 0.5718 | 0.765 | 0.7681 | 0.7744 | 0.765 |
| 0.7055 | 0.69 | 635 | 0.5685 | 0.775 | 0.7769 | 0.7830 | 0.775 |
| 0.7055 | 0.69 | 640 | 0.5731 | 0.752 | 0.7545 | 0.7653 | 0.752 |
| 0.7055 | 0.7 | 645 | 0.5903 | 0.733 | 0.7356 | 0.7570 | 0.733 |
| 0.7055 | 0.7 | 650 | 0.5973 | 0.73 | 0.7327 | 0.7575 | 0.73 |
| 0.7055 | 0.71 | 655 | 0.6056 | 0.72 | 0.7213 | 0.7535 | 0.72 |
| 0.7055 | 0.72 | 660 | 0.5617 | 0.763 | 0.7648 | 0.7703 | 0.763 |
| 0.7055 | 0.72 | 665 | 0.5781 | 0.761 | 0.7576 | 0.7688 | 0.761 |
| 0.7055 | 0.73 | 670 | 0.5993 | 0.745 | 0.7409 | 0.7650 | 0.745 |
| 0.7055 | 0.73 | 675 | 0.6027 | 0.746 | 0.7504 | 0.7675 | 0.746 |
| 0.7055 | 0.74 | 680 | 0.5825 | 0.751 | 0.7534 | 0.7600 | 0.751 |
| 0.7055 | 0.74 | 685 | 0.5742 | 0.745 | 0.7469 | 0.7513 | 0.745 |
| 0.7055 | 0.75 | 690 | 0.5907 | 0.731 | 0.7313 | 0.7462 | 0.731 |
| 0.7055 | 0.75 | 695 | 0.6017 | 0.734 | 0.7340 | 0.7555 | 0.734 |
| 0.7055 | 0.76 | 700 | 0.5767 | 0.746 | 0.7477 | 0.7599 | 0.746 |
| 0.7055 | 0.76 | 705 | 0.5859 | 0.747 | 0.7510 | 0.7676 | 0.747 |
| 0.7055 | 0.77 | 710 | 0.6001 | 0.747 | 0.7518 | 0.7690 | 0.747 |
| 0.7055 | 0.78 | 715 | 0.6427 | 0.719 | 0.7233 | 0.7541 | 0.719 |
| 0.7055 | 0.78 | 720 | 0.6600 | 0.72 | 0.7247 | 0.7556 | 0.72 |
| 0.7055 | 0.79 | 725 | 0.6365 | 0.744 | 0.7468 | 0.7640 | 0.744 |
| 0.7055 | 0.79 | 730 | 0.6089 | 0.754 | 0.7555 | 0.7596 | 0.754 |
| 0.7055 | 0.8 | 735 | 0.6050 | 0.749 | 0.7484 | 0.7494 | 0.749 |
| 0.7055 | 0.8 | 740 | 0.6120 | 0.745 | 0.7442 | 0.7518 | 0.745 |
| 0.7055 | 0.81 | 745 | 0.6205 | 0.736 | 0.7356 | 0.7490 | 0.736 |
| 0.7055 | 0.81 | 750 | 0.6174 | 0.737 | 0.7376 | 0.7544 | 0.737 |
| 0.7055 | 0.82 | 755 | 0.6222 | 0.733 | 0.7358 | 0.7585 | 0.733 |
| 0.7055 | 0.82 | 760 | 0.6216 | 0.737 | 0.7428 | 0.7636 | 0.737 |
| 0.7055 | 0.83 | 765 | 0.6138 | 0.749 | 0.7548 | 0.7691 | 0.749 |
| 0.7055 | 0.84 | 770 | 0.5977 | 0.76 | 0.7628 | 0.7682 | 0.76 |
| 0.7055 | 0.84 | 775 | 0.5930 | 0.762 | 0.7639 | 0.7671 | 0.762 |
| 0.7055 | 0.85 | 780 | 0.6002 | 0.762 | 0.7632 | 0.7682 | 0.762 |
| 0.7055 | 0.85 | 785 | 0.6029 | 0.76 | 0.7621 | 0.7676 | 0.76 |
| 0.7055 | 0.86 | 790 | 0.6068 | 0.751 | 0.7544 | 0.7615 | 0.751 |
| 0.7055 | 0.86 | 795 | 0.6188 | 0.746 | 0.7508 | 0.7615 | 0.746 |
| 0.7055 | 0.87 | 800 | 0.6398 | 0.725 | 0.7300 | 0.7486 | 0.725 |
| 0.7055 | 0.87 | 805 | 0.6555 | 0.717 | 0.7205 | 0.7461 | 0.717 |
| 0.7055 | 0.88 | 810 | 0.6550 | 0.726 | 0.7282 | 0.7578 | 0.726 |
| 0.7055 | 0.88 | 815 | 0.6376 | 0.726 | 0.7283 | 0.7474 | 0.726 |
| 0.7055 | 0.89 | 820 | 0.6115 | 0.741 | 0.7436 | 0.7524 | 0.741 |
| 0.7055 | 0.89 | 825 | 0.6048 | 0.756 | 0.7583 | 0.7638 | 0.756 |
| 0.7055 | 0.9 | 830 | 0.6039 | 0.753 | 0.7548 | 0.7591 | 0.753 |
| 0.7055 | 0.91 | 835 | 0.6018 | 0.754 | 0.7559 | 0.7605 | 0.754 |
| 0.7055 | 0.91 | 840 | 0.5967 | 0.757 | 0.7597 | 0.7653 | 0.757 |
| 0.7055 | 0.92 | 845 | 0.5937 | 0.766 | 0.7687 | 0.7738 | 0.766 |
| 0.7055 | 0.92 | 850 | 0.5945 | 0.766 | 0.7689 | 0.7740 | 0.766 |
| 0.7055 | 0.93 | 855 | 0.5951 | 0.764 | 0.7669 | 0.7722 | 0.764 |
| 0.7055 | 0.93 | 860 | 0.5953 | 0.761 | 0.7640 | 0.7699 | 0.761 |
| 0.7055 | 0.94 | 865 | 0.5977 | 0.762 | 0.7651 | 0.7726 | 0.762 |
| 0.7055 | 0.94 | 870 | 0.5969 | 0.763 | 0.7659 | 0.7733 | 0.763 |
| 0.7055 | 0.95 | 875 | 0.5957 | 0.764 | 0.7667 | 0.7740 | 0.764 |
| 0.7055 | 0.95 | 880 | 0.5927 | 0.762 | 0.7650 | 0.7717 | 0.762 |
| 0.7055 | 0.96 | 885 | 0.5916 | 0.763 | 0.7660 | 0.7715 | 0.763 |
| 0.7055 | 0.97 | 890 | 0.5935 | 0.762 | 0.7654 | 0.7717 | 0.762 |
| 0.7055 | 0.97 | 895 | 0.5934 | 0.759 | 0.7625 | 0.7689 | 0.759 |
| 0.7055 | 0.98 | 900 | 0.5919 | 0.763 | 0.7660 | 0.7715 | 0.763 |
| 0.7055 | 0.98 | 905 | 0.5913 | 0.762 | 0.7650 | 0.7705 | 0.762 |
| 0.7055 | 0.99 | 910 | 0.5916 | 0.764 | 0.7671 | 0.7726 | 0.764 |
| 0.7055 | 0.99 | 915 | 0.5916 | 0.762 | 0.7650 | 0.7706 | 0.762 |
| 0.7055 | 1.0 | 920 | 0.5913 | 0.762 | 0.7650 | 0.7706 | 0.762 |
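The table shows 920 optimization steps for one epoch. Under the linear scheduler listed above, the learning rate decays from 5e-05 toward 0 across those steps; a small sketch of that decay (assuming no warmup, which this card does not mention):

```python
# Linear LR decay: lr(step) = base_lr * (1 - step / total_steps).
# base_lr comes from the hyperparameters above; total_steps = 920 from the
# results table; zero warmup is an assumption.
base_lr = 5e-5
total_steps = 920

def lr_at(step):
    return base_lr * (1 - step / total_steps)

print(lr_at(0))    # full base rate at the start
print(lr_at(460))  # half the base rate midway through the epoch
```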

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
Model size: 125M params (Safetensors, F32)

Finetuned from: cardiffnlp/twitter-roberta-base-sentiment-latest

Dataset used to train matchten/text-message-analyzer-finetuned: daily_dialog