
my_awesome_wnut_JGTt

This model is a fine-tuned version of distilbert/distilbert-base-uncased on an unspecified dataset (the auto-generated card did not record the dataset name). It achieves the following results on the evaluation set:

  • Loss: 0.1104
  • Precision: 0.4118
  • Recall: 0.2917
  • F1: 0.3415
  • Accuracy: 0.9881
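
As a sanity check (not part of the original card), the reported F1 can be reproduced as the harmonic mean of the reported precision and recall:

```python
# Verify that the reported F1 is the harmonic mean of precision and recall.
precision = 0.4118
recall = 0.2917
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.3415, matching the reported F1
```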

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
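
A detail not stated explicitly in the card but derivable from the log: training ran 4600 optimizer steps over 100 epochs, i.e. 46 steps per epoch. With a train batch size of 16, that bounds the training-set size (a sketch; the exact count is not recorded in the card):

```python
# Infer steps per epoch and an approximate training-set size
# from the hyperparameters and the final step count in the log.
total_steps = 4600        # last "Step" value in the training results table
num_epochs = 100
train_batch_size = 16

steps_per_epoch = total_steps // num_epochs              # 46
max_examples = steps_per_epoch * train_batch_size        # full final batch
min_examples = (steps_per_epoch - 1) * train_batch_size + 1  # partial final batch
print(steps_per_epoch, min_examples, max_examples)  # 46 721 736
```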

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 46   | 0.0614          | 0.0       | 0.0    | 0.0    | 0.9884   |
| No log        | 2.0   | 92   | 0.0510          | 0.0       | 0.0    | 0.0    | 0.9884   |
| No log        | 3.0   | 138  | 0.0586          | 0.5       | 0.0417 | 0.0769 | 0.9884   |
| No log        | 4.0   | 184  | 0.0615          | 0.25      | 0.2083 | 0.2273 | 0.9863   |
| No log        | 5.0   | 230  | 0.0665          | 0.6       | 0.25   | 0.3529 | 0.9891   |
| No log        | 6.0   | 276  | 0.0631          | 0.4444    | 0.3333 | 0.3810 | 0.9884   |
| No log        | 7.0   | 322  | 0.0668          | 0.35      | 0.2917 | 0.3182 | 0.9872   |
| No log        | 8.0   | 368  | 0.0762          | 0.3846    | 0.2083 | 0.2703 | 0.9875   |
| No log        | 9.0   | 414  | 0.0875          | 0.4545    | 0.2083 | 0.2857 | 0.9881   |
| No log        | 10.0  | 460  | 0.0866          | 0.4545    | 0.2083 | 0.2857 | 0.9884   |
| 0.0196        | 11.0  | 506  | 0.0866          | 0.375     | 0.25   | 0.3    | 0.9872   |
| 0.0196        | 12.0  | 552  | 0.0899          | 0.4       | 0.25   | 0.3077 | 0.9875   |
| 0.0196        | 13.0  | 598  | 0.0907          | 0.375     | 0.25   | 0.3    | 0.9872   |
| 0.0196        | 14.0  | 644  | 0.0933          | 0.4286    | 0.25   | 0.3158 | 0.9881   |
| 0.0196        | 15.0  | 690  | 0.0952          | 0.4286    | 0.25   | 0.3158 | 0.9881   |
| 0.0196        | 16.0  | 736  | 0.0972          | 0.4286    | 0.25   | 0.3158 | 0.9881   |
| 0.0196        | 17.0  | 782  | 0.0989          | 0.4286    | 0.25   | 0.3158 | 0.9881   |
| 0.0196        | 18.0  | 828  | 0.1037          | 0.4545    | 0.2083 | 0.2857 | 0.9884   |
| 0.0196        | 19.0  | 874  | 0.0998          | 0.4118    | 0.2917 | 0.3415 | 0.9875   |
| 0.0196        | 20.0  | 920  | 0.1026          | 0.5556    | 0.2083 | 0.3030 | 0.9887   |
| 0.0196        | 21.0  | 966  | 0.1064          | 0.5       | 0.125  | 0.2    | 0.9887   |
| 0.0004        | 22.0  | 1012 | 0.0950          | 0.4       | 0.25   | 0.3077 | 0.9881   |
| 0.0004        | 23.0  | 1058 | 0.0916          | 0.5714    | 0.3333 | 0.4211 | 0.9897   |
| 0.0004        | 24.0  | 1104 | 0.0944          | 0.5455    | 0.25   | 0.3429 | 0.9894   |
| 0.0004        | 25.0  | 1150 | 0.0964          | 0.5       | 0.25   | 0.3333 | 0.9891   |
| 0.0004        | 26.0  | 1196 | 0.1129          | 0.4286    | 0.125  | 0.1935 | 0.9884   |
| 0.0004        | 27.0  | 1242 | 0.1003          | 0.375     | 0.25   | 0.3    | 0.9878   |
| 0.0004        | 28.0  | 1288 | 0.1017          | 0.3333    | 0.2083 | 0.2564 | 0.9875   |
| 0.0004        | 29.0  | 1334 | 0.1048          | 0.3125    | 0.2083 | 0.25   | 0.9875   |
| 0.0004        | 30.0  | 1380 | 0.1030          | 0.3       | 0.25   | 0.2727 | 0.9863   |
| 0.0004        | 31.0  | 1426 | 0.1038          | 0.35      | 0.2917 | 0.3182 | 0.9872   |
| 0.0004        | 32.0  | 1472 | 0.1055          | 0.4286    | 0.25   | 0.3158 | 0.9881   |
| 0.0003        | 33.0  | 1518 | 0.1068          | 0.4615    | 0.25   | 0.3243 | 0.9887   |
| 0.0003        | 34.0  | 1564 | 0.1084          | 0.3529    | 0.25   | 0.2927 | 0.9878   |
| 0.0003        | 35.0  | 1610 | 0.1088          | 0.3846    | 0.2083 | 0.2703 | 0.9881   |
| 0.0003        | 36.0  | 1656 | 0.1182          | 0.5714    | 0.1667 | 0.2581 | 0.9891   |
| 0.0003        | 37.0  | 1702 | 0.1145          | 0.4545    | 0.2083 | 0.2857 | 0.9881   |
| 0.0003        | 38.0  | 1748 | 0.1156          | 0.4545    | 0.2083 | 0.2857 | 0.9881   |
| 0.0003        | 39.0  | 1794 | 0.1150          | 0.4167    | 0.2083 | 0.2778 | 0.9878   |
| 0.0003        | 40.0  | 1840 | 0.1161          | 0.4167    | 0.2083 | 0.2778 | 0.9878   |
| 0.0003        | 41.0  | 1886 | 0.1118          | 0.4615    | 0.25   | 0.3243 | 0.9881   |
| 0.0003        | 42.0  | 1932 | 0.1119          | 0.4545    | 0.2083 | 0.2857 | 0.9881   |
| 0.0003        | 43.0  | 1978 | 0.1113          | 0.5       | 0.25   | 0.3333 | 0.9884   |
| 0.0002        | 44.0  | 2024 | 0.1182          | 0.6       | 0.125  | 0.2069 | 0.9891   |
| 0.0002        | 45.0  | 2070 | 0.0972          | 0.3889    | 0.2917 | 0.3333 | 0.9887   |
| 0.0002        | 46.0  | 2116 | 0.1022          | 0.3       | 0.375  | 0.3333 | 0.9857   |
| 0.0002        | 47.0  | 2162 | 0.0924          | 0.4091    | 0.375  | 0.3913 | 0.9881   |
| 0.0002        | 48.0  | 2208 | 0.1000          | 0.5       | 0.25   | 0.3333 | 0.9884   |
| 0.0002        | 49.0  | 2254 | 0.0971          | 0.4286    | 0.375  | 0.4000 | 0.9881   |
| 0.0002        | 50.0  | 2300 | 0.0975          | 0.3529    | 0.25   | 0.2927 | 0.9875   |
| 0.0002        | 51.0  | 2346 | 0.1049          | 0.4       | 0.1667 | 0.2353 | 0.9881   |
| 0.0002        | 52.0  | 2392 | 0.1052          | 0.3636    | 0.1667 | 0.2286 | 0.9875   |
| 0.0002        | 53.0  | 2438 | 0.1054          | 0.3571    | 0.2083 | 0.2632 | 0.9872   |
| 0.0002        | 54.0  | 2484 | 0.1069          | 0.3636    | 0.1667 | 0.2286 | 0.9875   |
| 0.0007        | 55.0  | 2530 | 0.1080          | 0.3636    | 0.1667 | 0.2286 | 0.9875   |
| 0.0007        | 56.0  | 2576 | 0.1184          | 0.5714    | 0.1667 | 0.2581 | 0.9891   |
| 0.0007        | 57.0  | 2622 | 0.1029          | 0.4545    | 0.4167 | 0.4348 | 0.9891   |
| 0.0007        | 58.0  | 2668 | 0.1033          | 0.4286    | 0.375  | 0.4000 | 0.9887   |
| 0.0007        | 59.0  | 2714 | 0.1039          | 0.4       | 0.3333 | 0.3636 | 0.9884   |
| 0.0007        | 60.0  | 2760 | 0.1047          | 0.3889    | 0.2917 | 0.3333 | 0.9884   |
| 0.0007        | 61.0  | 2806 | 0.1041          | 0.4167    | 0.4167 | 0.4167 | 0.9891   |
| 0.0007        | 62.0  | 2852 | 0.1045          | 0.4286    | 0.375  | 0.4000 | 0.9887   |
| 0.0007        | 63.0  | 2898 | 0.1046          | 0.4286    | 0.375  | 0.4000 | 0.9887   |
| 0.0007        | 64.0  | 2944 | 0.1047          | 0.4211    | 0.3333 | 0.3721 | 0.9887   |
| 0.0007        | 65.0  | 2990 | 0.1078          | 0.4348    | 0.4167 | 0.4255 | 0.9887   |
| 0.0002        | 66.0  | 3036 | 0.1054          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 67.0  | 3082 | 0.1054          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 68.0  | 3128 | 0.1057          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 69.0  | 3174 | 0.1060          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 70.0  | 3220 | 0.1062          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 71.0  | 3266 | 0.1066          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 72.0  | 3312 | 0.1068          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 73.0  | 3358 | 0.1071          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 74.0  | 3404 | 0.1071          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 75.0  | 3450 | 0.1076          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0002        | 76.0  | 3496 | 0.1077          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 77.0  | 3542 | 0.1077          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 78.0  | 3588 | 0.1076          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 79.0  | 3634 | 0.1079          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 80.0  | 3680 | 0.1082          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 81.0  | 3726 | 0.1084          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 82.0  | 3772 | 0.1085          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 83.0  | 3818 | 0.1090          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 84.0  | 3864 | 0.1091          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 85.0  | 3910 | 0.1091          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 86.0  | 3956 | 0.1094          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 87.0  | 4002 | 0.1094          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 88.0  | 4048 | 0.1095          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 89.0  | 4094 | 0.1097          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 90.0  | 4140 | 0.1098          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 91.0  | 4186 | 0.1099          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 92.0  | 4232 | 0.1100          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 93.0  | 4278 | 0.1101          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 94.0  | 4324 | 0.1102          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 95.0  | 4370 | 0.1102          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 96.0  | 4416 | 0.1103          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 97.0  | 4462 | 0.1103          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 98.0  | 4508 | 0.1103          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 99.0  | 4554 | 0.1104          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
| 0.0001        | 100.0 | 4600 | 0.1104          | 0.4118    | 0.2917 | 0.3415 | 0.9881   |
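
One observation worth noting (not part of the original card): the final checkpoint is not the best one in the log. A quick scan over selected (epoch, F1) pairs copied from the table above shows validation F1 peaked at epoch 57 (0.4348), well above the final 0.3415, suggesting early stopping or checkpoint selection by F1 would likely have helped:

```python
# Selected (epoch, validation F1) pairs copied from the training results table.
results = {
    23: 0.4211,
    47: 0.3913,
    49: 0.4000,
    57: 0.4348,
    61: 0.4167,
    65: 0.4255,
    100: 0.3415,  # final checkpoint
}
best_epoch = max(results, key=results.get)
print(best_epoch, results[best_epoch])  # 57 0.4348
```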

Framework versions

  • Transformers 4.39.1
  • Pytorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model details

  • Format: Safetensors
  • Model size: 66.4M params
  • Tensor type: F32

Model tree for gonzalezrostani/my_awesome_wnut_JGTt

  • Fine-tuned from distilbert/distilbert-base-uncased