
my_awesome_wnut_Place

This model is a fine-tuned version of distilbert/distilbert-base-uncased; the training dataset is not specified in the card metadata. It achieves the following results on the evaluation set:

  • Loss: 0.0871
  • Precision: 0.7000
  • Recall: 0.8077
  • F1: 0.7500
  • Accuracy: 0.9909

Model description

More information needed

Intended uses & limitations

More information needed
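
The card does not state intended uses, but the model name and the entity-level precision/recall/F1 metrics suggest a token-classification (NER) checkpoint. Below is a minimal sketch of running inference with the Transformers pipeline; the repo id gonzalezrostani/my_awesome_wnut_Place comes from the model page, while the example sentence and aggregation strategy are illustrative assumptions.

```python
from transformers import pipeline

# Sketch only: assumes this checkpoint is a token-classification (NER) model.
ner = pipeline(
    "token-classification",
    model="gonzalezrostani/my_awesome_wnut_Place",
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

print(ner("The workshop will be held in Buenos Aires next month."))
```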

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
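
As a rough illustration, the sketch below shows how the values above map onto transformers.TrainingArguments. The output_dir and the evaluation/save strategies are assumptions (the card does not state them); everything else mirrors the list.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="my_awesome_wnut_Place",  # assumed; matches the card title
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=100,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed; the results table reports metrics per epoch
    save_strategy="epoch",        # assumed
)
```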

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy
No log 1.0 46 0.0618 0.0 0.0 0.0 0.9748
No log 2.0 92 0.0324 0.6129 0.7308 0.6667 0.9884
No log 3.0 138 0.0326 0.6452 0.7692 0.7018 0.9909
No log 4.0 184 0.0364 0.6825 0.8269 0.7478 0.9906
No log 5.0 230 0.0409 0.7455 0.7885 0.7664 0.9918
No log 6.0 276 0.0462 0.7213 0.8462 0.7788 0.9924
No log 7.0 322 0.0486 0.7368 0.8077 0.7706 0.9921
No log 8.0 368 0.0547 0.7143 0.7692 0.7407 0.9912
No log 9.0 414 0.0532 0.65 0.75 0.6964 0.9894
No log 10.0 460 0.0501 0.6667 0.7692 0.7143 0.9912
0.0249 11.0 506 0.0571 0.7018 0.7692 0.7339 0.9909
0.0249 12.0 552 0.0628 0.6667 0.8077 0.7304 0.9894
0.0249 13.0 598 0.0693 0.6418 0.8269 0.7227 0.9887
0.0249 14.0 644 0.0589 0.6833 0.7885 0.7321 0.9912
0.0249 15.0 690 0.0569 0.7321 0.7885 0.7593 0.9915
0.0249 16.0 736 0.0599 0.6885 0.8077 0.7434 0.9918
0.0249 17.0 782 0.0625 0.6885 0.8077 0.7434 0.9912
0.0249 18.0 828 0.0620 0.7193 0.7885 0.7523 0.9918
0.0249 19.0 874 0.0621 0.6935 0.8269 0.7544 0.9906
0.0249 20.0 920 0.0671 0.6364 0.8077 0.7119 0.9894
0.0249 21.0 966 0.0657 0.6719 0.8269 0.7414 0.9903
0.0006 22.0 1012 0.0685 0.6949 0.7885 0.7387 0.9903
0.0006 23.0 1058 0.0696 0.6833 0.7885 0.7321 0.9900
0.0006 24.0 1104 0.0709 0.6833 0.7885 0.7321 0.9900
0.0006 25.0 1150 0.0719 0.6949 0.7885 0.7387 0.9903
0.0006 26.0 1196 0.0736 0.7069 0.7885 0.7455 0.9909
0.0006 27.0 1242 0.0733 0.6780 0.7692 0.7207 0.9897
0.0006 28.0 1288 0.0743 0.6949 0.7885 0.7387 0.9903
0.0006 29.0 1334 0.0784 0.7143 0.7692 0.7407 0.9906
0.0006 30.0 1380 0.0766 0.7069 0.7885 0.7455 0.9906
0.0006 31.0 1426 0.0753 0.6949 0.7885 0.7387 0.9909
0.0006 32.0 1472 0.0809 0.6462 0.8077 0.7179 0.9891
0.0002 33.0 1518 0.0690 0.6613 0.7885 0.7193 0.9906
0.0002 34.0 1564 0.0674 0.7069 0.7885 0.7455 0.9912
0.0002 35.0 1610 0.0692 0.6949 0.7885 0.7387 0.9909
0.0002 36.0 1656 0.0712 0.6949 0.7885 0.7387 0.9909
0.0002 37.0 1702 0.0743 0.6721 0.7885 0.7257 0.9909
0.0002 38.0 1748 0.0746 0.6833 0.7885 0.7321 0.9912
0.0002 39.0 1794 0.0748 0.6833 0.7885 0.7321 0.9912
0.0002 40.0 1840 0.0750 0.6833 0.7885 0.7321 0.9912
0.0002 41.0 1886 0.0760 0.6833 0.7885 0.7321 0.9912
0.0002 42.0 1932 0.0765 0.6833 0.7885 0.7321 0.9912
0.0002 43.0 1978 0.0774 0.6833 0.7885 0.7321 0.9912
0.0001 44.0 2024 0.0778 0.6833 0.7885 0.7321 0.9912
0.0001 45.0 2070 0.0830 0.6308 0.7885 0.7009 0.9900
0.0001 46.0 2116 0.0709 0.7222 0.75 0.7358 0.9912
0.0001 47.0 2162 0.0711 0.6949 0.7885 0.7387 0.9915
0.0001 48.0 2208 0.0779 0.7091 0.75 0.7290 0.9909
0.0001 49.0 2254 0.0792 0.6667 0.8077 0.7304 0.9909
0.0001 50.0 2300 0.0766 0.6842 0.75 0.7156 0.9909
0.0001 51.0 2346 0.0773 0.6897 0.7692 0.7273 0.9906
0.0001 52.0 2392 0.0766 0.7018 0.7692 0.7339 0.9909
0.0001 53.0 2438 0.0803 0.6508 0.7885 0.7130 0.9906
0.0001 54.0 2484 0.0783 0.6462 0.8077 0.7179 0.9903
0.0003 55.0 2530 0.0849 0.6364 0.8077 0.7119 0.9900
0.0003 56.0 2576 0.0843 0.6406 0.7885 0.7069 0.9900
0.0003 57.0 2622 0.0833 0.6613 0.7885 0.7193 0.9906
0.0003 58.0 2668 0.0837 0.6613 0.7885 0.7193 0.9906
0.0003 59.0 2714 0.0842 0.6613 0.7885 0.7193 0.9906
0.0003 60.0 2760 0.0846 0.6613 0.7885 0.7193 0.9906
0.0003 61.0 2806 0.0855 0.6508 0.7885 0.7130 0.9903
0.0003 62.0 2852 0.0866 0.6406 0.7885 0.7069 0.9900
0.0003 63.0 2898 0.0868 0.6406 0.7885 0.7069 0.9900
0.0003 64.0 2944 0.0870 0.6406 0.7885 0.7069 0.9900
0.0003 65.0 2990 0.0872 0.6406 0.7885 0.7069 0.9900
0.0 66.0 3036 0.0879 0.6406 0.7885 0.7069 0.9900
0.0 67.0 3082 0.0881 0.6406 0.7885 0.7069 0.9900
0.0 68.0 3128 0.0884 0.6406 0.7885 0.7069 0.9900
0.0 69.0 3174 0.0885 0.6508 0.7885 0.7130 0.9903
0.0 70.0 3220 0.0887 0.6508 0.7885 0.7130 0.9903
0.0 71.0 3266 0.0889 0.6508 0.7885 0.7130 0.9903
0.0 72.0 3312 0.0890 0.6508 0.7885 0.7130 0.9903
0.0 73.0 3358 0.0892 0.6508 0.7885 0.7130 0.9903
0.0 74.0 3404 0.0893 0.6508 0.7885 0.7130 0.9903
0.0 75.0 3450 0.0894 0.6508 0.7885 0.7130 0.9903
0.0 76.0 3496 0.0896 0.6508 0.7885 0.7130 0.9903
0.0 77.0 3542 0.0897 0.6508 0.7885 0.7130 0.9903
0.0 78.0 3588 0.0898 0.6508 0.7885 0.7130 0.9903
0.0 79.0 3634 0.0899 0.6508 0.7885 0.7130 0.9903
0.0 80.0 3680 0.0900 0.6508 0.7885 0.7130 0.9903
0.0 81.0 3726 0.0902 0.6508 0.7885 0.7130 0.9903
0.0 82.0 3772 0.0895 0.6508 0.7885 0.7130 0.9903
0.0 83.0 3818 0.0881 0.6613 0.7885 0.7193 0.9906
0.0 84.0 3864 0.0861 0.6613 0.7885 0.7193 0.9906
0.0 85.0 3910 0.0862 0.6613 0.7885 0.7193 0.9906
0.0 86.0 3956 0.0865 0.6613 0.7885 0.7193 0.9906
0.0001 87.0 4002 0.0862 0.6721 0.7885 0.7257 0.9909
0.0001 88.0 4048 0.0863 0.7 0.8077 0.75 0.9909
0.0001 89.0 4094 0.0864 0.7 0.8077 0.75 0.9909
0.0001 90.0 4140 0.0865 0.7 0.8077 0.75 0.9909
0.0001 91.0 4186 0.0866 0.7 0.8077 0.75 0.9909
0.0001 92.0 4232 0.0867 0.7 0.8077 0.75 0.9909
0.0001 93.0 4278 0.0868 0.7 0.8077 0.75 0.9909
0.0001 94.0 4324 0.0869 0.7 0.8077 0.75 0.9909
0.0001 95.0 4370 0.0870 0.7 0.8077 0.75 0.9909
0.0001 96.0 4416 0.0870 0.7 0.8077 0.75 0.9909
0.0001 97.0 4462 0.0871 0.7 0.8077 0.75 0.9909
0.0 98.0 4508 0.0871 0.7 0.8077 0.75 0.9909
0.0 99.0 4554 0.0871 0.7 0.8077 0.75 0.9909
0.0 100.0 4600 0.0871 0.7 0.8077 0.75 0.9909
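
The precision, recall, F1, and accuracy columns above are entity-level metrics of the kind typically produced with the seqeval scorer. A minimal sketch of that computation is shown below; the label sequences are toy examples, not outputs from this run.

```python
import evaluate

# Sketch only: entity-level precision/recall/F1 and token accuracy
# computed with seqeval on made-up prediction/reference sequences.
seqeval = evaluate.load("seqeval")

predictions = [["O", "B-location", "I-location", "O"]]
references = [["O", "B-location", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```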

Framework versions

  • Transformers 4.39.1
  • PyTorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2