bert-base-cased-airlines-news-multi-label

This model is a fine-tuned version of bert-base-cased for multi-label classification of airline news; the training dataset is not further specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.3009
  • F1: 0.8533
  • Jaccard: 0.4071
  • Precisions: 0.8126
  • Recalls: 0.8999

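The card ships no example code, so the snippet below is only a minimal usage sketch: it assumes the checkpoint was trained as a multi-label classifier with sigmoid outputs and that the label names stored in its config are meaningful. The 0.5 decision threshold is an illustrative choice, not something stated in the card.

```python
from transformers import pipeline

# Hedged usage sketch (assumption: multi-label head with sigmoid scores and
# human-readable label names in the model config).
classifier = pipeline(
    "text-classification",
    model="dahe827/bert-base-cased-airlines-news-multi-label",
    top_k=None,  # return a score for every label instead of only the best one
)

scores = classifier("Airline X announces new routes and a fleet expansion.")[0]

# Keep every label whose score clears an assumed 0.5 threshold.
predicted = [item["label"] for item in scores if item["score"] > 0.5]
print(predicted)
```
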
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

  • learning_rate: 8e-05
  • train_batch_size: 24
  • eval_batch_size: 24
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 150
  • num_epochs: 100
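
The hyperparameters above map directly onto Hugging Face `TrainingArguments`. The sketch below shows one plausible way the run could have been configured; the original training script is not part of this card, so the output directory and per-epoch evaluation are assumptions.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the reported hyperparameters; not the
# author's actual training script.
training_args = TrainingArguments(
    output_dir="bert-base-cased-airlines-news-multi-label",  # assumed
    learning_rate=8e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=150,
    num_train_epochs=100,
    eval_strategy="epoch",  # the results table reports metrics once per epoch
)
```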

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Jaccard | Precisions | Recalls |
|:-------------:|:-----:|:----:|:---------------:|:--:|:-------:|:----------:|:-------:|
| No log | 1.0 | 76 | 0.5236 | 0.7888 | 0.1283 | 0.8216 | 0.7804 |
| No log | 2.0 | 152 | 0.3180 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 3.0 | 228 | 0.3117 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 4.0 | 304 | 0.3106 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 5.0 | 380 | 0.3110 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 6.0 | 456 | 0.3095 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 7.0 | 532 | 0.3096 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 8.0 | 608 | 0.3089 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 9.0 | 684 | 0.3094 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 10.0 | 760 | 0.3092 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 11.0 | 836 | 0.3088 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 12.0 | 912 | 0.3082 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 13.0 | 988 | 0.3086 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 14.0 | 1064 | 0.3089 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 15.0 | 1140 | 0.3088 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 16.0 | 1216 | 0.3081 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 17.0 | 1292 | 0.3076 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 18.0 | 1368 | 0.3079 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 19.0 | 1444 | 0.3066 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 20.0 | 1520 | 0.3081 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 21.0 | 1596 | 0.3079 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 22.0 | 1672 | 0.3074 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 23.0 | 1748 | 0.3069 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 24.0 | 1824 | 0.3074 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 25.0 | 1900 | 0.3061 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 26.0 | 1976 | 0.3060 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 27.0 | 2052 | 0.3060 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 28.0 | 2128 | 0.3059 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 29.0 | 2204 | 0.3057 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 30.0 | 2280 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 31.0 | 2356 | 0.3061 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 32.0 | 2432 | 0.3062 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 33.0 | 2508 | 0.3055 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 34.0 | 2584 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 35.0 | 2660 | 0.3051 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 36.0 | 2736 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 37.0 | 2812 | 0.3047 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 38.0 | 2888 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 39.0 | 2964 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 40.0 | 3040 | 0.3044 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 41.0 | 3116 | 0.3043 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 42.0 | 3192 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 43.0 | 3268 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 44.0 | 3344 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 45.0 | 3420 | 0.3039 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 46.0 | 3496 | 0.3038 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 47.0 | 3572 | 0.3041 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 48.0 | 3648 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 49.0 | 3724 | 0.3035 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 50.0 | 3800 | 0.3036 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 51.0 | 3876 | 0.3031 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 52.0 | 3952 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 53.0 | 4028 | 0.3030 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 54.0 | 4104 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 55.0 | 4180 | 0.3033 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 56.0 | 4256 | 0.3027 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 57.0 | 4332 | 0.3026 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 58.0 | 4408 | 0.3026 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 59.0 | 4484 | 0.3023 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 60.0 | 4560 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 61.0 | 4636 | 0.3024 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 62.0 | 4712 | 0.3022 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 63.0 | 4788 | 0.3024 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 64.0 | 4864 | 0.3025 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 65.0 | 4940 | 0.3023 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 66.0 | 5016 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 67.0 | 5092 | 0.3020 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 68.0 | 5168 | 0.3017 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 69.0 | 5244 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 70.0 | 5320 | 0.3020 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 71.0 | 5396 | 0.3018 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 72.0 | 5472 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 73.0 | 5548 | 0.3017 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 74.0 | 5624 | 0.3016 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 75.0 | 5700 | 0.3015 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 76.0 | 5776 | 0.3015 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 77.0 | 5852 | 0.3016 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 78.0 | 5928 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 79.0 | 6004 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 80.0 | 6080 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 81.0 | 6156 | 0.3013 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 82.0 | 6232 | 0.3013 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 83.0 | 6308 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 84.0 | 6384 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 85.0 | 6460 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 86.0 | 6536 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 87.0 | 6612 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 88.0 | 6688 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 89.0 | 6764 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 90.0 | 6840 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 91.0 | 6916 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 92.0 | 6992 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 93.0 | 7068 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 94.0 | 7144 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 95.0 | 7220 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 96.0 | 7296 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 97.0 | 7372 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 98.0 | 7448 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.306 | 99.0 | 7524 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.306 | 100.0 | 7600 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
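
The F1, Jaccard, Precisions and Recalls columns are multi-label metrics. The card does not state the exact averaging mode or decision threshold used, so the snippet below is only a sketch of how such values are commonly computed from per-label logits with scikit-learn, assuming sigmoid activation, a 0.5 threshold, micro averaging for precision/recall/F1, and sample-wise averaging for Jaccard.

```python
import numpy as np
from sklearn.metrics import f1_score, jaccard_score, precision_score, recall_score

def multi_label_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5):
    """Illustrative metric computation; averaging modes and threshold are assumptions."""
    probs = 1.0 / (1.0 + np.exp(-logits))        # sigmoid over per-label logits
    preds = (probs >= threshold).astype(int)      # independent decision per label
    return {
        "f1": f1_score(labels, preds, average="micro"),
        "jaccard": jaccard_score(labels, preds, average="samples"),
        "precision": precision_score(labels, preds, average="micro"),
        "recall": recall_score(labels, preds, average="micro"),
    }
```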

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.19.1
