---
license: mit
---

roberta-large-movies

This model is a fine-tuned version of roberta-large on the movie genre prediction competition dataset (https://huggingface.co/spaces/competitions/movie-genre-prediction). The fine-tuning uses a masked language modeling (MLM) objective, so this checkpoint is a domain transfer of RoBERTa to movie text; it still needs to be fine-tuned on the genre labels before it can be used for classification.

It achieves the following results on the evaluation set:

  • Loss: 1.3261
  • Accuracy: 0.7375
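
A quick way to sanity-check the domain-adapted MLM head is the fill-mask pipeline. The snippet below is a minimal sketch; the repo id your-username/roberta-large-movies is a placeholder for wherever this checkpoint is hosted.

```python
from transformers import pipeline

# Placeholder repo id -- replace with the actual namespace of this checkpoint.
fill_mask = pipeline("fill-mask", model="your-username/roberta-large-movies")

# RoBERTa uses <mask> as its mask token.
print(fill_mask("A gritty <mask> about a detective hunting a serial killer."))
```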

Model description

roberta-large
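
Since this checkpoint still has to be fine-tuned on labels (see above), a typical next step is to load it with a sequence classification head. The following is a hedged sketch: the repo id and the number of genre labels (10) are assumptions and should be adjusted to the actual competition label set.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repo id and label count; adjust to the real checkpoint and genre set.
checkpoint = "your-username/roberta-large-movies"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=10)

# The MLM head is discarded and a freshly initialised classification head is added;
# this model can then be trained on the labelled genre data with a standard Trainer setup.
```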

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30.0
  • mixed_precision_training: Native AMP
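
As a rough guide, the sketch below shows how these hyperparameters could be expressed with transformers' TrainingArguments; the output_dir value and the fp16 flag (for Native AMP) are assumptions, and the MLM data collator / Trainer wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-large-movies-mlm",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30.0,
    fp16=True,  # Native AMP mixed precision
)
```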

Training results

Training Loss Epoch Step Validation Loss Accuracy
1.7698 0.18 500 1.6168 0.6738
1.7761 0.36 1000 1.6522 0.6830
1.7626 0.54 1500 1.6534 0.6660
1.7602 0.72 2000 1.6576 0.6787
1.7587 0.89 2500 1.6266 0.6773
1.7047 1.07 3000 1.6060 0.6852
1.6782 1.25 3500 1.5990 0.6906
1.6733 1.43 4000 1.5377 0.6967
1.6664 1.61 4500 1.6435 0.6747
1.6719 1.79 5000 1.4839 0.6907
1.6502 1.97 5500 1.5351 0.6897
1.6233 2.15 6000 1.6818 0.6763
1.6127 2.32 6500 1.5865 0.6853
1.6274 2.5 7000 1.5004 0.7004
1.601 2.68 7500 1.4522 0.6930
1.6123 2.86 8000 1.5371 0.6894
1.6074 3.04 8500 1.5342 0.6952
1.563 3.22 9000 1.5682 0.6876
1.5746 3.4 9500 1.5705 0.6958
1.5539 3.58 10000 1.4711 0.7041
1.578 3.75 10500 1.5466 0.6889
1.5492 3.93 11000 1.4629 0.6969
1.5291 4.11 11500 1.4265 0.7200
1.5079 4.29 12000 1.5053 0.6966
1.5283 4.47 12500 1.5257 0.6903
1.5141 4.65 13000 1.5063 0.6950
1.4979 4.83 13500 1.5636 0.6956
1.5294 5.01 14000 1.5878 0.6835
1.4641 5.18 14500 1.5575 0.6962
1.4754 5.36 15000 1.4779 0.7007
1.4696 5.54 15500 1.4520 0.6965
1.4655 5.72 16000 1.6320 0.6830
1.4792 5.9 16500 1.4152 0.7134
1.4379 6.08 17000 1.4900 0.7042
1.4281 6.26 17500 1.5407 0.6990
1.436 6.44 18000 1.5343 0.6914
1.4342 6.61 18500 1.5324 0.7024
1.4176 6.79 19000 1.4486 0.7133
1.4308 6.97 19500 1.4598 0.7032
1.4014 7.15 20000 1.5750 0.6938
1.3661 7.33 20500 1.5404 0.6985
1.3857 7.51 21000 1.4692 0.7037
1.3846 7.69 21500 1.5511 0.6941
1.3867 7.87 22000 1.5321 0.6925
1.3658 8.04 22500 1.5500 0.7021
1.3406 8.22 23000 1.5239 0.6960
1.3405 8.4 23500 1.4414 0.7055
1.3373 8.58 24000 1.5994 0.6784
1.3527 8.76 24500 1.5106 0.6970
1.3436 8.94 25000 1.4714 0.7080
1.3069 9.12 25500 1.4990 0.6953
1.2969 9.3 26000 1.4810 0.6964
1.3009 9.47 26500 1.5965 0.6876
1.3227 9.65 27000 1.4296 0.7014
1.3259 9.83 27500 1.4137 0.7189
1.3131 10.01 28000 1.5342 0.7020
1.271 10.19 28500 1.4708 0.7113
1.2684 10.37 29000 1.4342 0.7046
1.2767 10.55 29500 1.4703 0.7094
1.2861 10.73 30000 1.3323 0.7309
1.2617 10.9 30500 1.4562 0.7003
1.2551 11.08 31000 1.4361 0.7170
1.2404 11.26 31500 1.4537 0.7035
1.2562 11.44 32000 1.4039 0.7132
1.2489 11.62 32500 1.4372 0.7064
1.2406 11.8 33000 1.4926 0.7087
1.2285 11.98 33500 1.4080 0.7152
1.2213 12.16 34000 1.4031 0.7170
1.1998 12.33 34500 1.3541 0.7223
1.2184 12.51 35000 1.3630 0.7308
1.2195 12.69 35500 1.3125 0.7281
1.2178 12.87 36000 1.4257 0.7119
1.1918 13.05 36500 1.4108 0.7153
1.1664 13.23 37000 1.3577 0.7227
1.1754 13.41 37500 1.3777 0.7206
1.1855 13.59 38000 1.3501 0.7354
1.1644 13.76 38500 1.3747 0.7207
1.1709 13.94 39000 1.3704 0.7184
1.1613 14.12 39500 1.4307 0.7247
1.1443 14.3 40000 1.3190 0.7221
1.1356 14.48 40500 1.3288 0.7331
1.1493 14.66 41000 1.3505 0.7240
1.1417 14.84 41500 1.3146 0.7320
1.1349 15.02 42000 1.3546 0.7333
1.1169 15.19 42500 1.3709 0.7247
1.1187 15.37 43000 1.4243 0.7218
1.118 15.55 43500 1.3835 0.7264
1.1165 15.73 44000 1.3240 0.7254
1.114 15.91 44500 1.3264 0.7382
1.105 16.09 45000 1.3214 0.7334
1.0924 16.27 45500 1.3847 0.7282
1.0915 16.45 46000 1.3604 0.7317
1.0968 16.62 46500 1.3540 0.7319
1.0772 16.8 47000 1.2475 0.7306
1.0975 16.98 47500 1.2636 0.7448
1.0708 17.16 48000 1.4056 0.7182
1.0654 17.34 48500 1.3769 0.7276
1.0676 17.52 49000 1.3357 0.7224
1.0507 17.7 49500 1.4088 0.7124
1.0424 17.88 50000 1.3146 0.7315
1.0524 18.06 50500 1.2896 0.7393
1.0349 18.23 51000 1.3987 0.7192
1.0217 18.41 51500 1.2938 0.7381
1.0238 18.59 52000 1.2962 0.7387
1.0292 18.77 52500 1.3195 0.7371
1.0426 18.95 53000 1.2835 0.7412
1.0196 19.13 53500 1.2346 0.7473
1.012 19.31 54000 1.3666 0.7338
1.0256 19.49 54500 1.3140 0.7365
0.9824 19.66 55000 1.2764 0.7416
1.0048 19.84 55500 1.2514 0.7488
0.9947 20.02 56000 1.3351 0.7432
0.977 20.2 56500 1.2854 0.7451
0.9862 20.38 57000 1.3666 0.7285
0.9699 20.56 57500 1.3123 0.7348
0.977 20.74 58000 1.3426 0.7255
0.9749 20.92 58500 1.3763 0.7297
0.9505 21.09 59000 1.2372 0.7434
0.9438 21.27 59500 1.4334 0.7159
0.944 21.45 60000 1.2690 0.7508
0.9427 21.63 60500 1.2186 0.7486
0.9553 21.81 61000 1.3941 0.7269
0.9571 21.99 61500 1.4163 0.7274
0.932 22.17 62000 1.2717 0.7523
0.9166 22.35 62500 1.2177 0.7396
0.9301 22.52 63000 1.3264 0.7378
0.9351 22.7 63500 1.2570 0.7520
0.9211 22.88 64000 1.2639 0.7500
0.9211 23.06 64500 1.2377 0.7606
0.9196 23.24 65000 1.2739 0.7485
0.9062 23.42 65500 1.3263 0.7365
0.8965 23.6 66000 1.2814 0.7455
0.9004 23.78 66500 1.2109 0.7562
0.9094 23.95 67000 1.2629 0.7528
0.8937 24.13 67500 1.2771 0.7375
0.8711 24.31 68000 1.3746 0.7353
0.8972 24.49 68500 1.2529 0.7454
0.8863 24.67 69000 1.3219 0.7359
0.8823 24.85 69500 1.3136 0.7367
0.8759 25.03 70000 1.3152 0.7428
0.8722 25.21 70500 1.3108 0.7570
0.8548 25.38 71000 1.3503 0.7368
0.8728 25.56 71500 1.3091 0.7403
0.8633 25.74 72000 1.2952 0.7416
0.8612 25.92 72500 1.1612 0.7719
0.8677 26.1 73000 1.2855 0.7450
0.8526 26.28 73500 1.2979 0.7545
0.8594 26.46 74000 1.2570 0.7598
0.8481 26.64 74500 1.2337 0.7492
0.855 26.81 75000 1.2875 0.7444
0.835 26.99 75500 1.2270 0.7585
0.8309 27.17 76000 1.2540 0.7389
0.8326 27.35 76500 1.3611 0.7375
0.8398 27.53 77000 1.2248 0.7505
0.8304 27.71 77500 1.2403 0.7607
0.8373 27.89 78000 1.1709 0.7611
0.8462 28.07 78500 1.2891 0.7508
0.8259 28.24 79000 1.2452 0.7501
0.8334 28.42 79500 1.2986 0.7468
0.8115 28.6 80000 1.2880 0.7515
0.8205 28.78 80500 1.2728 0.7562
0.8261 28.96 81000 1.2661 0.7524
0.8299 29.14 81500 1.2592 0.7486
0.8276 29.32 82000 1.2325 0.7530
0.8112 29.5 82500 1.3154 0.7478
0.8111 29.67 83000 1.3343 0.7405
0.8148 29.85 83500 1.2806 0.7485

Framework versions

  • Transformers 4.21.3
  • Pytorch 1.12.1+cu116
  • Datasets 2.4.0
  • Tokenizers 0.12.1