kasrahabib/roberta-base-finetuned-iso29148-nf_sub_req-embdr

This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results at the end of training (validation loss is measured on the evaluation set):

  • Train Loss: 0.2271
  • Validation Loss: 1.1748
  • Epoch: 28

Model description

More information needed

Intended uses & limitations

More information needed
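The checkpoint can be loaded for inference with the Transformers `pipeline` API. This is a minimal sketch, not documented usage from the model authors: the example requirement sentence is made up, and the label set returned is whatever is stored in the checkpoint's config (the model name suggests non-functional requirement subclasses per ISO/IEC/IEEE 29148, but that is an assumption).

```python
# Hedged sketch: load this checkpoint as a text-classification pipeline.
# The input sentence is an invented example; the labels come from the
# checkpoint's own config, not from this snippet.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="kasrahabib/roberta-base-finetuned-iso29148-nf_sub_req-embdr",
)

requirement = "The system shall respond to user queries within 2 seconds."
result = classifier(requirement)
print(result)  # list of {'label': ..., 'score': ...} dicts
```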

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False, weight_decay: None, jit_compile: True)
  • learning_rate: PolynomialDecay (initial_learning_rate: 2e-05, decay_steps: 270, end_learning_rate: 0.0, power: 1.0, cycle: False)
  • training_precision: float32
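Since the schedule uses power=1.0 and cycle=False, Keras `PolynomialDecay` reduces to a linear ramp from the initial learning rate down to the end learning rate over `decay_steps` steps, held at the end value afterwards. A small pure-Python sketch of that behavior (the function name is ours, not part of any API):

```python
# Sketch of the schedule above: PolynomialDecay with power=1.0 is a
# linear decay from INITIAL_LR to END_LR over DECAY_STEPS steps.
INITIAL_LR = 2e-05
END_LR = 0.0
DECAY_STEPS = 270

def linear_decay_lr(step: int) -> float:
    """Learning rate at a given optimizer step (cycle=False clamps at the end)."""
    progress = min(step, DECAY_STEPS) / DECAY_STEPS
    return (INITIAL_LR - END_LR) * (1.0 - progress) + END_LR

print(linear_decay_lr(0))    # 2e-05
print(linear_decay_lr(135))  # 1e-05 (halfway)
print(linear_decay_lr(270))  # 0.0
```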

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.7150     | 2.7090          | 0     |
| 2.7106     | 2.7061          | 1     |
| 2.7015     | 2.6931          | 2     |
| 2.6525     | 2.5762          | 3     |
| 2.4332     | 2.3630          | 4     |
| 2.1599     | 2.1870          | 5     |
| 1.8809     | 1.9794          | 6     |
| 1.5991     | 1.8176          | 7     |
| 1.3476     | 1.6919          | 8     |
| 1.1429     | 1.5773          | 9     |
| 0.9575     | 1.5046          | 10    |
| 0.8359     | 1.4401          | 11    |
| 0.7214     | 1.3629          | 12    |
| 0.6201     | 1.3406          | 13    |
| 0.5340     | 1.2802          | 14    |
| 0.4736     | 1.2671          | 15    |
| 0.4211     | 1.2233          | 16    |
| 0.3728     | 1.2301          | 17    |
| 0.3480     | 1.2146          | 18    |
| 0.3166     | 1.2167          | 19    |
| 0.2984     | 1.1933          | 20    |
| 0.2755     | 1.1834          | 21    |
| 0.2598     | 1.1929          | 22    |
| 0.2473     | 1.1896          | 23    |
| 0.2423     | 1.1951          | 24    |
| 0.2370     | 1.1957          | 25    |
| 0.2295     | 1.1864          | 26    |
| 0.2280     | 1.1764          | 27    |
| 0.2271     | 1.1748          | 28    |

Framework versions

  • Transformers 4.40.1
  • TensorFlow 2.15.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1