
kasrahabib/gpt2-finetuned-iso29148-req-detector

This model is a fine-tuned version of openai-community/gpt2 on an unknown dataset. At the end of training (epoch 29) it reaches the following losses:

  • Train Loss: 0.1004
  • Validation Loss: 0.5468
  • Epoch: 29
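
Since the card does not yet document usage, here is a minimal loading sketch. It assumes the checkpoint ships TensorFlow weights (TensorFlow 2.15.0 is listed under framework versions) and exposes a sequence-classification head, as the "req-detector" name suggests; if the repository actually stores a plain language-modeling head, swap in TFAutoModel / TFGPT2LMHeadModel instead. The example requirement sentence is made up.

```python
# Minimal loading sketch; the classification head is an assumption, not documented in this card.
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_id = "kasrahabib/gpt2-finetuned-iso29148-req-detector"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical input: a sentence that reads like a software requirement.
text = "The system shall log every failed authentication attempt."
inputs = tokenizer(text, return_tensors="tf")
logits = model(**inputs).logits

predicted_id = int(logits.numpy().argmax(axis=-1)[0])
print(model.config.id2label.get(predicted_id, predicted_id))
```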

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (beta_1=0.9, beta_2=0.999, epsilon=1e-08, amsgrad=False, jit_compile=True); no weight decay, no gradient clipping, EMA disabled
  • learning rate schedule: PolynomialDecay (initial_learning_rate=2e-05, decay_steps=3570, end_learning_rate=0.0, power=1.0, cycle=False); reconstructed in the sketch after this list
  • training_precision: float32
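
For reference, the optimizer and schedule listed above can be rebuilt with the standard Keras API. This is only a sketch of the configuration recorded in this card, not the original training script; the model, data pipeline, and fit loop are omitted.

```python
import tensorflow as tf

# Linear decay (power=1.0) from 2e-05 to 0.0 over 3570 steps, as listed above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=3570,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
# The recorded config also sets jit_compile=True and use_ema=False; weight decay
# and gradient clipping were not used (all left at None).
```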

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.0225     | 2.8144          | 0     |
| 2.7666     | 2.7501          | 1     |
| 2.6971     | 2.6517          | 2     |
| 2.6218     | 2.4804          | 3     |
| 2.3698     | 2.0353          | 4     |
| 1.8887     | 1.4317          | 5     |
| 1.1104     | 0.8991          | 6     |
| 0.7810     | 0.8057          | 7     |
| 0.6463     | 0.6800          | 8     |
| 0.5507     | 0.6747          | 9     |
| 0.4679     | 0.6156          | 10    |
| 0.4313     | 0.6102          | 11    |
| 0.3696     | 0.5972          | 12    |
| 0.3365     | 0.5437          | 13    |
| 0.2866     | 0.5637          | 14    |
| 0.2492     | 0.5682          | 15    |
| 0.2392     | 0.5496          | 16    |
| 0.2029     | 0.6076          | 17    |
| 0.2062     | 0.5368          | 18    |
| 0.1898     | 0.5434          | 19    |
| 0.1670     | 0.5434          | 20    |
| 0.1682     | 0.5403          | 21    |
| 0.1650     | 0.5830          | 22    |
| 0.1326     | 0.5400          | 23    |
| 0.1366     | 0.5497          | 24    |
| 0.1148     | 0.5432          | 25    |
| 0.1263     | 0.5662          | 26    |
| 0.1171     | 0.5591          | 27    |
| 0.1075     | 0.5445          | 28    |
| 0.1004     | 0.5468          | 29    |

Framework versions

  • Transformers 4.40.1
  • TensorFlow 2.15.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1