---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: CIS6930_DAAGR_T5_NoEmo
  results: []
---

# CIS6930_DAAGR_T5_NoEmo

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 0.3368
- Train Accuracy: 0.9629
- Validation Loss: 0.4438
- Validation Accuracy: 0.9496
- Epoch: 17

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch recreating the optimizer appears at the end of this card):
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 0.001, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32

### Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.5062     | 0.9405         | 0.4590          | 0.9454              | 0     |
| 0.4381     | 0.9479         | 0.4477          | 0.9472              | 1     |
| 0.4249     | 0.9499         | 0.4423          | 0.9481              | 2     |
| 0.4152     | 0.9513         | 0.4386          | 0.9486              | 3     |
| 0.4071     | 0.9525         | 0.4365          | 0.9490              | 4     |
| 0.4000     | 0.9535         | 0.4349          | 0.9493              | 5     |
| 0.3935     | 0.9545         | 0.4338          | 0.9496              | 6     |
| 0.3876     | 0.9553         | 0.4337          | 0.9498              | 7     |
| 0.3816     | 0.9562         | 0.4338          | 0.9498              | 8     |
| 0.3763     | 0.9571         | 0.4343          | 0.9499              | 9     |
| 0.3708     | 0.9578         | 0.4338          | 0.9500              | 10    |
| 0.3657     | 0.9586         | 0.4357          | 0.9498              | 11    |
| 0.3605     | 0.9593         | 0.4355          | 0.9500              | 12    |
| 0.3556     | 0.9601         | 0.4370          | 0.9499              | 13    |
| 0.3507     | 0.9608         | 0.4380          | 0.9499              | 14    |
| 0.3463     | 0.9615         | 0.4397          | 0.9498              | 15    |
| 0.3413     | 0.9622         | 0.4427          | 0.9496              | 16    |
| 0.3368     | 0.9629         | 0.4438          | 0.9496              | 17    |

Validation loss bottoms out at 0.4337 around epoch 7 and drifts upward afterwards while train loss keeps falling, which suggests mild overfitting in the later epochs.

### Framework versions

- Transformers 4.27.4
- TensorFlow 2.11.0
- Datasets 2.11.0
- Tokenizers 0.13.2
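### Recreating the optimizer

The optimizer entry under "Training hyperparameters" is a serialized Keras optimizer config. The following is a minimal sketch of how an equivalent optimizer could be rebuilt in TensorFlow 2.11; the actual compile call (loss, metrics, model code) is not recorded in this card, so that part is left out as an assumption.

```python
# Minimal sketch: rebuilding the Adam optimizer described by the serialized
# config above. Per 'is_legacy_optimizer': False, this is the non-legacy
# Keras optimizer class shipped with TF 2.11; values mirror the logged dict.
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=0.001,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    use_ema=False,
    ema_momentum=0.99,
    jit_compile=True,
)

# Hypothetical compile step: the card does not record which loss or metrics
# were used, so this line is illustrative only.
# model.compile(optimizer=optimizer)
```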
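### Example usage

No usage code is recorded in this card. Below is a minimal sketch, assuming the checkpoint was pushed to the Hub with TensorFlow weights under a repo id matching the model name; the path is hypothetical and should be replaced with the actual Hub location.

```python
# Minimal sketch: loading the fine-tuned checkpoint with the TensorFlow
# classes from transformers 4.27. The repo id is an assumption taken from
# the model name in this card; substitute the real Hub path if it differs.
from transformers import AutoTokenizer, TFT5ForConditionalGeneration

repo_id = "CIS6930_DAAGR_T5_NoEmo"  # hypothetical Hub path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFT5ForConditionalGeneration.from_pretrained(repo_id)

inputs = tokenizer("your input text here", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

T5 is a text-to-text model, so the expected input format (for example, a task prefix) depends on the fine-tuning dataset, which this card does not identify; the prompt above is a placeholder.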