
nvl-ca

This model, published as veerganesh/nvl-ca, is a fine-tuned version of google-t5/t5-small on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

  • Loss: 1.6425
  • ROUGE-1: 36.2683
  • ROUGE-2: 17.3571
  • ROUGE-L: 31.414
  • ROUGE-Lsum: 33.3573
  • Gen Len: 18.1
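
A minimal usage sketch, assuming a summarization task: the task and training data are not documented on this card, so the framing below is inferred from the ROUGE metrics and the reported Gen Len rather than stated facts.

```python
# A minimal inference sketch. Assumption: the model is a summarizer; the card does
# not document the task or the training data.
from transformers import pipeline

summarizer = pipeline("summarization", model="veerganesh/nvl-ca")

text = "Replace this with the document you want to summarize."
# The reported Gen Len (~18 tokens) suggests fairly short outputs; tune these limits.
result = summarizer(text, max_length=32, min_length=5)
print(result[0]["summary_text"])
```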

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 12
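
The sketch below maps these settings onto Seq2SeqTrainingArguments. It is not the author's training script: the dataset, preprocessing, and metric code are undocumented, and the per-epoch evaluation cadence is inferred from the results table (50 steps per epoch).

```python
# A sketch of Seq2SeqTrainingArguments mirroring the hyperparameters listed above.
# Assumption: dataset loading, tokenization, and compute_metrics are omitted because
# they are not documented on this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nvl-ca",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=12,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's default
    # optimizer settings, so no explicit optimizer argument is needed.
    evaluation_strategy="epoch",  # assumption: the results table shows one evaluation per epoch
    predict_with_generate=True,   # generate sequences during evaluation so ROUGE can be computed
)
```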

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------:|
| 2.7351        | 1.0   | 50   | 2.0532          | 29.1549 | 10.8095 | 24.5213 | 27.1706    | 18.54   |
| 2.2954        | 2.0   | 100  | 1.8884          | 34.1103 | 15.1143 | 28.6964 | 30.6995    | 18.5    |
| 2.1461        | 3.0   | 150  | 1.7999          | 33.7268 | 15.3397 | 29.1248 | 30.7545    | 18.48   |
| 2.0402        | 4.0   | 200  | 1.7510          | 35.2811 | 16.3829 | 29.5922 | 31.3828    | 18.64   |
| 1.9727        | 5.0   | 250  | 1.7251          | 35.9939 | 17.0171 | 30.9116 | 32.514     | 18.3    |
| 1.9185        | 6.0   | 300  | 1.6982          | 36.1673 | 17.3892 | 31.4179 | 33.2171    | 18.06   |
| 1.8791        | 7.0   | 350  | 1.6809          | 36.0791 | 17.9475 | 31.6153 | 33.2867    | 18.2    |
| 1.8443        | 8.0   | 400  | 1.6631          | 36.3616 | 17.7432 | 31.9719 | 33.651     | 17.96   |
| 1.8322        | 9.0   | 450  | 1.6533          | 35.9061 | 16.9737 | 31.1291 | 33.1402    | 17.96   |
| 1.7978        | 10.0  | 500  | 1.6482          | 35.8366 | 17.0094 | 31.3893 | 33.3356    | 17.88   |
| 1.8037        | 11.0  | 550  | 1.6440          | 36.2683 | 17.3571 | 31.414  | 33.3573    | 18.1    |
| 1.7937        | 12.0  | 600  | 1.6425          | 36.2683 | 17.3571 | 31.414  | 33.3573    | 18.1    |
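
The ROUGE columns appear to be on a 0–100 scale. The exact metric code is not part of this card; the sketch below shows how scores in this format are commonly computed with the Hugging Face Evaluate library, as an illustration only.

```python
# A sketch of the typical ROUGE computation for scores like those in the table above.
# Assumption: this is NOT the author's metric code; it requires the `rouge_score` package.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["the cat sat on the mat"]       # placeholder model outputs
references = ["a cat was sitting on the mat"]  # placeholder reference summaries

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
# The table values correspond to these fractions multiplied by 100.
print({name: round(value * 100, 4) for name, value in scores.items()})
```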

Framework versions

  • Transformers 4.36.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0
