T10

This model is a fine-tuned version of hhhhzy/deltalm-base-xlsum on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6357
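
The base checkpoint, hhhhzy/deltalm-base-xlsum, is a DeltaLM encoder-decoder fine-tuned on XL-Sum, so this model is presumably intended for abstractive summarization. Below is a minimal inference sketch; the repo id your-username/T10 is a placeholder, and loading through the standard seq2seq auto classes is an assumption (DeltaLM is not a stock transformers architecture, so the checkpoint may need trust_remote_code=True or the custom modeling files from the base repo).

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id for this fine-tuned checkpoint; replace with the actual one.
MODEL_ID = "your-username/T10"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

text = "Article text to summarize goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```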

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 64
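
For reference, the hyperparameters above map roughly onto the following Seq2SeqTrainingArguments configuration. This is a hedged reconstruction rather than the original training script: the output directory and the evaluation strategy are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="t10-deltalm",        # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=64,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumed: validation loss is reported once per epoch
)
```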

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.805         | 1.0   | 6    | 0.3684          |
| 0.2843        | 2.0   | 12   | 0.3604          |
| 0.2494        | 3.0   | 18   | 0.3970          |
| 0.1528        | 4.0   | 24   | 0.4507          |
| 0.0779        | 5.0   | 30   | 0.5024          |
| 0.0482        | 6.0   | 36   | 0.5399          |
| 0.0246        | 7.0   | 42   | 0.5612          |
| 0.0202        | 8.0   | 48   | 0.5788          |
| 0.0172        | 9.0   | 54   | 0.6024          |
| 0.0147        | 10.0  | 60   | 0.6003          |
| 0.0115        | 11.0  | 66   | 0.5960          |
| 0.0124        | 12.0  | 72   | 0.6035          |
| 0.0122        | 13.0  | 78   | 0.6135          |
| 0.0121        | 14.0  | 84   | 0.6105          |
| 0.0101        | 15.0  | 90   | 0.6155          |
| 0.0103        | 16.0  | 96   | 0.6188          |
| 0.0087        | 17.0  | 102  | 0.6192          |
| 0.015         | 18.0  | 108  | 0.6113          |
| 0.0092        | 19.0  | 114  | 0.6141          |
| 0.0091        | 20.0  | 120  | 0.6220          |
| 0.0088        | 21.0  | 126  | 0.6243          |
| 0.009         | 22.0  | 132  | 0.6239          |
| 0.0085        | 23.0  | 138  | 0.6199          |
| 0.0093        | 24.0  | 144  | 0.6183          |
| 0.0092        | 25.0  | 150  | 0.6170          |
| 0.0086        | 26.0  | 156  | 0.6154          |
| 0.0084        | 27.0  | 162  | 0.6154          |
| 0.0082        | 28.0  | 168  | 0.6182          |
| 0.0083        | 29.0  | 174  | 0.6224          |
| 0.0082        | 30.0  | 180  | 0.6250          |
| 0.0086        | 31.0  | 186  | 0.6263          |
| 0.0078        | 32.0  | 192  | 0.6270          |
| 0.0081        | 33.0  | 198  | 0.6271          |
| 0.0081        | 34.0  | 204  | 0.6276          |
| 0.0082        | 35.0  | 210  | 0.6280          |
| 0.0078        | 36.0  | 216  | 0.6292          |
| 0.0078        | 37.0  | 222  | 0.6302          |
| 0.0079        | 38.0  | 228  | 0.6314          |
| 0.0081        | 39.0  | 234  | 0.6319          |
| 0.0083        | 40.0  | 240  | 0.6318          |
| 0.0076        | 41.0  | 246  | 0.6317          |
| 0.0079        | 42.0  | 252  | 0.6309          |
| 0.0084        | 43.0  | 258  | 0.6304          |
| 0.0078        | 44.0  | 264  | 0.6307          |
| 0.0079        | 45.0  | 270  | 0.6309          |
| 0.0076        | 46.0  | 276  | 0.6312          |
| 0.0076        | 47.0  | 282  | 0.6313          |
| 0.008         | 48.0  | 288  | 0.6316          |
| 0.0081        | 49.0  | 294  | 0.6320          |
| 0.0077        | 50.0  | 300  | 0.6323          |
| 0.0075        | 51.0  | 306  | 0.6328          |
| 0.0077        | 52.0  | 312  | 0.6336          |
| 0.0076        | 53.0  | 318  | 0.6342          |
| 0.0077        | 54.0  | 324  | 0.6344          |
| 0.0075        | 55.0  | 330  | 0.6346          |
| 0.0079        | 56.0  | 336  | 0.6350          |
| 0.0076        | 57.0  | 342  | 0.6350          |
| 0.0078        | 58.0  | 348  | 0.6355          |
| 0.0077        | 59.0  | 354  | 0.6357          |
| 0.0074        | 60.0  | 360  | 0.6358          |
| 0.0075        | 61.0  | 366  | 0.6358          |
| 0.0075        | 62.0  | 372  | 0.6358          |
| 0.0077        | 63.0  | 378  | 0.6357          |
| 0.0073        | 64.0  | 384  | 0.6357          |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.1
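
A quick way to check that a local environment matches these versions is shown below; it is a convenience sketch only and assumes the standard import names for each package.

```python
import transformers, torch, datasets, tokenizers

# Versions used when this model was trained (see the list above).
expected = {
    "transformers": "4.35.2",
    "torch": "2.1.0+cu121",
    "datasets": "2.17.0",
    "tokenizers": "0.15.1",
}
for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    print(f"{name}: installed {module.__version__}, trained with {expected[name]}")
```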
Model size: 363M parameters (F32, stored as safetensors).