---
library_name: transformers
license: apache-2.0
base_model: JannikAhlers/groove_midi_2
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: groove_midi_3
    results: []
---

# groove_midi_3

This model is a fine-tuned version of [JannikAhlers/groove_midi_2](https://huggingface.co/JannikAhlers/groove_midi_2) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3215
- Bleu: 0.0
- Gen Len: 20.0
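
The card does not state the task or expected input format, so the snippet below is only a minimal loading sketch. It assumes the checkpoint is a seq2seq model hosted at `JannikAhlers/groove_midi_3` (inferred from the BLEU and generation-length metrics and the card title); the input string and generation settings are placeholders.

```python
# Minimal, hedged loading/inference sketch. Assumes a seq2seq checkpoint
# at "JannikAhlers/groove_midi_3"; the input text below is a placeholder
# and should be replaced with whatever format the model was trained on.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "JannikAhlers/groove_midi_3"  # assumed Hub id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("example input sequence", return_tensors="pt")
# max_new_tokens=20 mirrors the reported Gen Len of 20.0
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```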

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
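
As a reference for reproduction, the sketch below maps the listed values onto `Seq2SeqTrainingArguments`. Only the argument values come from this card; the output directory, `predict_with_generate`, and the choice of `Seq2SeqTrainingArguments` over plain `TrainingArguments` are assumptions based on the BLEU/Gen Len metrics.

```python
# Hedged reproduction sketch: only the hyperparameter values are taken from
# the card; everything else (output_dir, predict_with_generate) is assumed.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="groove_midi_3",        # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",               # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=30,
    predict_with_generate=True,        # assumed, needed for BLEU / Gen Len
)
```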

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
| No log        | 1.0   | 57   | 0.3367          | 0.0  | 20.0    |
| No log        | 2.0   | 114  | 0.3348          | 0.0  | 20.0    |
| No log        | 3.0   | 171  | 0.3342          | 0.0  | 20.0    |
| No log        | 4.0   | 228  | 0.3322          | 0.0  | 20.0    |
| No log        | 5.0   | 285  | 0.3311          | 0.0  | 20.0    |
| No log        | 6.0   | 342  | 0.3305          | 0.0  | 20.0    |
| No log        | 7.0   | 399  | 0.3296          | 0.0  | 20.0    |
| No log        | 8.0   | 456  | 0.3284          | 0.0  | 20.0    |
| 0.3803        | 9.0   | 513  | 0.3276          | 0.0  | 20.0    |
| 0.3803        | 10.0  | 570  | 0.3273          | 0.0  | 20.0    |
| 0.3803        | 11.0  | 627  | 0.3267          | 0.0  | 20.0    |
| 0.3803        | 12.0  | 684  | 0.3259          | 0.0  | 20.0    |
| 0.3803        | 13.0  | 741  | 0.3258          | 0.0  | 20.0    |
| 0.3803        | 14.0  | 798  | 0.3250          | 0.0  | 20.0    |
| 0.3803        | 15.0  | 855  | 0.3250          | 0.0  | 20.0    |
| 0.3803        | 16.0  | 912  | 0.3243          | 0.0  | 20.0    |
| 0.3803        | 17.0  | 969  | 0.3237          | 0.0  | 20.0    |
| 0.3692        | 18.0  | 1026 | 0.3234          | 0.0  | 20.0    |
| 0.3692        | 19.0  | 1083 | 0.3232          | 0.0  | 20.0    |
| 0.3692        | 20.0  | 1140 | 0.3228          | 0.0  | 20.0    |
| 0.3692        | 21.0  | 1197 | 0.3228          | 0.0  | 20.0    |
| 0.3692        | 22.0  | 1254 | 0.3228          | 0.0  | 20.0    |
| 0.3692        | 23.0  | 1311 | 0.3223          | 0.0  | 20.0    |
| 0.3692        | 24.0  | 1368 | 0.3219          | 0.0  | 20.0    |
| 0.3692        | 25.0  | 1425 | 0.3218          | 0.0  | 20.0    |
| 0.3692        | 26.0  | 1482 | 0.3217          | 0.0  | 20.0    |
| 0.3642        | 27.0  | 1539 | 0.3216          | 0.0  | 20.0    |
| 0.3642        | 28.0  | 1596 | 0.3216          | 0.0  | 20.0    |
| 0.3642        | 29.0  | 1653 | 0.3215          | 0.0  | 20.0    |
| 0.3642        | 30.0  | 1710 | 0.3215          | 0.0  | 20.0    |
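
The card does not include the metric code that produced the Bleu and Gen Len columns. The sketch below shows the common `evaluate`-based pattern for computing them with a `Seq2SeqTrainer`; it is an assumption, not this repo's actual training script, and the `tokenizer` argument would typically be bound with `functools.partial` before passing the function to the trainer.

```python
# Hedged sketch of a compute_metrics function producing "bleu" and "gen_len";
# this mirrors the common transformers/evaluate pattern, not this repo's code.
import numpy as np
import evaluate

bleu_metric = evaluate.load("bleu")

def compute_metrics(eval_preds, tokenizer):
    preds, labels = eval_preds
    # Replace label padding (-100) so the labels can be decoded
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = bleu_metric.compute(
        predictions=decoded_preds,
        references=[[label] for label in decoded_labels],
    )
    # Gen Len: average number of non-padding tokens in the generated ids
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": result["bleu"], "gen_len": float(gen_len)}
```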

### Framework versions

- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0