---
license: mit
base_model: kavg/LiLT-RE-JA
tags:
  - generated_from_trainer
datasets:
  - xfun
metrics:
  - precision
  - recall
  - f1
model-index:
  - name: checkpoints
    results: []
---

# checkpoints

This model is a fine-tuned version of [kavg/LiLT-RE-JA](https://huggingface.co/kavg/LiLT-RE-JA) on the xfun dataset. It achieves the following results on the evaluation set:

- Precision: 0.4744
- Recall: 0.6540
- F1: 0.5499
- Loss: 0.5293
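
The snippet below is a minimal loading sketch, not an official usage example: it assumes the checkpoint is compatible with the stock `transformers` LiLT classes, and that the relation-extraction head used for XFUN is supplied by the custom modeling code this repo was trained with rather than by `transformers` itself.

```python
# Minimal loading sketch (assumption: the checkpoint loads with the stock
# `transformers` LiLT backbone; the XFUN relation-extraction head may live
# in custom modeling code rather than in `transformers`).
from transformers import AutoTokenizer, AutoModel

model_id = "kavg/LiLT-RE-JA-SIN"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)  # LiLT backbone weights

print(model.config.model_type)  # expected: "lilt"
```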

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10000
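
The hyperparameters above map directly onto `transformers.TrainingArguments`; the sketch below is an approximate reconstruction, not the actual training script. The `output_dir` name and the rest of the `Trainer` wiring are assumptions, and the XFUN data pipeline is omitted.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above (sketch
# only). "checkpoints" matches the model-index name but is an assumption;
# model, dataset, and collator wiring are not shown.
training_args = TrainingArguments(
    output_dir="checkpoints",          # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,                  # 10% of 10,000 steps -> 1,000 warmup steps
    max_steps=10000,
)
```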

### Training results

| Training Loss | Epoch  | Step  | F1     | Validation Loss | Precision | Recall |
|:-------------:|:------:|:-----:|:------:|:---------------:|:---------:|:------:|
| 0.0815        | 41.67  | 500   | 0.4149 | 0.1502          | 0.3521    | 0.5051 |
| 0.0408        | 83.33  | 1000  | 0.4931 | 0.1593          | 0.4244    | 0.5884 |
| 0.0435        | 125.0  | 1500  | 0.5041 | 0.2311          | 0.4218    | 0.6263 |
| 0.0168        | 166.67 | 2000  | 0.5097 | 0.3195          | 0.4286    | 0.6288 |
| 0.0073        | 208.33 | 2500  | 0.5088 | 0.3313          | 0.4308    | 0.6212 |
| 0.0051        | 250.0  | 3000  | 0.5264 | 0.3939          | 0.4349    | 0.6667 |
| 0.0038        | 291.67 | 3500  | 0.5252 | 0.3958          | 0.4435    | 0.6439 |
| 0.0016        | 333.33 | 4000  | 0.5335 | 0.4708          | 0.4606    | 0.6338 |
| 0.0082        | 375.0  | 4500  | 0.5340 | 0.4429          | 0.4562    | 0.6439 |
| 0.0079        | 416.67 | 5000  | 0.5305 | 0.4498          | 0.4601    | 0.6263 |
| 0.0028        | 458.33 | 5500  | 0.5352 | 0.4993          | 0.4578    | 0.6439 |
| 0.0003        | 500.0  | 6000  | 0.5422 | 0.5253          | 0.4695    | 0.6414 |
| 0.0014        | 541.67 | 6500  | 0.5437 | 0.5134          | 0.4705    | 0.6439 |
| 0.0043        | 583.33 | 7000  | 0.5393 | 0.5308          | 0.4652    | 0.6414 |
| 0.0002        | 625.0  | 7500  | 0.5378 | 0.5572          | 0.4604    | 0.6465 |
| 0.0014        | 666.67 | 8000  | 0.5386 | 0.5451          | 0.4591    | 0.6515 |
| 0.0027        | 708.33 | 8500  | 0.5395 | 0.5747          | 0.4629    | 0.6465 |
| 0.0036        | 750.0  | 9000  | 0.5499 | 0.5293          | 0.4744    | 0.6540 |
| 0.0021        | 791.67 | 9500  | 0.5417 | 0.5391          | 0.4610    | 0.6566 |
| 0.0002        | 833.33 | 10000 | 0.5418 | 0.5359          | 0.4625    | 0.6540 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1