---
license: cc-by-sa-4.0
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-base-japanese-tobyoki-pairwise-wo_space
  results: []
---

# bart-base-japanese-tobyoki-pairwise-wo_space

This model is a fine-tuned version of [ku-nlp/bart-base-japanese](https://huggingface.co/ku-nlp/bart-base-japanese) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2694
- Rouge1: 18.7407
- Rouge2: 3.1211
- Rougel: 10.9379
- Rougelsum: 15.8203
- Gen Len: 95.0245

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 0.7833        | 1.0   | 2025  | 2.5751          | 16.3343 | 1.2888 | 11.0128 | 15.2802   | 65.3776 |
| 0.3308        | 2.0   | 4050  | 2.9423          | 17.9514 | 2.8091 | 10.9133 | 15.4068   | 91.6503 |
| 0.2302        | 3.0   | 6075  | 3.0625          | 16.1453 | 3.0026 | 10.272  | 14.0716   | 77.1993 |
| 0.1576        | 4.0   | 8100  | 3.2308          | 17.8409 | 2.9937 | 10.8765 | 15.6203   | 88.3986 |
| 0.1055        | 5.0   | 10125 | 3.2694          | 18.7407 | 3.1211 | 10.9379 | 15.8203   | 95.0245 |

### Framework versions

- Transformers 4.30.0
- PyTorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
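
## Usage example

A minimal inference sketch, assuming the model is used for Japanese abstractive summarization (which the ROUGE metrics and generation lengths above suggest). The model identifier below is a placeholder for a local path or Hub repo. Note that the base model's card describes Juman++-segmented input, while the `wo_space` suffix here suggests this checkpoint may have been trained without word-separating spaces; verify the expected input format before use.

```python
# A minimal inference sketch; the model ID is a placeholder (local path or
# Hub repo), and the summarization task itself is inferred from the metrics
# reported above.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "bart-base-japanese-tobyoki-pairwise-wo_space"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # Japanese source document; see the note above on segmentation
inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```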
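
## Training configuration sketch

The hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows. This is a sketch, not the original training script: the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions, and the listed Adam betas and epsilon are the library defaults.

```python
# A hedged sketch of training arguments matching the hyperparameters listed
# above (Transformers 4.30 API); values marked "assumed" were not recorded
# in the original run.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-base-japanese-tobyoki-pairwise-wo_space",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
    evaluation_strategy="epoch",  # assumed; the table reports metrics per epoch
    predict_with_generate=True,   # assumed; ROUGE/Gen Len require generation
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so they
    # need not be set explicitly.
)
```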
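
## Metric computation sketch

The reported Rouge1/Rouge2/Rougel/Rougelsum scores and Gen Len follow the conventions of the Hugging Face `run_summarization.py` example; a hedged sketch of such a `compute_metrics` function is shown below. The exact evaluation code for this run is not documented here, and the default ROUGE scorer tokenizes on whitespace, which is of limited meaning for unsegmented Japanese text.

```python
# A hedged sketch of a run_summarization-style compute_metrics function;
# the actual evaluation code for this model is not documented in the card.
import numpy as np
import evaluate
from transformers import AutoTokenizer

rouge = evaluate.load("rouge")
tokenizer = AutoTokenizer.from_pretrained("ku-nlp/bart-base-japanese")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Label padding is -100 for the loss; swap it back so it can be decoded.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    result = {key: value * 100 for key, value in result.items()}
    # "Gen Len" is the mean count of non-padding tokens in the generated output.
    result["gen_len"] = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {k: round(v, 4) for k, v in result.items()}
```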