---
license: apache-2.0
base_model: line-corporation/line-distilbert-base-japanese
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: factual-consistency-classification-with-prompt-ja
    results: []
---

# factual-consistency-classification-with-prompt-ja

This model is a fine-tuned version of [line-corporation/line-distilbert-base-japanese](https://huggingface.co/line-corporation/line-distilbert-base-japanese) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.6642
- Accuracy: 0.6738
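
The card does not include usage code, so here is a minimal inference sketch. The repo id, the example input, and the label semantics are assumptions (check the checkpoint's `id2label` config); the LINE DistilBERT tokenizer ships custom code on the Hub, hence `trust_remote_code=True`, and it needs the `fugashi` and `sentencepiece` dependencies installed.

```python
# Minimal inference sketch (assumptions: repo id, input format, label names).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "liwii/factual-consistency-classification-with-prompt-ja"  # assumed repo id

# The base LINE DistilBERT tokenizer is loaded via custom code on the Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# Hypothetical input: a prompt pairing a source text with a claim to check.
text = "前提: 東京は日本の首都である。主張: 日本の首都は大阪である。"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred], logits.softmax(dim=-1).squeeze().tolist())
```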

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- distributed_type: tpu
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
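
The list above maps onto the Transformers `Trainer` API roughly as follows. This is a sketch, not the author's actual training script: the per-epoch evaluation cadence is inferred from the results table below, and the TPU core count is an assumption.

```python
# Sketch of TrainingArguments matching the listed hyperparameters
# (assumptions: per-epoch evaluation, 8 TPU cores; neither is stated in the card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="factual-consistency-classification-with-prompt-ja",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,              # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    tpu_num_cores=8,             # distributed_type: tpu (core count assumed)
    evaluation_strategy="epoch",
)
```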

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 306   | 1.0023          | 0.3223   |
| 1.0128        | 2.0   | 612   | 0.9182          | 0.5742   |
| 1.0128        | 3.0   | 918   | 0.8847          | 0.5547   |
| 0.9226        | 4.0   | 1224  | 0.8569          | 0.5859   |
| 0.8815        | 5.0   | 1530  | 0.8329          | 0.6191   |
| 0.8815        | 6.0   | 1836  | 0.8287          | 0.5840   |
| 0.8633        | 7.0   | 2142  | 0.8160          | 0.5723   |
| 0.8633        | 8.0   | 2448  | 0.8210          | 0.5098   |
| 0.8525        | 9.0   | 2754  | 0.8171          | 0.5156   |
| 0.8418        | 10.0  | 3060  | 0.7850          | 0.5840   |
| 0.8418        | 11.0  | 3366  | 0.7771          | 0.6035   |
| 0.8323        | 12.0  | 3672  | 0.7652          | 0.6797   |
| 0.8323        | 13.0  | 3978  | 0.7655          | 0.6055   |
| 0.8292        | 14.0  | 4284  | 0.7556          | 0.6719   |
| 0.8233        | 15.0  | 4590  | 0.7578          | 0.6660   |
| 0.8233        | 16.0  | 4896  | 0.7497          | 0.6270   |
| 0.8173        | 17.0  | 5202  | 0.7472          | 0.6484   |
| 0.8122        | 18.0  | 5508  | 0.7334          | 0.7090   |
| 0.8122        | 19.0  | 5814  | 0.7468          | 0.6016   |
| 0.8165        | 20.0  | 6120  | 0.7248          | 0.7363   |
| 0.8165        | 21.0  | 6426  | 0.7324          | 0.6484   |
| 0.8048        | 22.0  | 6732  | 0.7261          | 0.6836   |
| 0.8033        | 23.0  | 7038  | 0.7187          | 0.7031   |
| 0.8033        | 24.0  | 7344  | 0.7205          | 0.6816   |
| 0.8011        | 25.0  | 7650  | 0.7307          | 0.6133   |
| 0.8011        | 26.0  | 7956  | 0.7220          | 0.6680   |
| 0.7979        | 27.0  | 8262  | 0.7175          | 0.6660   |
| 0.7963        | 28.0  | 8568  | 0.7205          | 0.6367   |
| 0.7963        | 29.0  | 8874  | 0.7139          | 0.6719   |
| 0.7942        | 30.0  | 9180  | 0.7078          | 0.6875   |
| 0.7942        | 31.0  | 9486  | 0.7094          | 0.6602   |
| 0.7878        | 32.0  | 9792  | 0.6992          | 0.7148   |
| 0.7891        | 33.0  | 10098 | 0.7041          | 0.6680   |
| 0.7891        | 34.0  | 10404 | 0.6968          | 0.6973   |
| 0.7869        | 35.0  | 10710 | 0.7047          | 0.6465   |
| 0.7874        | 36.0  | 11016 | 0.6962          | 0.6934   |
| 0.7874        | 37.0  | 11322 | 0.7026          | 0.6523   |
| 0.7817        | 38.0  | 11628 | 0.7103          | 0.625    |
| 0.7817        | 39.0  | 11934 | 0.6917          | 0.6914   |
| 0.7843        | 40.0  | 12240 | 0.6957          | 0.6680   |
| 0.7805        | 41.0  | 12546 | 0.7016          | 0.6484   |
| 0.7805        | 42.0  | 12852 | 0.6955          | 0.6582   |
| 0.7777        | 43.0  | 13158 | 0.7004          | 0.6387   |
| 0.7777        | 44.0  | 13464 | 0.6855          | 0.6895   |
| 0.7783        | 45.0  | 13770 | 0.6835          | 0.6895   |
| 0.7766        | 46.0  | 14076 | 0.6886          | 0.6641   |
| 0.7766        | 47.0  | 14382 | 0.6969          | 0.6309   |
| 0.7796        | 48.0  | 14688 | 0.6873          | 0.6738   |
| 0.7796        | 49.0  | 14994 | 0.6796          | 0.6953   |
| 0.77          | 50.0  | 15300 | 0.6908          | 0.6543   |
| 0.7768        | 51.0  | 15606 | 0.6900          | 0.6367   |
| 0.7768        | 52.0  | 15912 | 0.6855          | 0.6680   |
| 0.7698        | 53.0  | 16218 | 0.6905          | 0.6504   |
| 0.7686        | 54.0  | 16524 | 0.6783          | 0.6816   |
| 0.7686        | 55.0  | 16830 | 0.6807          | 0.6777   |
| 0.7712        | 56.0  | 17136 | 0.6767          | 0.6797   |
| 0.7712        | 57.0  | 17442 | 0.6966          | 0.6152   |
| 0.7692        | 58.0  | 17748 | 0.6812          | 0.6660   |
| 0.7677        | 59.0  | 18054 | 0.6762          | 0.6777   |
| 0.7677        | 60.0  | 18360 | 0.6697          | 0.7090   |
| 0.761         | 61.0  | 18666 | 0.6833          | 0.6445   |
| 0.761         | 62.0  | 18972 | 0.6753          | 0.6777   |
| 0.7676        | 63.0  | 19278 | 0.6757          | 0.6699   |
| 0.7627        | 64.0  | 19584 | 0.6874          | 0.6426   |
| 0.7627        | 65.0  | 19890 | 0.6704          | 0.6836   |
| 0.7672        | 66.0  | 20196 | 0.6685          | 0.6934   |
| 0.7638        | 67.0  | 20502 | 0.6645          | 0.7090   |
| 0.7638        | 68.0  | 20808 | 0.6718          | 0.6797   |
| 0.765         | 69.0  | 21114 | 0.6658          | 0.6934   |
| 0.765         | 70.0  | 21420 | 0.6670          | 0.6895   |
| 0.7593        | 71.0  | 21726 | 0.6735          | 0.6719   |
| 0.7634        | 72.0  | 22032 | 0.6765          | 0.6406   |
| 0.7634        | 73.0  | 22338 | 0.6722          | 0.6641   |
| 0.754         | 74.0  | 22644 | 0.6664          | 0.6855   |
| 0.754         | 75.0  | 22950 | 0.6659          | 0.6895   |
| 0.7619        | 76.0  | 23256 | 0.6700          | 0.6621   |
| 0.7583        | 77.0  | 23562 | 0.6664          | 0.6797   |
| 0.7583        | 78.0  | 23868 | 0.6650          | 0.6836   |
| 0.7556        | 79.0  | 24174 | 0.6615          | 0.6973   |
| 0.7556        | 80.0  | 24480 | 0.6625          | 0.6934   |
| 0.7571        | 81.0  | 24786 | 0.6704          | 0.6582   |
| 0.7549        | 82.0  | 25092 | 0.6677          | 0.6719   |
| 0.7549        | 83.0  | 25398 | 0.6670          | 0.6699   |
| 0.7542        | 84.0  | 25704 | 0.6617          | 0.6875   |
| 0.756         | 85.0  | 26010 | 0.6638          | 0.6758   |
| 0.756         | 86.0  | 26316 | 0.6697          | 0.6562   |
| 0.7513        | 87.0  | 26622 | 0.6647          | 0.6738   |
| 0.7513        | 88.0  | 26928 | 0.6734          | 0.6445   |
| 0.7548        | 89.0  | 27234 | 0.6637          | 0.6836   |
| 0.7565        | 90.0  | 27540 | 0.6665          | 0.6719   |
| 0.7565        | 91.0  | 27846 | 0.6708          | 0.6504   |
| 0.7488        | 92.0  | 28152 | 0.6603          | 0.6895   |
| 0.7488        | 93.0  | 28458 | 0.6671          | 0.6582   |
| 0.7545        | 94.0  | 28764 | 0.6655          | 0.6699   |
| 0.7509        | 95.0  | 29070 | 0.6636          | 0.6777   |
| 0.7509        | 96.0  | 29376 | 0.6620          | 0.6816   |
| 0.7546        | 97.0  | 29682 | 0.6653          | 0.6719   |
| 0.7546        | 98.0  | 29988 | 0.6636          | 0.6738   |
| 0.7521        | 99.0  | 30294 | 0.6636          | 0.6758   |
| 0.755         | 100.0 | 30600 | 0.6642          | 0.6738   |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0