---
license: apache-2.0
base_model: google/matcha-chartqa
tags:
  - generated_from_trainer
model-index:
  - name: ViMATCHA
    results: []
---

# ViMATCHA

This model is a fine-tuned version of [google/matcha-chartqa](https://huggingface.co/google/matcha-chartqa) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.5748
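
A minimal inference sketch, assuming this checkpoint loads with the same Pix2Struct classes as the `google/matcha-chartqa` base model; the image path and question below are placeholders:

```python
from PIL import Image
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor

# Load the fine-tuned checkpoint (Pix2Struct classes assumed from the base model)
processor = Pix2StructProcessor.from_pretrained("TeeA/ViMATCHA")
model = Pix2StructForConditionalGeneration.from_pretrained("TeeA/ViMATCHA")

image = Image.open("chart.png")  # placeholder chart image
inputs = processor(
    images=image,
    text="What is the highest value in the chart?",  # placeholder question
    return_tensors="pt",
)

outputs = model.generate(**inputs, max_new_tokens=64)
print(processor.decode(outputs[0], skip_special_tokens=True))
```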

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
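
A sketch of how these values map onto `transformers.TrainingArguments`; `output_dir` is hypothetical, and the 50-step evaluation/logging cadence is inferred from the results table below rather than stated in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ViMATCHA",            # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon=1e-08
    evaluation_strategy="steps",      # inferred: eval every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```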

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.9077 | 0.1603 | 150 | 3.3649 |
| 3.1926 | 0.2137 | 200 | 2.7038 |
| 2.5565 | 0.2671 | 250 | 2.1718 |
| 2.1552 | 0.3205 | 300 | 1.7380 |
| 1.7834 | 0.3739 | 350 | 1.4287 |
| 1.428 | 0.4274 | 400 | 1.2302 |
| 1.3867 | 0.4808 | 450 | 1.1280 |
| 1.1496 | 0.5342 | 500 | 1.0437 |
| 1.2414 | 0.5876 | 550 | 0.9901 |
| 1.116 | 0.6410 | 600 | 0.9475 |
| 1.12 | 0.6944 | 650 | 0.9073 |
| 1.0132 | 0.7479 | 700 | 0.8859 |
| 0.9954 | 0.8013 | 750 | 0.8638 |
| 1.0057 | 0.8547 | 800 | 0.8496 |
| 0.9929 | 0.9081 | 850 | 0.8289 |
| 0.945 | 0.9615 | 900 | 0.8207 |
| 0.866 | 1.0150 | 950 | 0.8059 |
| 0.8901 | 1.0684 | 1000 | 0.7944 |
| 0.8382 | 1.1218 | 1050 | 0.7828 |
| 0.946 | 1.1752 | 1100 | 0.7748 |
| 0.9042 | 1.2286 | 1150 | 0.7662 |
| 0.8334 | 1.2821 | 1200 | 0.7549 |
| 0.8747 | 1.3355 | 1250 | 0.7501 |
| 0.8224 | 1.3889 | 1300 | 0.7424 |
| 0.7998 | 1.4423 | 1350 | 0.7382 |
| 0.9022 | 1.4957 | 1400 | 0.7341 |
| 0.8297 | 1.5491 | 1450 | 0.7226 |
| 0.8014 | 1.6026 | 1500 | 0.7165 |
| 0.8423 | 1.6560 | 1550 | 0.7129 |
| 0.7286 | 1.7094 | 1600 | 0.7063 |
| 0.7361 | 1.7628 | 1650 | 0.7040 |
| 0.8203 | 1.8162 | 1700 | 0.6982 |
| 0.8103 | 1.8697 | 1750 | 0.6945 |
| 0.7251 | 1.9231 | 1800 | 0.6926 |
| 0.7193 | 1.9765 | 1850 | 0.6910 |
| 0.8133 | 2.0299 | 1900 | 0.6843 |
| 0.7545 | 2.0833 | 1950 | 0.6862 |
| 0.8025 | 2.1368 | 2000 | 0.6768 |
| 0.7421 | 2.1902 | 2050 | 0.6769 |
| 0.6899 | 2.2436 | 2100 | 0.6744 |
| 0.7607 | 2.2970 | 2150 | 0.6690 |
| 0.739 | 2.3504 | 2200 | 0.6652 |
| 0.7095 | 2.4038 | 2250 | 0.6666 |
| 0.7392 | 2.4573 | 2300 | 0.6605 |
| 0.7307 | 2.5107 | 2350 | 0.6560 |
| 0.685 | 2.5641 | 2400 | 0.6579 |
| 0.6419 | 2.6175 | 2450 | 0.6499 |
| 0.6894 | 2.6709 | 2500 | 0.6532 |
| 0.6288 | 2.7244 | 2550 | 0.6482 |
| 0.7024 | 2.7778 | 2600 | 0.6471 |
| 0.7717 | 2.8312 | 2650 | 0.6475 |
| 0.7389 | 2.8846 | 2700 | 0.6434 |
| 0.6944 | 2.9380 | 2750 | 0.6406 |
| 0.6512 | 2.9915 | 2800 | 0.6405 |
| 0.7187 | 3.0449 | 2850 | 0.6410 |
| 0.6676 | 3.0983 | 2900 | 0.6383 |
| 0.6513 | 3.1517 | 2950 | 0.6359 |
| 0.5821 | 3.2051 | 3000 | 0.6345 |
| 0.6642 | 3.2585 | 3050 | 0.6338 |
| 0.6475 | 3.3120 | 3100 | 0.6311 |
| 0.6999 | 3.3654 | 3150 | 0.6281 |
| 0.696 | 3.4188 | 3200 | 0.6329 |
| 0.6129 | 3.4722 | 3250 | 0.6263 |
| 0.5601 | 3.5256 | 3300 | 0.6257 |
| 0.6658 | 3.5791 | 3350 | 0.6209 |
| 0.7212 | 3.6325 | 3400 | 0.6217 |
| 0.6392 | 3.6859 | 3450 | 0.6178 |
| 0.6877 | 3.7393 | 3500 | 0.6202 |
| 0.632 | 3.7927 | 3550 | 0.6171 |
| 0.6762 | 3.8462 | 3600 | 0.6175 |
| 0.6776 | 3.8996 | 3650 | 0.6122 |
| 0.6964 | 3.9530 | 3700 | 0.6120 |
| 0.6448 | 4.0064 | 3750 | 0.6134 |
| 0.6183 | 4.0598 | 3800 | 0.6125 |
| 0.6544 | 4.1132 | 3850 | 0.6105 |
| 0.6583 | 4.1667 | 3900 | 0.6102 |
| 0.6687 | 4.2201 | 3950 | 0.6083 |
| 0.6408 | 4.2735 | 4000 | 0.6078 |
| 0.6077 | 4.3269 | 4050 | 0.6051 |
| 0.5781 | 4.3803 | 4100 | 0.6078 |
| 0.645 | 4.4338 | 4150 | 0.6033 |
| 0.656 | 4.4872 | 4200 | 0.6044 |
| 0.562 | 4.5406 | 4250 | 0.6001 |
| 0.6568 | 4.5940 | 4300 | 0.6018 |
| 0.6572 | 4.6474 | 4350 | 0.6012 |
| 0.5749 | 4.7009 | 4400 | 0.6004 |
| 0.5811 | 4.7543 | 4450 | 0.5974 |
| 0.5765 | 4.8077 | 4500 | 0.6010 |
| 0.6177 | 4.8611 | 4550 | 0.5941 |
| 0.5371 | 4.9145 | 4600 | 0.5949 |
| 0.5863 | 4.9679 | 4650 | 0.5930 |
| 0.6274 | 5.0214 | 4700 | 0.5996 |
| 0.592 | 5.0748 | 4750 | 0.5972 |
| 0.6286 | 5.1282 | 4800 | 0.5936 |
| 0.6056 | 5.1816 | 4850 | 0.5954 |
| 0.5697 | 5.2350 | 4900 | 0.5943 |
| 0.5562 | 5.2885 | 4950 | 0.5923 |
| 0.5173 | 5.3419 | 5000 | 0.5944 |
| 0.6037 | 5.3953 | 5050 | 0.5943 |
| 0.6125 | 5.4487 | 5100 | 0.5921 |
| 0.5729 | 5.5021 | 5150 | 0.5920 |
| 0.561 | 5.5556 | 5200 | 0.5908 |
| 0.6244 | 5.6090 | 5250 | 0.5877 |
| 0.5338 | 5.6624 | 5300 | 0.5876 |
| 0.57 | 5.7158 | 5350 | 0.5912 |
| 0.555 | 5.7692 | 5400 | 0.5863 |
| 0.5914 | 5.8226 | 5450 | 0.5855 |
| 0.5785 | 5.8761 | 5500 | 0.5826 |
| 0.6014 | 5.9295 | 5550 | 0.5830 |
| 0.6192 | 5.9829 | 5600 | 0.5834 |
| 0.6025 | 6.0363 | 5650 | 0.5854 |
| 0.5327 | 6.0897 | 5700 | 0.5861 |
| 0.5958 | 6.1432 | 5750 | 0.5856 |
| 0.5319 | 6.1966 | 5800 | 0.5852 |
| 0.5722 | 6.25 | 5850 | 0.5848 |
| 0.585 | 6.3034 | 5900 | 0.5854 |
| 0.5549 | 6.3568 | 5950 | 0.5829 |
| 0.5559 | 6.4103 | 6000 | 0.5816 |
| 0.5539 | 6.4637 | 6050 | 0.5820 |
| 0.5847 | 6.5171 | 6100 | 0.5797 |
| 0.5964 | 6.5705 | 6150 | 0.5803 |
| 0.5613 | 6.6239 | 6200 | 0.5808 |
| 0.5194 | 6.6774 | 6250 | 0.5816 |
| 0.5664 | 6.7308 | 6300 | 0.5808 |
| 0.5971 | 6.7842 | 6350 | 0.5783 |
| 0.4937 | 6.8376 | 6400 | 0.5773 |
| 0.577 | 6.8910 | 6450 | 0.5779 |
| 0.5422 | 6.9444 | 6500 | 0.5799 |
| 0.4881 | 6.9979 | 6550 | 0.5789 |
| 0.5806 | 7.0513 | 6600 | 0.5802 |
| 0.5029 | 7.1047 | 6650 | 0.5798 |
| 0.5436 | 7.1581 | 6700 | 0.5782 |
| 0.5298 | 7.2115 | 6750 | 0.5783 |
| 0.5947 | 7.2650 | 6800 | 0.5777 |
| 0.4975 | 7.3184 | 6850 | 0.5762 |
| 0.5652 | 7.3718 | 6900 | 0.5761 |
| 0.5214 | 7.4252 | 6950 | 0.5793 |
| 0.535 | 7.4786 | 7000 | 0.5791 |
| 0.5035 | 7.5321 | 7050 | 0.5774 |
| 0.5004 | 7.5855 | 7100 | 0.5769 |
| 0.6187 | 7.6389 | 7150 | 0.5778 |
| 0.5042 | 7.6923 | 7200 | 0.5770 |
| 0.5116 | 7.7457 | 7250 | 0.5760 |
| 0.5178 | 7.7991 | 7300 | 0.5744 |
| 0.5154 | 7.8526 | 7350 | 0.5748 |
| 0.5641 | 7.9060 | 7400 | 0.5764 |
| 0.536 | 7.9594 | 7450 | 0.5762 |
| 0.5423 | 8.0128 | 7500 | 0.5743 |
| 0.5402 | 8.0662 | 7550 | 0.5755 |
| 0.5559 | 8.1197 | 7600 | 0.5755 |
| 0.4998 | 8.1731 | 7650 | 0.5769 |
| 0.5895 | 8.2265 | 7700 | 0.5777 |
| 0.5252 | 8.2799 | 7750 | 0.5786 |
| 0.4802 | 8.3333 | 7800 | 0.5774 |
| 0.5285 | 8.3868 | 7850 | 0.5742 |
| 0.4931 | 8.4402 | 7900 | 0.5740 |
| 0.5126 | 8.4936 | 7950 | 0.5761 |
| 0.5376 | 8.5470 | 8000 | 0.5754 |
| 0.4825 | 8.6004 | 8050 | 0.5761 |
| 0.509 | 8.6538 | 8100 | 0.5762 |
| 0.4823 | 8.7073 | 8150 | 0.5747 |
| 0.5379 | 8.7607 | 8200 | 0.5733 |
| 0.5283 | 8.8141 | 8250 | 0.5740 |
| 0.4662 | 8.8675 | 8300 | 0.5743 |
| 0.5325 | 8.9209 | 8350 | 0.5727 |
| 0.5628 | 8.9744 | 8400 | 0.5727 |
| 0.4885 | 9.0278 | 8450 | 0.5731 |
| 0.5187 | 9.0812 | 8500 | 0.5761 |
| 0.5286 | 9.1346 | 8550 | 0.5752 |
| 0.4738 | 9.1880 | 8600 | 0.5744 |
| 0.4931 | 9.2415 | 8650 | 0.5733 |
| 0.5403 | 9.2949 | 8700 | 0.5751 |
| 0.4927 | 9.3483 | 8750 | 0.5755 |
| 0.5415 | 9.4017 | 8800 | 0.5743 |
| 0.5627 | 9.4551 | 8850 | 0.5746 |
| 0.468 | 9.5085 | 8900 | 0.5743 |
| 0.5002 | 9.5620 | 8950 | 0.5751 |
| 0.4891 | 9.6154 | 9000 | 0.5740 |
| 0.5456 | 9.6688 | 9050 | 0.5747 |
| 0.5054 | 9.7222 | 9100 | 0.5747 |
| 0.5453 | 9.7756 | 9150 | 0.5741 |
| 0.477 | 9.8291 | 9200 | 0.5745 |
| 0.4798 | 9.8825 | 9250 | 0.5747 |
| 0.5003 | 9.9359 | 9300 | 0.5748 |
| 0.4785 | 9.9893 | 9350 | 0.5748 |
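
For a quick view of convergence, a sketch that plots a subset of the checkpoints from the table above (values copied verbatim from the table; matplotlib assumed available):

```python
import matplotlib.pyplot as plt

# (step, validation loss) points sampled from the results table above
steps = [150, 1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 9350]
val_loss = [3.3649, 0.7944, 0.6768, 0.6345, 0.6078, 0.5944,
            0.5816, 0.5791, 0.5754, 0.5740, 0.5748]

plt.plot(steps, val_loss, marker="o")
plt.xlabel("Step")
plt.ylabel("Validation loss")
plt.title("ViMATCHA validation loss")
plt.show()
```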

### Framework versions

- Transformers 4.40.1
- PyTorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1