---
license: other
library_name: peft
tags:
  - trl
  - sft
  - generated_from_trainer
base_model: taide/TAIDE-LX-7B-Chat
model-index:
  - name: ROE_QA_TAIDE-LX-7B-Chat_Q100_80_20_V4
    results: []
---

# ROE_QA_TAIDE-LX-7B-Chat_Q100_80_20_V4

This model is a fine-tuned version of [taide/TAIDE-LX-7B-Chat](https://huggingface.co/taide/TAIDE-LX-7B-Chat) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.3344

## Model description

More information needed

## Intended uses & limitations

More information needed
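
Since this repository contains only PEFT adapter weights, inference requires loading the base model first and then attaching the adapter. The sketch below is a minimal, unofficial example; the adapter repo id `allen0909/ROE_QA_TAIDE-LX-7B-Chat_Q100_80_20_V4` and the placeholder prompt are assumptions, not confirmed by this card.

```python
# Minimal inference sketch for a PEFT adapter on top of TAIDE-LX-7B-Chat.
# The adapter repo id below is an assumption; substitute the real repo id
# or a local adapter directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "taide/TAIDE-LX-7B-Chat"
adapter_id = "allen0909/ROE_QA_TAIDE-LX-7B-Chat_Q100_80_20_V4"  # assumed

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
# Attach the trained adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

inputs = tokenizer("Your question here", return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```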

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch that mirrors them follows the list):

- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
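
The `trl`/`sft` tags suggest the model was trained with TRL's `SFTTrainer`. Below is a sketch of `TrainingArguments` mirroring the hyperparameters above; the output directory is an assumption, and the dataset and LoRA configuration are not specified by this card.

```python
# TrainingArguments mirroring the hyperparameters listed above.
# Only values documented in this card are set; output_dir is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ROE_QA_TAIDE-LX-7B-Chat_Q100_80_20_V4",  # assumed path
    learning_rate=5e-5,
    per_device_train_batch_size=1,   # train_batch_size: 1
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    seed=42,
    gradient_accumulation_steps=4,   # total train batch size: 1 * 4 = 4
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    fp16=True,                       # "Native AMP" mixed precision
)
# The Trainer's default optimizer (AdamW with betas=(0.9, 0.999) and
# epsilon=1e-08) matches the Adam settings reported above.
```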

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 4.8611 | 0.0321 | 100 | 3.4673 |
| 4.6379 | 0.0643 | 200 | 3.2251 |
| 3.817 | 0.0964 | 300 | 2.8123 |
| 3.2954 | 0.1285 | 400 | 2.4988 |
| 2.9281 | 0.1607 | 500 | 2.2574 |
| 2.2501 | 0.1928 | 600 | 1.9403 |
| 2.1188 | 0.2249 | 700 | 1.7765 |
| 1.8027 | 0.2571 | 800 | 1.6201 |
| 1.6638 | 0.2892 | 900 | 1.4923 |
| 1.6204 | 0.3213 | 1000 | 1.3941 |
| 1.6152 | 0.3535 | 1100 | 1.2081 |
| 1.0876 | 0.3856 | 1200 | 1.0603 |
| 1.2552 | 0.4177 | 1300 | 1.0356 |
| 0.9443 | 0.4499 | 1400 | 0.8980 |
| 0.9998 | 0.4820 | 1500 | 0.8488 |
| 1.0518 | 0.5141 | 1600 | 0.7801 |
| 0.984 | 0.5463 | 1700 | 0.7711 |
| 0.9595 | 0.5784 | 1800 | 0.6924 |
| 0.9363 | 0.6105 | 1900 | 0.6537 |
| 0.822 | 0.6427 | 2000 | 0.6399 |
| 0.8791 | 0.6748 | 2100 | 0.6050 |
| 0.8802 | 0.7069 | 2200 | 0.5914 |
| 0.788 | 0.7391 | 2300 | 0.5741 |
| 0.7853 | 0.7712 | 2400 | 0.5581 |
| 0.839 | 0.8033 | 2500 | 0.5398 |
| 0.847 | 0.8355 | 2600 | 0.5204 |
| 0.8446 | 0.8676 | 2700 | 0.5085 |
| 0.7599 | 0.8997 | 2800 | 0.4867 |
| 0.7602 | 0.9319 | 2900 | 0.4891 |
| 0.8813 | 0.9640 | 3000 | 0.4717 |
| 0.6966 | 0.9961 | 3100 | 0.4667 |
| 0.7106 | 1.0283 | 3200 | 0.4463 |
| 0.6553 | 1.0604 | 3300 | 0.4414 |
| 0.6721 | 1.0925 | 3400 | 0.4383 |
| 0.6625 | 1.1247 | 3500 | 0.4326 |
| 0.6458 | 1.1568 | 3600 | 0.4255 |
| 0.6491 | 1.1889 | 3700 | 0.4207 |
| 0.6773 | 1.2211 | 3800 | 0.4246 |
| 0.6972 | 1.2532 | 3900 | 0.4119 |
| 0.7315 | 1.2853 | 4000 | 0.4101 |
| 0.6167 | 1.3175 | 4100 | 0.4119 |
| 0.6052 | 1.3496 | 4200 | 0.4070 |
| 0.6168 | 1.3817 | 4300 | 0.4029 |
| 0.6283 | 1.4139 | 4400 | 0.3986 |
| 0.6434 | 1.4460 | 4500 | 0.3940 |
| 0.6244 | 1.4781 | 4600 | 0.3940 |
| 0.6385 | 1.5103 | 4700 | 0.3958 |
| 0.6577 | 1.5424 | 4800 | 0.3952 |
| 0.6616 | 1.5746 | 4900 | 0.3925 |
| 0.6295 | 1.6067 | 5000 | 0.3846 |
| 0.6327 | 1.6388 | 5100 | 0.3832 |
| 0.6539 | 1.6710 | 5200 | 0.3826 |
| 0.6291 | 1.7031 | 5300 | 0.3851 |
| 0.5879 | 1.7352 | 5400 | 0.3833 |
| 0.6002 | 1.7674 | 5500 | 0.3787 |
| 0.5673 | 1.7995 | 5600 | 0.3755 |
| 0.5956 | 1.8316 | 5700 | 0.3746 |
| 0.6186 | 1.8638 | 5800 | 0.3735 |
| 0.5756 | 1.8959 | 5900 | 0.3712 |
| 0.6281 | 1.9280 | 6000 | 0.3735 |
| 0.5736 | 1.9602 | 6100 | 0.3696 |
| 0.5762 | 1.9923 | 6200 | 0.3700 |
| 0.3585 | 2.0244 | 6300 | 0.3667 |
| 0.4526 | 2.0566 | 6400 | 0.3662 |
| 0.3883 | 2.0887 | 6500 | 0.3662 |
| 0.4083 | 2.1208 | 6600 | 0.3664 |
| 0.3973 | 2.1530 | 6700 | 0.3646 |
| 0.4206 | 2.1851 | 6800 | 0.3673 |
| 0.5028 | 2.2172 | 6900 | 0.3622 |
| 0.366 | 2.2494 | 7000 | 0.3629 |
| 0.3968 | 2.2815 | 7100 | 0.3620 |
| 0.3977 | 2.3136 | 7200 | 0.3582 |
| 0.3899 | 2.3458 | 7300 | 0.3583 |
| 0.393 | 2.3779 | 7400 | 0.3572 |
| 0.4467 | 2.4100 | 7500 | 0.3576 |
| 0.4327 | 2.4422 | 7600 | 0.3605 |
| 0.4128 | 2.4743 | 7700 | 0.3550 |
| 0.3888 | 2.5064 | 7800 | 0.3572 |
| 0.4373 | 2.5386 | 7900 | 0.3553 |
| 0.3817 | 2.5707 | 8000 | 0.3525 |
| 0.3852 | 2.6028 | 8100 | 0.3518 |
| 0.4573 | 2.6350 | 8200 | 0.3536 |
| 0.3833 | 2.6671 | 8300 | 0.3499 |
| 0.3731 | 2.6992 | 8400 | 0.3509 |
| 0.3867 | 2.7314 | 8500 | 0.3497 |
| 0.351 | 2.7635 | 8600 | 0.3475 |
| 0.4327 | 2.7956 | 8700 | 0.3485 |
| 0.4183 | 2.8278 | 8800 | 0.3478 |
| 0.3807 | 2.8599 | 8900 | 0.3488 |
| 0.397 | 2.8920 | 9000 | 0.3461 |
| 0.4218 | 2.9242 | 9100 | 0.3463 |
| 0.3684 | 2.9563 | 9200 | 0.3457 |
| 0.3992 | 2.9884 | 9300 | 0.3448 |
| 0.2395 | 3.0206 | 9400 | 0.3463 |
| 0.2221 | 3.0527 | 9500 | 0.3448 |
| 0.222 | 3.0848 | 9600 | 0.3451 |
| 0.2359 | 3.1170 | 9700 | 0.3439 |
| 0.227 | 3.1491 | 9800 | 0.3442 |
| 0.2083 | 3.1812 | 9900 | 0.3430 |
| 0.2141 | 3.2134 | 10000 | 0.3419 |
| 0.224 | 3.2455 | 10100 | 0.3429 |
| 0.2031 | 3.2776 | 10200 | 0.3420 |
| 0.1944 | 3.3098 | 10300 | 0.3414 |
| 0.2109 | 3.3419 | 10400 | 0.3397 |
| 0.2053 | 3.3740 | 10500 | 0.3412 |
| 0.2172 | 3.4062 | 10600 | 0.3398 |
| 0.2179 | 3.4383 | 10700 | 0.3385 |
| 0.2299 | 3.4704 | 10800 | 0.3383 |
| 0.1721 | 3.5026 | 10900 | 0.3389 |
| 0.2131 | 3.5347 | 11000 | 0.3387 |
| 0.2489 | 3.5668 | 11100 | 0.3376 |
| 0.224 | 3.5990 | 11200 | 0.3373 |
| 0.201 | 3.6311 | 11300 | 0.3370 |
| 0.2131 | 3.6632 | 11400 | 0.3364 |
| 0.1869 | 3.6954 | 11500 | 0.3369 |
| 0.2174 | 3.7275 | 11600 | 0.3358 |
| 0.1999 | 3.7596 | 11700 | 0.3355 |
| 0.2224 | 3.7918 | 11800 | 0.3352 |
| 0.2195 | 3.8239 | 11900 | 0.3346 |
| 0.1984 | 3.8560 | 12000 | 0.3344 |
| 0.2121 | 3.8882 | 12100 | 0.3337 |
| 0.1879 | 3.9203 | 12200 | 0.3344 |

### Framework versions

- PEFT 0.12.1.dev0
- Transformers 4.44.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1