
DNADebertaK6f

This model is a fine-tuned version of an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3707
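
The loss reported above is presumably the mean masked-language-modelling cross-entropy, as is usual for DeBERTa-style pretraining; under that assumption it corresponds to a perplexity of exp(1.3707) ≈ 3.94 over the token vocabulary (which the model name suggests consists of DNA 6-mers). A minimal loading sketch follows; the `<org>` namespace and the masked-LM head are assumptions, not confirmed by this card:

```python
import math

from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical Hub id: replace <org> with the model's actual namespace.
model_id = "<org>/DNADebertaK6f"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)  # assumes an MLM head

# If the reported evaluation loss is the mean cross-entropy over masked
# tokens, the corresponding perplexity is exp(loss).
print(math.exp(1.3707))  # ≈ 3.94
```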

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` configuration is sketched after the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
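
As a sketch, these settings map onto a Hugging Face `TrainingArguments` object roughly as follows; `output_dir` is a placeholder, and the actual training script may have differed:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dnadebertak6f",   # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,               # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                    # "Native AMP" mixed-precision training
)
```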

Training results

Training Loss   Epoch    Step        Validation Loss
2.1545          1.24     100000      1.5895
1.5727          2.49     200000      1.5383
1.5368          3.73     300000      1.5117
1.5143          4.97     400000      1.4926
1.4969          6.22     500000      1.4789
1.4841          7.46     600000      1.4677
1.4750          8.70     700000      1.4599
1.4677          9.95     800000      1.4533
1.4623          11.19    900000      1.4491
1.4576          12.43    1000000     1.4461
1.4544          13.68    1100000     1.4420
1.4510          14.92    1200000     1.4381
1.4482          16.16    1300000     1.4359
1.4460          17.40    1400000     1.4345
1.4430          18.65    1500000     1.4320
1.4412          19.89    1600000     1.4295
1.4385          21.13    1700000     1.4278
1.4368          22.38    1800000     1.4249
1.4346          23.62    1900000     1.4237
1.4330          24.86    2000000     1.4219
1.4315          26.11    2100000     1.4201
1.4297          27.35    2200000     1.4198
1.4282          28.59    2300000     1.4180
1.4266          29.84    2400000     1.4142
1.4253          31.08    2500000     1.4146
1.4238          32.32    2600000     1.4130
1.4228          33.57    2700000     1.4113
1.4221          34.81    2800000     1.4100
1.4200          36.05    2900000     1.4097
1.4188          37.30    3000000     1.4085
1.4174          38.54    3100000     1.4067
1.4161          39.78    3200000     1.4064
1.4149          41.03    3300000     1.4058
1.4139          42.27    3400000     1.4024
1.4134          43.51    3500000     1.4022
1.4126          44.76    3600000     1.4025
1.4117          46.00    3700000     1.4015
1.4110          47.24    3800000     1.4001
1.4098          48.49    3900000     1.3968
1.4096          49.73    4000000     1.3997
1.4089          50.97    4100000     1.3974
1.4084          52.21    4200000     1.3972
1.4072          53.46    4300000     1.3965
1.4066          54.70    4400000     1.3974
1.4062          55.94    4500000     1.3960
1.4058          57.19    4600000     1.3958
1.4053          58.43    4700000     1.3950
1.4041          59.67    4800000     1.3936
1.4041          60.92    4900000     1.3963
1.4031          62.16    5000000     1.3915
1.4023          63.40    5100000     1.3917
1.4022          64.65    5200000     1.3930
1.4017          65.89    5300000     1.3904
1.4009          67.13    5400000     1.3899
1.4007          68.38    5500000     1.3892
1.3997          69.62    5600000     1.3910
1.3996          70.86    5700000     1.3892
1.3991          72.11    5800000     1.3890
1.3983          73.35    5900000     1.3870
1.3985          74.59    6000000     1.3889
1.3975          75.84    6100000     1.3865
1.3973          77.08    6200000     1.3852
1.3969          78.32    6300000     1.3869
1.3964          79.57    6400000     1.3843
1.3960          80.81    6500000     1.3853
1.3955          82.05    6600000     1.3844
1.3952          83.30    6700000     1.3863
1.3950          84.54    6800000     1.3835
1.3948          85.78    6900000     1.3841
1.3940          87.02    7000000     1.3850
1.3934          88.27    7100000     1.3827
1.3932          89.51    7200000     1.3830
1.3929          90.75    7300000     1.3821
1.3920          92.00    7400000     1.3820
1.3920          93.24    7500000     1.3837
1.3913          94.48    7600000     1.3817
1.3909          95.73    7700000     1.3836
1.3906          96.97    7800000     1.3811
1.3903          98.21    7900000     1.3806
1.3902          99.46    8000000     1.3807
1.3896          100.70   8100000     1.3804
1.3895          101.94   8200000     1.3805
1.3891          103.19   8300000     1.3821
1.3889          104.43   8400000     1.3833
1.3881          105.67   8500000     1.3788
1.3880          106.92   8600000     1.3818
1.3876          108.16   8700000     1.3806
1.3870          109.40   8800000     1.3766
1.3870          110.65   8900000     1.3765
1.3865          111.89   9000000     1.3800
1.3864          113.13   9100000     1.3830
1.3860          114.38   9200000     1.3770
1.3853          115.62   9300000     1.3771
1.3852          116.86   9400000     1.3772
1.3848          118.10   9500000     1.3771
1.3840          119.35   9600000     1.3749
1.3843          120.59   9700000     1.3764
1.3836          121.83   9800000     1.3802
1.3833          123.08   9900000     1.3756
1.3831          124.32   10000000    1.3748
1.3821          125.56   10100000    1.3755
1.3817          126.81   10200000    1.3744
1.3819          128.05   10300000    1.3763
1.3810          129.29   10400000    1.3743
1.3805          130.54   10500000    1.3762
1.3804          131.78   10600000    1.3725
1.3795          133.02   10700000    1.3753
1.3791          134.27   10800000    1.3780
1.3785          135.51   10900000    1.3749
1.3781          136.75   11000000    1.3749
1.3779          138.00   11100000    1.3737
1.3772          139.24   11200000    1.3715
1.3763          140.48   11300000    1.3759
1.3761          141.73   11400000    1.3745
1.3752          142.97   11500000    1.3708
1.3744          144.21   11600000    1.3735
1.3736          145.46   11700000    1.3720
1.3728          146.70   11800000    1.3702
1.3714          147.94   11900000    1.3696
1.3706          149.19   12000000    1.3707
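
The validation loss falls quickly over the first few epochs and then improves only gradually, from 1.5895 at 100k steps to 1.3707 at 12M steps, and is still drifting slowly downward at epoch 149. A small sketch for visualising the trend from a few points transcribed from the table above:

```python
import matplotlib.pyplot as plt

# (step, validation loss) points taken from the table above
steps = [100_000, 2_000_000, 4_000_000, 6_000_000, 8_000_000, 10_000_000, 12_000_000]
val_loss = [1.5895, 1.4219, 1.3997, 1.3889, 1.3807, 1.3748, 1.3707]

plt.plot(steps, val_loss, marker="o")
plt.xlabel("Training step")
plt.ylabel("Validation loss")
plt.title("DNADebertaK6f validation loss")
plt.show()
```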

Framework versions

  • Transformers 4.27.3
  • Pytorch 2.0.0
  • Datasets 2.10.1
  • Tokenizers 0.13.2
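
To reproduce this environment, the listed versions can be pinned at install time, e.g. `pip install transformers==4.27.3 torch==2.0.0 datasets==2.10.1 tokenizers==0.13.2` (note that Pytorch is published on PyPI as `torch`).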