
timesheet_estimator

This model is a fine-tuned version of an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.5383
  • MSE: 0.5383
  • RMSE: 0.7337
  • MAE: 0.5091
  • R²: 0.4827
  • SMAPE: 89.7730
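
The card does not include the compute_metrics function that produced these numbers. For reference only, a minimal sketch using standard regression-metric definitions is given below; it assumes SMAPE is reported in percent and that the model emits a single regression output per example, and the epsilon guard is illustrative rather than taken from the original repository.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def compute_metrics(eval_pred):
    """Illustrative regression metrics for a Hugging Face Trainer (not the card's own code)."""
    predictions, labels = eval_pred
    predictions = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()

    mse = mean_squared_error(labels, predictions)
    mae = mean_absolute_error(labels, predictions)
    r2 = r2_score(labels, predictions)
    # Symmetric MAPE in percent; the small constant avoids division by zero.
    smape = 100.0 * np.mean(
        2.0 * np.abs(predictions - labels)
        / (np.abs(labels) + np.abs(predictions) + 1e-8)
    )
    return {"mse": mse, "rmse": np.sqrt(mse), "mae": mae, "r2": r2, "smape": smape}
```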

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
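
For readers who want to reproduce this configuration, a hedged TrainingArguments sketch matching the list above is shown below. The output directory is a placeholder, evaluation every 300 steps is inferred from the results table rather than stated in the card, and the listed Adam betas and epsilon are the Trainer defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="timesheet_estimator",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",       # assumption: eval cadence inferred from the results table
    eval_steps=300,
)
```

These arguments would then be passed to a Trainer together with the (unspecified) base model, the training and evaluation datasets, and a compute_metrics function such as the sketch in the evaluation section above.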

Training results

| Training Loss | Epoch | Step | Validation Loss | MAE    | MSE    | R²     | RMSE   | SMAPE    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:------:|:--------:|
| No log        | 0.46  | 300  | 0.7945          | 0.6404 | 0.7945 | 0.2018 | 0.8914 | 120.6558 |
| No log        | 0.91  | 600  | 0.6674          | 0.5948 | 0.6674 | 0.3295 | 0.8169 | 109.9213 |
| No log        | 1.37  | 900  | 0.6795          | 0.5998 | 0.6795 | 0.3174 | 0.8243 | 108.4883 |
| 0.7094        | 1.82  | 1200 | 0.6327          | 0.5916 | 0.6327 | 0.3643 | 0.7954 | 111.5772 |
| 0.7094        | 2.28  | 1500 | 0.6216          | 0.5712 | 0.6216 | 0.3755 | 0.7884 | 99.9257  |
| 0.7094        | 2.73  | 1800 | 0.5770          | 0.5397 | 0.5770 | 0.4203 | 0.7596 | 100.1235 |
| 0.5129        | 3.19  | 2100 | 0.5791          | 0.5391 | 0.5791 | 0.4182 | 0.7610 | 99.9525  |
| 0.5129        | 3.64  | 2400 | 0.5796          | 0.5421 | 0.5796 | 0.4177 | 0.7613 | 99.3905  |
| 0.5129        | 4.1   | 2700 | 0.5720          | 0.5354 | 0.5720 | 0.4254 | 0.7563 | 98.9299  |
| 0.4448        | 4.55  | 3000 | 0.5801          | 0.5381 | 0.5801 | 0.4173 | 0.7616 | 96.1430  |
| 0.4448        | 5.01  | 3300 | 0.5437          | 0.5185 | 0.5437 | 0.4775 | 0.7373 | 94.1203  |
| 0.4448        | 5.46  | 3600 | 0.5111          | 0.4949 | 0.5111 | 0.5088 | 0.7149 | 92.1147  |
| 0.4448        | 5.92  | 3900 | 0.5234          | 0.5106 | 0.5234 | 0.4970 | 0.7235 | 95.4636  |
| 0.4877        | 6.37  | 4200 | 0.5478          | 0.5249 | 0.5478 | 0.4735 | 0.7402 | 94.5022  |
| 0.4877        | 6.83  | 4500 | 0.5172          | 0.4998 | 0.5172 | 0.5029 | 0.7192 | 93.0563  |
| 0.4877        | 7.28  | 4800 | 0.5318          | 0.5083 | 0.5318 | 0.4889 | 0.7293 | 90.5273  |
| 0.3889        | 7.74  | 5100 | 0.5845          | 0.5377 | 0.5845 | 0.4383 | 0.7645 | 93.8608  |
| 0.3889        | 8.19  | 5400 | 0.5315          | 0.5014 | 0.5315 | 0.4892 | 0.7291 | 90.2302  |
| 0.3889        | 8.65  | 5700 | 0.5356          | 0.5010 | 0.5356 | 0.4852 | 0.7319 | 88.9946  |
| 0.324         | 9.1   | 6000 | 0.5345          | 0.5028 | 0.5345 | 0.4864 | 0.7311 | 89.7148  |
| 0.324         | 9.56  | 6300 | 0.5383          | 0.5091 | 0.5383 | 0.4827 | 0.7337 | 89.7730  |

Framework versions

  • Transformers 4.27.0.dev0
  • Pytorch 1.13.1
  • Datasets 2.9.0
  • Tokenizers 0.13.2