
fresh-2-layer-qasc5000-distill-of-fresh-2-layer-mmlu_EVAL_mmlu

This model is a fine-tuned version of an unspecified base model, trained on an unknown dataset. It achieves the following results on the evaluation set (a generic loading sketch follows the list):

  • Loss: 207.7788
  • Accuracy: 0.3764
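
Because the base model, task head, and pipeline type are not documented here, the snippet below is only a generic loading sketch: the repository id is assumed from the card title, and the plain AutoModel/AutoTokenizer classes are placeholders for whatever task-specific classes this model actually requires.

```python
# Generic loading sketch only; not the documented usage of this model.
# The repository id is assumed from the card title (the owning namespace
# is not shown), and AutoModel/AutoTokenizer are placeholders because the
# actual task head / pipeline type is not documented in this card.
from transformers import AutoModel, AutoTokenizer

repo_id = "fresh-2-layer-qasc5000-distill-of-fresh-2-layer-mmlu_EVAL_mmlu"  # assumed

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

inputs = tokenizer("Example input text", return_tensors="pt")
outputs = model(**inputs)  # raw hidden states; downstream scoring is model-specific
```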

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows this list):

  • learning_rate: 0.0005
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 321
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
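
For reference, a minimal sketch of how these values map onto transformers.TrainingArguments; the output directory is an assumption, and the actual training script (including any distillation-specific Trainer subclass) is not part of this card.

```python
# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# The output_dir is an assumed name; the real training/distillation script
# is not documented here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fresh-2-layer-qasc5000-distill-of-fresh-2-layer-mmlu",  # assumed
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=321,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,          # training_steps
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```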

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.64  | 100  | 202.2836        | 0.24     |
| No log        | 1.27  | 200  | 206.3151        | 0.34     |
| No log        | 1.91  | 300  | 195.0245        | 0.35     |
| No log        | 2.55  | 400  | 201.5064        | 0.388    |
| 98.9054       | 3.18  | 500  | 200.6433        | 0.386    |
| 98.9054       | 3.82  | 600  | 193.7714        | 0.388    |
| 98.9054       | 4.46  | 700  | 189.6684        | 0.39     |
| 98.9054       | 5.1   | 800  | 192.2281        | 0.378    |
| 98.9054       | 5.73  | 900  | 194.5975        | 0.388    |
| 23.6632       | 6.37  | 1000 | 199.0726        | 0.42     |
| 23.6632       | 7.01  | 1100 | 192.3244        | 0.402    |
| 23.6632       | 7.64  | 1200 | 186.5772        | 0.402    |
| 23.6632       | 8.28  | 1300 | 190.7620        | 0.402    |
| 23.6632       | 8.92  | 1400 | 198.1279        | 0.394    |
| 14.2685       | 9.55  | 1500 | 194.6297        | 0.366    |
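
To visualize the trend in the table above (validation accuracy peaks at 0.42 around step 1000 and drifts down afterwards), here is a small plotting sketch, assuming matplotlib is installed:

```python
# Re-plots the Step and Accuracy columns from the results table above.
import matplotlib.pyplot as plt

steps = [100, 200, 300, 400, 500, 600, 700, 800, 900,
         1000, 1100, 1200, 1300, 1400, 1500]
accuracy = [0.24, 0.34, 0.35, 0.388, 0.386, 0.388, 0.39, 0.378, 0.388,
            0.42, 0.402, 0.402, 0.402, 0.394, 0.366]

plt.plot(steps, accuracy, marker="o")
plt.xlabel("Step")
plt.ylabel("Validation accuracy")
plt.title("Validation accuracy over training steps")
plt.show()
```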

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0