fresh-2-layer-qasc2000-distill-of-fresh-2-layer-mmlu_EVAL_mmlu

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 207.2272
  • Accuracy: 0.3112

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 321
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
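The `linear` scheduler with 500 warmup steps corresponds to a learning rate that ramps linearly from 0 to 0.0005 over the first 500 steps, then decays linearly back to 0 by step 5000. A minimal sketch of that schedule (the function name is illustrative, not from the training code):

```python
def lr_at_step(step, base_lr=5e-4, warmup_steps=500, total_steps=5000):
    """Learning rate under linear warmup followed by linear decay,
    matching the card's hyperparameters (lr_scheduler_type: linear,
    lr_scheduler_warmup_steps: 500, training_steps: 5000)."""
    if step < warmup_steps:
        # Warmup phase: ramp from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: ramp from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))


print(lr_at_step(250))   # mid-warmup: 0.00025
print(lr_at_step(500))   # peak: 0.0005
print(lr_at_step(5000))  # end of training: 0.0
```

This is the behavior of the standard linear-with-warmup schedule used by the Transformers `Trainer` when `lr_scheduler_type` is `linear`.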

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.59  | 100  | 201.7617        | 0.23     |
| No log        | 3.17  | 200  | 215.9669        | 0.294    |
| No log        | 4.76  | 300  | 202.8176        | 0.334    |
| No log        | 6.35  | 400  | 207.7114        | 0.356    |
| 92.1591       | 7.94  | 500  | 219.4992        | 0.322    |
| 92.1591       | 9.52  | 600  | 219.8570        | 0.346    |
| 92.1591       | 11.11 | 700  | 214.9255        | 0.352    |
| 92.1591       | 12.7  | 800  | 216.3126        | 0.336    |
| 92.1591       | 14.29 | 900  | 216.1720        | 0.328    |

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0