---
base_model: huawei-noah/TinyBERT_General_4L_312D
tags:
  - generated_from_trainer
datasets:
  - sst2
metrics:
  - accuracy
model-index:
  - name: TinyBERT_SST2
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: sst2
          type: sst2
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8864678899082569
---

# TinyBERT_SST2

This model is a fine-tuned version of [huawei-noah/TinyBERT_General_4L_312D](https://huggingface.co/huawei-noah/TinyBERT_General_4L_312D) on the sst2 dataset. It achieves the following results on the evaluation set:

- Loss: 0.5142
- Accuracy: 0.8865
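
For quick inference, the checkpoint can be loaded with the Transformers `text-classification` pipeline. The snippet below is a minimal sketch; it assumes the model is published under the repository id `Vishnou/TinyBERT_SST2` and that the default `LABEL_0`/`LABEL_1` names map to negative/positive sentiment.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint (repository id assumed from this model card).
classifier = pipeline("text-classification", model="Vishnou/TinyBERT_SST2")

# SST-2 is binary sentiment classification (negative vs. positive).
print(classifier("A charming and often affecting journey."))
# -> e.g. [{'label': 'LABEL_1', 'score': 0.99...}], depending on the configured label names
```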

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
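
A rough, non-authoritative mapping of these settings onto `transformers.TrainingArguments`; options not listed above (output directory, evaluation cadence) are assumptions inferred from the results table.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; anything not stated in the list
# above (output_dir, evaluation/save cadence) is an assumption.
training_args = TrainingArguments(
    output_dir="TinyBERT_SST2",   # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="steps",  # assumed: the results table logs every 500 steps
    eval_steps=500,               # assumed from the results table
)
```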

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.4686        | 0.06  | 500   | 0.4020          | 0.8337   |
| 0.384         | 0.12  | 1000  | 0.3666          | 0.8360   |
| 0.381         | 0.18  | 1500  | 0.3951          | 0.8337   |
| 0.3609        | 0.24  | 2000  | 0.4378          | 0.8555   |
| 0.3616        | 0.3   | 2500  | 0.3743          | 0.8475   |
| 0.3521        | 0.36  | 3000  | 0.3692          | 0.8589   |
| 0.3113        | 0.42  | 3500  | 0.5072          | 0.8486   |
| 0.319         | 0.48  | 4000  | 0.4212          | 0.8612   |
| 0.3034        | 0.53  | 4500  | 0.4555          | 0.8647   |
| 0.3098        | 0.59  | 5000  | 0.4163          | 0.8635   |
| 0.3113        | 0.65  | 5500  | 0.5226          | 0.8440   |
| 0.2949        | 0.71  | 6000  | 0.4137          | 0.875    |
| 0.2977        | 0.77  | 6500  | 0.4775          | 0.8486   |
| 0.3077        | 0.83  | 7000  | 0.4774          | 0.8693   |
| 0.2953        | 0.89  | 7500  | 0.4491          | 0.8589   |
| 0.2846        | 0.95  | 8000  | 0.5228          | 0.8784   |
| 0.292         | 1.01  | 8500  | 0.4801          | 0.8865   |
| 0.2185        | 1.07  | 9000  | 0.4889          | 0.8933   |
| 0.2343        | 1.13  | 9500  | 0.5862          | 0.8716   |
| 0.2667        | 1.19  | 10000 | 0.4796          | 0.8842   |
| 0.252         | 1.25  | 10500 | 0.5181          | 0.8842   |
| 0.2385        | 1.31  | 11000 | 0.5148          | 0.875    |
| 0.2144        | 1.37  | 11500 | 0.5345          | 0.8704   |
| 0.2348        | 1.43  | 12000 | 0.5073          | 0.8807   |
| 0.2166        | 1.48  | 12500 | 0.4885          | 0.8865   |
| 0.2104        | 1.54  | 13000 | 0.6118          | 0.8658   |
| 0.2145        | 1.6   | 13500 | 0.5091          | 0.8865   |
| 0.2098        | 1.66  | 14000 | 0.5221          | 0.8876   |
| 0.2111        | 1.72  | 14500 | 0.5031          | 0.8888   |
| 0.2042        | 1.78  | 15000 | 0.5257          | 0.8796   |
| 0.2091        | 1.84  | 15500 | 0.5175          | 0.8819   |
| 0.2027        | 1.9   | 16000 | 0.5528          | 0.8784   |
| 0.2173        | 1.96  | 16500 | 0.5142          | 0.8865   |
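
The reported validation accuracy can be re-checked with a short script along these lines; the repository id and the `LABEL_0`/`LABEL_1` naming are assumptions, and the `sst2` validation split is the one named in the metadata.

```python
from datasets import load_dataset
from transformers import pipeline

# SST-2 validation split (872 sentences) and the fine-tuned checkpoint (repo id assumed).
dataset = load_dataset("sst2", split="validation")
classifier = pipeline("text-classification", model="Vishnou/TinyBERT_SST2")

# Predict in batches and map LABEL_0 / LABEL_1 back to the dataset's 0/1 ids (label names assumed).
predictions = classifier(dataset["sentence"], batch_size=32)
pred_ids = [int(p["label"].split("_")[-1]) for p in predictions]

accuracy = sum(int(p == y) for p, y in zip(pred_ids, dataset["label"])) / len(dataset)
print(f"validation accuracy: {accuracy:.4f}")
```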

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0