---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - clinc_oos
metrics:
  - accuracy
base_model: distilbert-base-uncased
model-index:
  - name: kd-distilBERT-clinc
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: clinc_oos
          type: clinc_oos
          config: plus
          split: train
          args: plus
        metrics:
          - type: accuracy
            value: 0.9158064516129032
            name: Accuracy
---

# kd-distilBERT-clinc

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [clinc_oos](https://huggingface.co/datasets/clinc_oos) dataset. It achieves the following results on the evaluation set:

- Loss: 0.7857
- Accuracy: 0.9158
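
The checkpoint can be used directly for intent classification. A minimal inference sketch, assuming the checkpoint is available locally or on the Hub under a repo id such as `kd-distilBERT-clinc` (the exact path is not stated in this card):

```python
# Minimal inference sketch. The model path below is a placeholder;
# substitute the actual Hub repo id or local checkpoint directory.
from transformers import pipeline

classifier = pipeline("text-classification", model="kd-distilBERT-clinc")

# clinc_oos contains short banking/travel-style intent queries, e.g.:
print(classifier("how do i set up direct deposit for my paycheck"))
# -> [{'label': ..., 'score': ...}]
```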

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the training sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
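
The hyperparameters above map onto `TrainingArguments` as sketched below. This is a reconstruction under assumptions, not the author's script: the "kd" prefix in the model name suggests knowledge distillation from a teacher model, which is not documented in this card, so the sketch shows plain fine-tuning; the tokenization and metric helpers are illustrative.

```python
# Hedged reconstruction of the training setup from the listed hyperparameters.
# Plain fine-tuning is shown; any knowledge-distillation loss (implied by the
# "kd" prefix in the model name) is undocumented here and therefore omitted.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("clinc_oos", "plus")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

encoded = dataset.map(tokenize, batched=True)
encoded = encoded.rename_column("intent", "labels")

num_labels = dataset["train"].features["intent"].num_classes  # 151 for "plus"
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=num_labels
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

args = TrainingArguments(
    output_dir="kd-distilBERT-clinc",
    learning_rate=2e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
```

For reference, the 318 steps per epoch in the results table are consistent with the 15,250-example `plus` train split at batch size 48.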

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.2955        | 1.0   | 318  | 3.2896          | 0.7232   |
| 2.6293        | 2.0   | 636  | 1.8798          | 0.8410   |
| 1.5527        | 3.0   | 954  | 1.1648          | 0.8881   |
| 1.0164        | 4.0   | 1272 | 0.8682          | 0.9145   |
| 0.8043        | 5.0   | 1590 | 0.7857          | 0.9158   |
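
The final row matches the reported evaluation numbers. A hedged sketch for re-checking the accuracy on the clinc_oos `plus` validation split (the checkpoint path is again a placeholder):

```python
# Re-check validation accuracy. "kd-distilBERT-clinc" is a placeholder path.
import numpy as np
from datasets import load_dataset
from transformers import pipeline

val = load_dataset("clinc_oos", "plus", split="validation")
clf = pipeline("text-classification", model="kd-distilBERT-clinc")

preds = clf(val["text"], batch_size=48)
pred_ids = [clf.model.config.label2id[p["label"]] for p in preds]
print("accuracy:", np.mean(np.array(pred_ids) == np.array(val["intent"])))
```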

### Framework versions

- Transformers 4.25.1
- PyTorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2