---
language: en
tags:
  - bert
  - rte
  - glue
  - kd
  - torchdistill
license: apache-2.0
datasets:
  - rte
metrics:
  - accuracy
---

`bert-base-uncased` fine-tuned on the RTE dataset, using a fine-tuned `bert-large-uncased` as the teacher model, with torchdistill and Google Colab for knowledge distillation.
The training configuration (including hyperparameters) is available here.
I submitted prediction files to the GLUE leaderboard, and the overall GLUE score was 78.9.
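
The distillation objective itself is defined in the torchdistill training configuration. As a rough illustration only, the sketch below shows a generic Hinton-style soft-target loss in PyTorch; the `temperature` and `alpha` values are placeholders, not the hyperparameters used for this model.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 as in Hinton et al. (2015). Values here are illustrative.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary cross-entropy against the gold RTE labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```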
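
For reference, a minimal sketch of loading the student model for RTE inference with the `transformers` library. The repository id below is an assumption based on this card's naming; substitute the actual Hub path if it differs.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed Hub repository id for this model; replace if the actual path differs.
model_id = "yoshitomo-matsubara/bert-base-uncased-rte_from_bert-large-uncased-rte"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# RTE is a sentence-pair entailment task: encode premise and hypothesis together.
premise = "A man is playing a guitar on stage."
hypothesis = "A man is performing music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the argmax class index to its label (e.g. entailment / not_entailment).
print(model.config.id2label[logits.argmax(dim=-1).item()])
```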