
bert-large-uncased fine-tuned on the QNLI dataset (part of the GLUE benchmark), using torchdistill and Google Colab.
The hyperparameters match those in Hugging Face's example and/or the BERT paper, and the full training configuration (including hyperparameters) is available here.
I submitted prediction files to the GLUE leaderboard, and the overall GLUE score was 80.2.
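As a minimal usage sketch, the model can be loaded with the Transformers `text-classification` pipeline; QNLI inputs are (question, sentence) pairs, and the model predicts whether the sentence answers the question. The repo id below is a placeholder, not the actual model id on the Hub.

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual model id on the Hub.
MODEL_ID = "your-username/bert-large-uncased-qnli"


def predict_entailment(question: str, sentence: str, model_id: str = MODEL_ID) -> str:
    """Classify a QNLI pair: does `sentence` contain the answer to `question`?

    Returns the predicted label (entailment vs. not_entailment).
    """
    classifier = pipeline("text-classification", model=model_id)
    # The pipeline accepts a dict with "text" and "text_pair" for sentence-pair tasks.
    result = classifier({"text": question, "text_pair": sentence})
    return result["label"]
```

For example, `predict_entailment("Where is the Eiffel Tower?", "The Eiffel Tower is in Paris.")` would return the model's predicted label for that pair.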
