yoshitomo-matsubara committed
Commit 370d5fe (1 parent: fd2e5db)

Update README.md

Files changed (1): README.md (+12, -0)
`bert-base-uncased` fine-tuned on the RTE dataset, using a fine-tuned `bert-large-uncased` as the teacher model, with [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_kd_and_submission.ipynb) for knowledge distillation.
The training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/rte/kd/bert_base_uncased_from_bert_large_uncased.yaml).
I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **78.9**.
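
For reference, knowledge distillation of this kind typically trains the student on a weighted combination of hard-label cross-entropy and a temperature-scaled KL term against the teacher's logits. The sketch below is a generic PyTorch illustration, not torchdistill's implementation; the temperature and weight values are placeholders, and the actual loss and hyperparameters are those in the YAML config linked above.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Generic knowledge distillation objective (illustrative values only):
    alpha-weighted temperature-scaled KL term plus hard-label cross-entropy."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 to keep gradient magnitudes comparable
    # across temperatures, as in Hinton et al.'s formulation.
    kl = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature**2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kl + (1.0 - alpha) * ce
```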
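The distilled model can be loaded with Hugging Face `transformers` as a standard sequence-classification checkpoint. A minimal sketch follows; `MODEL_ID` is a placeholder for this repository's actual Hub id, and the label mapping should be read from `model.config.id2label` rather than assumed.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder: replace with this repository's actual Hugging Face Hub id.
MODEL_ID = "yoshitomo-matsubara/bert-base-uncased-rte"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# RTE is a sentence-pair entailment task: the model scores whether the
# premise entails the hypothesis.
premise = "A man is playing a guitar on stage."
hypothesis = "A man is performing music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```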

Yoshitomo Matsubara: **"torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP"** at *EMNLP 2023 Workshop for Natural Language Processing Open Source Software (NLP-OSS)*

[[OpenReview](https://openreview.net/forum?id=A5Axeeu1Bo)] [[Preprint](https://arxiv.org/abs/2310.17644)]

```bibtex
@article{matsubara2023torchdistill,
  title={{torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP}},
  author={Matsubara, Yoshitomo},
  journal={arXiv preprint arXiv:2310.17644},
  year={2023}
}
```