dmis-lab committed on
Commit 22cb3e4
1 Parent(s): ee7d088

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -3,6 +3,7 @@ This model repository presents "TinyPubMedBERT", a distillated [PubMedBERT (Gu e
  The model is composed of 4 layers and distilled following the methods introduced in the [TinyBERT paper](https://aclanthology.org/2020.findings-emnlp.372/) (Jiao et al., 2020).
 
  * For the framework, please visit https://github.com/AstraZeneca/KAZU
+ * For the demo, please visit http://kazu.korea.ac.kr
  * For details about the model, please see our paper entitled **Biomedical NER for the Enterprise with Distillated BERN2 and the Kazu Framework** (EMNLP 2022 industry track).
 
  TinyPubMedBERT is used as the initial weights for training [dmis-lab/KAZU-NER-module-distil-v1.0](https://huggingface.co/dmis-lab/KAZU-NER-module-distil-v1.0) in the KAZU (Korea University and AstraZeneca) framework.
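
For readers who want to try the distilled checkpoint directly, below is a minimal sketch of loading it with the `transformers` library. The repository id `dmis-lab/TinyPubMedBERT-v1.0` is an assumption based on this repo's owner and is not stated in this commit; substitute the exact id shown on the model card.

```python
# Minimal sketch: loading TinyPubMedBERT with Hugging Face transformers.
# NOTE: "dmis-lab/TinyPubMedBERT-v1.0" is an assumed repository id, not confirmed by this commit.
from transformers import AutoModel, AutoTokenizer

model_id = "dmis-lab/TinyPubMedBERT-v1.0"  # replace with the actual model id if different
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a biomedical sentence and inspect the hidden states of the 4-layer distilled encoder.
inputs = tokenizer("BRCA1 mutations are associated with breast cancer.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```

For the NER use case described above, these weights would serve only as the starting point; the fine-tuned entity tagger is released separately as dmis-lab/KAZU-NER-module-distil-v1.0.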