---
title: README
emoji: 🏃
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
---

# Model Description
DistilClinicalBERT is a distilled version of the [BioClinicalBERT](https://huggingface.co/emilyalsentzer/Bio_ClinicalBERT) model, trained for 3 epochs with a total batch size of 192 on the MIMIC-III notes dataset.
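
For reference, a minimal usage sketch with 🤗 Transformers is shown below. The repository identifier `nlpie/distil-clinicalbert` is an assumption and should be replaced with the actual hub name if it differs.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# Hub identifier assumed here; swap in the actual repository name if different.
model_id = "nlpie/distil-clinicalbert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill a masked token in a clinical-style sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("The patient was started on [MASK] for hypertension."))
```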

# Distillation Procedure
This model is trained with a simple distillation technique that aligns the output distribution of the student with that of the teacher under the MLM objective. In addition, it can optionally use a second loss that aligns the last hidden states of the student and teacher.
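
A rough sketch of this kind of objective (not the exact training code) is given below: a KL term between softened teacher and student MLM output distributions, plus an optional cosine alignment of the last hidden states. The temperature and loss weights are placeholders, not the values used for this model.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, student_hidden, teacher_hidden,
                      temperature=2.0, alpha_kl=1.0, alpha_align=1.0):
    """Illustrative distillation objective: KL divergence between softened
    output distributions, plus optional cosine alignment of last hidden states."""
    # Align the student's MLM output distribution with the teacher's.
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Optional alignment of the last hidden states (cosine embedding loss).
    target = torch.ones(student_hidden.size(0) * student_hidden.size(1),
                        device=student_hidden.device)
    align = F.cosine_embedding_loss(
        student_hidden.view(-1, student_hidden.size(-1)),
        teacher_hidden.view(-1, teacher_hidden.size(-1)),
        target,
    )
    return alpha_kl * kl + alpha_align * align
```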

# Initialisation
Following [DistilBERT](https://huggingface.co/distilbert-base-uncased?text=The+goal+of+life+is+%5BMASK%5D.), we initialise the student model by taking weights from every other layer of the teacher.
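
A sketch of this initialisation, assuming standard BERT modules from 🤗 Transformers, is given below; the student configuration and the exact layer indices are illustrative rather than the training script's actual values.

```python
from transformers import AutoModel, BertConfig, BertModel

# Teacher: the 12-layer BioClinicalBERT encoder.
teacher = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")

# Student: a fresh 6-layer BERT with matching hidden and vocabulary sizes
# (the attention head count of 12 is assumed, not stated in this card).
student_config = BertConfig(vocab_size=28996, hidden_size=768,
                            num_hidden_layers=6, num_attention_heads=12,
                            intermediate_size=3072)
student = BertModel(student_config)

# Copy the embeddings, then one teacher layer out of two into the student.
# Indices 0, 2, 4, ... are one plausible "every other layer" choice.
student.embeddings.load_state_dict(teacher.embeddings.state_dict())
for student_idx, teacher_idx in enumerate(range(0, teacher.config.num_hidden_layers, 2)):
    student.encoder.layer[student_idx].load_state_dict(
        teacher.encoder.layer[teacher_idx].state_dict()
    )
```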

# Architecture
In this model, the hidden dimension and the embedding dimension are both set to 768, and the vocabulary size is 28996. The model has 6 transformer layers, and the expansion rate of the feed-forward layer is 4. Overall, this model has around 65 million parameters.
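
A quick way to sanity-check the approximate parameter count is sketched below. The configuration values come from this section; the attention head count of 12 is an assumption, and the ~65M figure is approximate.

```python
from transformers import BertConfig, BertModel

# Configuration matching the description above: hidden size 768, 6 layers,
# 4x feed-forward expansion (intermediate size 3072), vocabulary of 28996.
config = BertConfig(vocab_size=28996, hidden_size=768, num_hidden_layers=6,
                    num_attention_heads=12, intermediate_size=3072)
model = BertModel(config)

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.1f}M parameters")  # roughly 65M
```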

# Citation
If you use this model, please consider citing the following paper:

```bibtex
@misc{https://doi.org/10.48550/arxiv.2302.04725,
  doi = {10.48550/ARXIV.2302.04725},
  url = {https://arxiv.org/abs/2302.04725},
  author = {Rohanian, Omid and Nouriborji, Mohammadmahdi and Jauncey, Hannah and Kouchaki, Samaneh and Group, ISARIC Clinical Characterisation and Clifton, Lei and Merson, Laura and Clifton, David A.},
  keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Machine Learning (cs.LG), FOS: Computer and information sciences, I.2.7, 68T50},
  title = {Lightweight Transformers for Clinical Natural Language Processing},
  publisher = {arXiv},
  year = {2023},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```