DistilCamemBERT
===============

We present DistilCamemBERT, a distilled version of the well-named [CamemBERT](https://huggingface.co/camembert-base), a French RoBERTa model. The aim of distillation is to drastically reduce the complexity of the model while preserving its performance. The proof of concept is presented in the [DistilBERT paper](https://arxiv.org/abs/1910.01108), and the training code is inspired by that of [DistilBERT](https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation).
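As a minimal usage sketch, the distilled model loads through the 🤗 Transformers library like any other CamemBERT checkpoint. The hub id `cmarkea/distilcamembert-base` below is an assumption about where this checkpoint is published; adjust it if the model lives under a different id.

```python
# Minimal usage sketch (assumption: the checkpoint is published on the
# Hugging Face Hub under the id "cmarkea/distilcamembert-base").
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cmarkea/distilcamembert-base")
model = AutoModel.from_pretrained("cmarkea/distilcamembert-base")

# Encode a French sentence and run a forward pass.
inputs = tokenizer("Le camembert est délicieux.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```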

Loss function
-------------