Update README.md
README.md CHANGED
@@ -17,8 +17,6 @@ pipeline_tag: text2text-generation
This model was developed as part of the experiments reported in Stodden, Momen, and Kallmeyer (2023), ["DEplain: A German Parallel Corpus with Intralingual Translations into Plain Language for Sentence and Document Simplification."](https://arxiv.org/abs/2305.18939) In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Toronto, Canada. Association for Computational Linguistics.
Detailed documentation can be found in the GitHub repository: [https://github.com/rstodden/DEPlain](https://github.com/rstodden/DEPlain).
## Model Details
### Model Description
The model is a finetuned checkpoint of the pre-trained mBART model `mbart-large-cc25`, with its vocabulary trimmed to the 30k most frequent words of the German language.
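
As a rough illustration, the sketch below shows one way such a finetuned checkpoint could be loaded with the Hugging Face `transformers` library. The repository id is a placeholder (the actual model id is not stated in this section), and the input sentence and generation settings are illustrative assumptions rather than the authors' evaluation setup.

```python
# Minimal usage sketch (assumption: the finetuned checkpoint is published on the
# Hugging Face Hub; "<hub-repo-id>" is a placeholder, not the actual model id).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "<hub-repo-id>"  # replace with the actual repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# An illustrative German input sentence in standard (non-plain) language.
source = "Die Verordnung tritt am Tag nach ihrer Verkündung in Kraft."

inputs = tokenizer(source, return_tensors="pt")
# Beam search settings are an assumption, not the configuration used in the paper.
output_ids = model.generate(**inputs, num_beams=4, max_length=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Depending on how the checkpoint was exported, mBART-style language codes (e.g. `de_DE`) may additionally need to be set on the tokenizer before encoding and generation.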