DEplain German Text Simplification

This model is part of the experiments in Stodden, Momen, and Kallmeyer (2023), "DEplain: A German Parallel Corpus with Intralingual Translations into Plain Language for Sentence and Document Simplification," published in Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Toronto, Canada, Association for Computational Linguistics. Detailed documentation is available in the GitHub repository: https://github.com/rstodden/DEPlain

Model Description

The model is a fine-tuned checkpoint of the pre-trained mBART model mbart-large-cc25, with its vocabulary trimmed to the 30k most frequent German words.

The model was fine-tuned for sentence-level German text simplification.

The fine-tuning data consists of manually aligned sentence pairs from the datasets DEplain-APA-sent and DEplain-web-sent-manual-open.
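A minimal usage sketch with the Hugging Face transformers library is shown below. It assumes the checkpoint is published under the repository name `DEplain/trimmed_mbart_sents_apa_web` and that the tokenizer was uploaded alongside it; the generation parameters (beam size, maximum length) are illustrative choices, not values from the paper.

```python
# Hedged sketch: simplifying a German sentence with the fine-tuned mBART checkpoint.
# Model id and generation settings are assumptions, not confirmed by the card.
from transformers import AutoTokenizer, MBartForConditionalGeneration

model_name = "DEplain/trimmed_mbart_sents_apa_web"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

def simplify(sentence: str) -> str:
    """Generate a plain-language version of a German sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    output_ids = model.generate(**inputs, num_beams=4, max_length=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

complex_sentence = "Das Gebäude wurde aufgrund gravierender statischer Mängel evakuiert."
simplified = simplify(complex_sentence)
print(simplified)
```

Note that mBART checkpoints often expect source/target language codes (e.g. `de_DE`) to be set on the tokenizer; whether this trimmed-vocabulary variant requires them depends on how the checkpoint was exported, so check the repository documentation.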

