---
language:
- de
pipeline_tag: text2text-generation
metrics:
- f1
tags:
- grammatical error correction
- GEC
- german
---

This is a fine-tuned version of multilingual BART (mBART-50), trained on German, specifically on the public Falko-MERLIN dataset for Grammatical Error Correction.

To initialize the model:

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained("MRNH/mbart-german-grammar-corrector")
```

To generate text using the model:

```python
tokenizer = MBart50TokenizerFast.from_pretrained("MRNH/mbart-german-grammar-corrector",
                                                 src_lang="de_DE", tgt_lang="de_DE")

# Tokenize an ungrammatical German sentence (and, optionally, its corrected target).
input = tokenizer("Er gehen jeden Tag in die Schule",
                  text_target="Er geht jeden Tag in die Schule",
                  return_tensors='pt')

# Force German as the target language of the generated correction.
output = model.generate(input["input_ids"],
                        attention_mask=input["attention_mask"],
                        forced_bos_token_id=tokenizer.lang_code_to_id["de_DE"])
```
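To turn the generated token IDs back into text, you can decode them with the tokenizer. This is a minimal sketch continuing from the snippet above (`output` and `tokenizer` are the variables defined there), using the standard `transformers` decoding call:

```python
# Decode the generated IDs into the corrected sentence, skipping special
# tokens such as the language code and the end-of-sequence marker.
corrected = tokenizer.batch_decode(output, skip_special_tokens=True)
print(corrected[0])
```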