---
language:
- it
pipeline_tag: translation
---

To initialize the model:

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained(
    "facebook/mbart-large-50", output_hidden_states=True
)
```

To generate text using the model:

```python
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="it_IT", tgt_lang="it_IT"
)

inputs = tokenizer(
    "I was here yesterday to studying",
    text_target="I was here yesterday to study",
    return_tensors="pt",
)

output = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    forced_bos_token_id=tokenizer.lang_code_to_id["it_IT"],
)
```
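
As a minimal usage sketch (not spelled out above), the generated token IDs can be decoded back into text with `batch_decode`, and because the model is loaded with `output_hidden_states=True`, a forward pass exposes the per-layer hidden states. The snippet assumes the `model`, `tokenizer`, `inputs`, and `output` objects from the blocks above; the names `translation`, `forward_out`, `encoder_states`, and `decoder_states` are illustrative:

```python
import torch

# Decode the generated token IDs from `model.generate` back into text.
translation = tokenizer.batch_decode(output, skip_special_tokens=True)
print(translation)

# Since the model was loaded with output_hidden_states=True, a forward pass
# returns the hidden states of every encoder and decoder layer.
with torch.no_grad():
    forward_out = model(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        labels=inputs["labels"],  # produced by the text_target argument above
    )

encoder_states = forward_out.encoder_hidden_states  # tuple: embeddings + one tensor per encoder layer
decoder_states = forward_out.decoder_hidden_states  # tuple: embeddings + one tensor per decoder layer
```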