
mbart-large-cc25-cnn-dailymail-nl

Model description

A finetuned version of mBART. We also wrote a blog post about this model here.

Intended uses & limitations

It's meant for summarizing Dutch news articles.

How to use

import transformers

# Load the finetuned model and the original mBART-large-cc25 tokenizer
undisputed_best_model = transformers.MBartForConditionalGeneration.from_pretrained(
    "ml6team/mbart-large-cc25-cnn-dailymail-nl"
)
tokenizer = transformers.MBartTokenizer.from_pretrained("facebook/mbart-large-cc25")

# Wrap the model and tokenizer in a summarization pipeline
summarization_pipeline = transformers.pipeline(
    task="summarization",
    model=undisputed_best_model,
    tokenizer=tokenizer,
)

# Make the decoder start generating in Dutch
summarization_pipeline.model.config.decoder_start_token_id = tokenizer.lang_code_to_id[
    "nl_XX"
]

article = "Kan je dit even samenvatten alsjeblief."  # Dutch example input
summarization_pipeline(
    article,
    do_sample=True,
    top_p=0.75,
    top_k=50,
    # num_beams=4,
    min_length=50,
    early_stopping=True,
    truncation=True,
)[0]["summary_text"]

Training data

mBART was finetuned with this dataset.
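
The training data can be inspected with the Hugging Face datasets library. The dataset identifier below is an assumption based on the model name (a Dutch translation of CNN/DailyMail); check the dataset linked above for the exact name.

from datasets import load_dataset

# Assumed dataset identifier; replace it with the dataset actually
# linked in this card if it differs.
dutch_cnn_dailymail = load_dataset("ml6team/cnn_dailymail_nl")
print(dutch_cnn_dailymail["train"][0])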
