---
title: MBartTranslator
emoji:
colorFrom: pink
colorTo: indigo
sdk: gradio
sdk_version: 3.15.0
app_file: app.py
pinned: false
---

# mBART-50

mBART-50 is a multilingual sequence-to-sequence model pre-trained with the "Multilingual Denoising Pretraining" objective. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).

## Model description

mBART-50 is a multilingual sequence-to-sequence model introduced to show that multilingual translation models can be created through multilingual fine-tuning: instead of fine-tuning on a single translation direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 was created by extending the original mBART model with 25 additional languages, so that a single model supports machine translation across 50 languages. During Multilingual Denoising Pretraining, the model learns to reconstruct the original text from a noised version in which spans of text are masked and sentence order is shuffled.
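As a quick illustration of the fine-tuned many-to-many model in use, the sketch below translates an English sentence into French with the `transformers` API. The checkpoint (`facebook/mbart-large-50-many-to-many-mmt`) and the sample sentence are assumptions chosen for the example; this Space's exact setup isn't stated here.

```python
# Minimal sketch: many-to-many translation with mBART-50.
# The checkpoint and example text are assumptions, not this Space's exact setup.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained(
    "facebook/mbart-large-50-many-to-many-mmt"
)
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50-many-to-many-mmt"
)

# mBART-50 uses explicit language codes: the source language is set on the
# tokenizer, and the target language is forced as the first generated token.
tokenizer.src_lang = "en_XX"
encoded = tokenizer("The weather is nice today.", return_tensors="pt")
generated = model.generate(
    **encoded, forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"]
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```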

## Docker with GPU

```bash
docker run -it --gpus all -p 7860:7860 --platform=linux/amd64 \
    registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
```

## Docker with CPU

```bash
docker run -it -p 7860:7860 --platform=linux/amd64 \
    registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
```
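
Both commands start `app.py`, the Gradio app declared in the metadata; once the container is running, the UI is served at http://localhost:7860. For reference, below is a minimal sketch of what such an app could look like. The checkpoint, the language list, and the interface layout are all illustrative assumptions, not this Space's actual code.

```python
# Hypothetical app.py sketch: a Gradio front end around mBART-50.
# Checkpoint, language list, and UI layout are assumptions, not the Space's code.
import gradio as gr
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

CHECKPOINT = "facebook/mbart-large-50-many-to-many-mmt"  # assumed checkpoint
model = MBartForConditionalGeneration.from_pretrained(CHECKPOINT)
tokenizer = MBart50TokenizerFast.from_pretrained(CHECKPOINT)

LANGS = ["en_XX", "fr_XX", "de_DE", "hi_IN", "zh_CN"]  # small illustrative subset

def translate(text, src_lang, tgt_lang):
    # Set the source language, then force the target language code as the
    # first generated token, as mBART-50 expects.
    tokenizer.src_lang = src_lang
    encoded = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **encoded, forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang]
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

demo = gr.Interface(
    fn=translate,
    inputs=[
        gr.Textbox(label="Text to translate"),
        gr.Dropdown(choices=LANGS, value="en_XX", label="Source language"),
        gr.Dropdown(choices=LANGS, value="fr_XX", label="Target language"),
    ],
    outputs=gr.Textbox(label="Translation"),
    title="MBartTranslator",
)
# Bind to 0.0.0.0:7860 so the app is reachable through the container's
# published port.
demo.launch(server_name="0.0.0.0", server_port=7860)
```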