
mBART is a multilingual sequence-to-sequence model from Facebook AI, pre-trained to denoise text in many languages simultaneously using the BART objective.

The checkpoint in this repository was obtained by fine-tuning facebook/mbart-large-cc25 on all ~60K samples of the Bhasha (pib_v1.3) Gujarati-English parallel corpus. It gives decent results for Gujarati-to-English translation.


Dataset used to train vasudevgupta/mbart-bhasha-guj-eng: Bhasha (pib_v1.3) Gujarati-English parallel corpus.