vasudevgupta committed · Commit e7f210e · 1 Parent(s): c6aab5d
Update README.md
README.md CHANGED
@@ -1,3 +1,3 @@
 mBART (a pre-trained model by Facebook) is pre-trained to de-noise multiple languages simultaneously with the BART objective.
 
-The checkpoint available in this repository was obtained by fine-tuning `facebook/mbart-large-cc25` on all samples from the Bhasha (pib_v1.3) Gujarati-English parallel corpus. This checkpoint gives decent results for Gujarati-English translation.
+The checkpoint available in this repository was obtained by fine-tuning `facebook/mbart-large-cc25` on all samples (~60K) from the Bhasha (pib_v1.3) Gujarati-English parallel corpus. This checkpoint gives decent results for Gujarati-English translation.
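
Since the README describes an mBART checkpoint fine-tuned for Gujarati-English translation, a minimal usage sketch may help. The repository id below is a placeholder (the actual model id is not shown in this diff), and the language codes `gu_IN`/`en_XX` follow the `facebook/mbart-large-cc25` convention.

```python
# Minimal sketch, assuming a standard MBart checkpoint layout on the Hugging Face Hub.
from transformers import MBartForConditionalGeneration, MBartTokenizer

repo_id = "path/to/this-checkpoint"  # placeholder: replace with this repository's model id

# mbart-large-cc25 language codes: Gujarati = "gu_IN", English = "en_XX"
tokenizer = MBartTokenizer.from_pretrained(repo_id, src_lang="gu_IN", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(repo_id)

gujarati_text = "મારું નામ રામ છે."  # "My name is Ram."
inputs = tokenizer(gujarati_text, return_tensors="pt")

# Force English as the decoder's start language, as is standard for mbart-large-cc25.
generated = model.generate(
    **inputs,
    decoder_start_token_id=tokenizer.lang_code_to_id["en_XX"],
    max_length=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```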