---
language:
  - as
  - bn
  - brx
  - doi
  - en
  - gom
  - gu
  - hi
  - kn
  - ks
  - kas
  - mai
  - ml
  - mr
  - mni
  - mnb
  - ne
  - or
  - pa
  - sa
  - sat
  - sd
  - snd
  - ta
  - te
  - ur
language_details: >-
  asm_Beng, ben_Beng, brx_Deva, doi_Deva, eng_Latn, gom_Deva, guj_Gujr,
  hin_Deva, kan_Knda, kas_Arab, kas_Deva, mai_Deva, mal_Mlym, mar_Deva,
  mni_Beng, mni_Mtei, npi_Deva, ory_Orya, pan_Guru, san_Deva, sat_Olck,
  snd_Arab, snd_Deva, tam_Taml, tel_Telu, urd_Arab
tags:
  - indictrans
  - translation
  - ai4bharat
  - multilingual
license: mit
datasets:
  - flores-200
metrics:
  - bleu
  - chrf
  - chrf++
  - comet
inference: false
---

# IndicTrans2

This is the model card of the IndicTrans2 En-Indic 1.1B variant.
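
Below is a minimal usage sketch with Hugging Face `transformers`. It assumes the checkpoint is the one published as `ai4bharat/indictrans2-en-indic-1B` and that the custom tokenizer and model code bundled with the repository are loaded via `trust_remote_code=True`; the manual language tagging and generation settings are illustrative only, and the preprocessing toolkit from the official IndicTrans2 repository should be preferred in practice.

```python
# Hedged sketch: loading and querying the En-Indic checkpoint with transformers.
# The repo id, the manual "src_lang tgt_lang" tagging, and the generation
# settings below are assumptions for illustration, not the official recipe.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "ai4bharat/indictrans2-en-indic-1B"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name, trust_remote_code=True)

# IndicTrans2 expects inputs tagged with FLORES-style source and target
# language codes (e.g. eng_Latn -> hin_Deva). The official preprocessing also
# normalizes the text; that step is skipped here for brevity.
sentences = ["eng_Latn hin_Deva This is a test sentence."]
inputs = tokenizer(sentences, padding="longest", truncation=True, return_tensors="pt")

with torch.no_grad():
    generated = model.generate(**inputs, num_beams=5, max_length=256, use_cache=True)

print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```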

Here are the metrics for this particular checkpoint.
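
The metadata above lists FLORES-200 as the evaluation set and BLEU, chrF, chrF++, and COMET as metrics. As a rough illustration, the string-based metrics can be computed with the `sacrebleu` library as sketched below; the file names are hypothetical placeholders, and the exact evaluation settings used in the paper (tokenization, signatures, COMET model) are not reproduced here.

```python
# Hedged sketch: scoring detokenized system outputs against FLORES-200
# references with sacrebleu. File names are hypothetical placeholders.
import sacrebleu

with open("flores200.devtest.hin_Deva.ref", encoding="utf-8") as f:
    refs = [line.strip() for line in f]
with open("model.hin_Deva.hyp", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]

bleu = sacrebleu.corpus_bleu(hyps, [refs])
chrf = sacrebleu.corpus_chrf(hyps, [refs])                  # chrF
chrfpp = sacrebleu.corpus_chrf(hyps, [refs], word_order=2)  # chrF++ adds word bigrams

print(f"BLEU   = {bleu.score:.2f}")
print(f"chrF   = {chrf.score:.2f}")
print(f"chrF++ = {chrfpp.score:.2f}")
```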

Please refer to "Appendix D: Model Card" of our preprint for further details on model training, intended use, data, metrics, limitations, and recommendations.

## Citation

If you use our work, please cite it as follows:

@article{ai4bharat2023indictrans2,
  title   = {IndicTrans2: Towards High-Quality and Accessible Machine Translation Models for all 22 Scheduled Indian Languages},
  author  = {AI4Bharat and Jay Gala and Pranjal A. Chitale and Raghavan AK and Sumanth Doddapaneni and Varun Gumma and Aswanth Kumar and Janki Nawale and Anupama Sujatha and Ratish Puduppully and Vivek Raghavan and Pratyush Kumar and Mitesh M. Khapra and Raj Dabre and Anoop Kunchukuttan},
  year    = {2023},
  journal = {arXiv preprint arXiv:2305.16307}
}