IndicTrans2

This is the model card of the IndicTrans2 En-Indic Distilled 200M variant.

Please refer to Section 7.6 (Distilled Models) of the TMLR submission for further details on model training, data, and metrics.

Usage Instructions

Please refer to the GitHub repository for a detailed description of how to use HF-compatible IndicTrans2 models for inference.

Note: IndicTrans2 is not compatible with AutoTokenizer; we therefore provide IndicTransTokenizer.
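As a rough illustration of the workflow described above, the sketch below shows one way inference might be wired up with the IndicTransTokenizer package. The class names, method signatures, and keyword arguments (`direction`, `preprocess_batch`, `postprocess_batch`, the `eng_Latn`/`hin_Deva` language tags) are assumptions drawn from the project's repository and may differ across package versions; the GitHub repository remains the authoritative reference.

```python
def translate_en_to_hindi(sentences):
    """Sketch: translate English sentences to Hindi with the distilled checkpoint.

    Imports are kept inside the function so this sketch stays lazy; the
    IndicTransTokenizer API used here is an assumption and may have changed.
    """
    import torch
    from transformers import AutoModelForSeq2SeqLM
    from IndicTransTokenizer import IndicProcessor, IndicTransTokenizer  # assumed package API

    # trust_remote_code is required because the checkpoint ships custom model code.
    model = AutoModelForSeq2SeqLM.from_pretrained(
        "ai4bharat/indictrans2-en-indic-dist-200M", trust_remote_code=True
    )
    tokenizer = IndicTransTokenizer(direction="en-indic")  # AutoTokenizer is not supported
    ip = IndicProcessor(inference=True)

    # Normalize and tag the batch with source/target language codes.
    batch = ip.preprocess_batch(sentences, src_lang="eng_Latn", tgt_lang="hin_Deva")
    inputs = tokenizer(batch, src=True, return_tensors="pt")

    with torch.no_grad():
        generated = model.generate(**inputs, num_beams=5, max_length=256)

    decoded = tokenizer.batch_decode(generated, src=False)
    return ip.postprocess_batch(decoded, lang="hin_Deva")
```

A caller would pass a plain list of English strings, e.g. `translate_en_to_hindi(["Hello, how are you?"])`, and receive a list of Hindi translations of the same length.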

Citation

If you use our work, please cite:

@article{gala2023indictrans,
title={IndicTrans2: Towards High-Quality and Accessible Machine Translation Models for all 22 Scheduled Indian Languages},
author={Jay Gala and Pranjal A Chitale and A K Raghavan and Varun Gumma and Sumanth Doddapaneni and Aswanth Kumar M and Janki Atul Nawale and Anupama Sujatha and Ratish Puduppully and Vivek Raghavan and Pratyush Kumar and Mitesh M Khapra and Raj Dabre and Anoop Kunchukuttan},
journal={Transactions on Machine Learning Research},
issn={2835-8856},
year={2023},
url={https://openreview.net/forum?id=vfT4YuzAYA},
note={}
}