---
license: cc-by-4.0
language:
  - hi
  - en
tags:
  - hi
  - en
  - codemix
datasets:
  - L3Cube-HingCorpus
---

## HingRoBERTa-Mixed

HingRoBERTa-Mixed is a Hindi-English code-mixed BERT model trained on Roman and Devanagari text. It is an XLM-RoBERTa model fine-tuned on the mixed-script L3Cube-HingCorpus.
[dataset link](https://github.com/l3cube-pune/code-mixed-nlp)
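A minimal usage sketch with the Hugging Face `transformers` library is given below. It assumes the model is hosted on the Hub under the ID `l3cube-pune/hing-roberta-mixed` (matching this repository); please verify the model ID and adapt the example to your task.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Assumed Hub model ID, taken from this repository's name.
model_name = "l3cube-pune/hing-roberta-mixed"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Fill a masked token in a Hindi-English code-mixed sentence.
# XLM-RoBERTa-based models use <mask> as the mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("mujhe yeh movie bahut <mask> lagi"))
```

For downstream tasks (e.g. code-mixed sentiment or NER), the same checkpoint can be loaded with `AutoModelForSequenceClassification` or `AutoModelForTokenClassification` and fine-tuned on labeled data.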

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.08398).

@InProceedings{nayak-joshi:2022:WILDRE6,
  author    = {Nayak, Ravindra and Joshi, Raviraj},
  title     = {L3Cube-HingCorpus and HingBERT: A Code Mixed Hindi-English Dataset and BERT Language Models},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {7--12},
  url       = {https://aclanthology.org/2022.wildre6-1.2}
}