|
--- |
|
language: |
|
- hi |
|
- en |
|
- multilingual |
|
license: cc-by-4.0 |
|
tags: |
|
- hi |
|
- en |
|
- codemix |
|
datasets: |
|
- L3Cube-HingCorpus |
|
--- |
|
|
|
## HingRoBERTa-Mixed |
|
HingRoBERTa-Mixed is a Hindi-English code-mixed BERT model trained on Roman and Devanagari text. It is an XLM-RoBERTa model fine-tuned on the mixed-script L3Cube-HingCorpus.
|
<br> |
|
[Dataset link](https://github.com/l3cube-pune/code-mixed-nlp)
|
|
|
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.08398).
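
A minimal usage sketch with the `transformers` library is shown below, assuming the model is hosted at `l3cube-pune/hing-roberta-mixed` (as linked in the model list further down) and exposes a masked-LM head; the code-mixed example sentence is purely illustrative.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Load HingRoBERTa-Mixed from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("l3cube-pune/hing-roberta-mixed")
model = AutoModelForMaskedLM.from_pretrained("l3cube-pune/hing-roberta-mixed")

# Fill-mask on a Hindi-English code-mixed sentence (Roman script).
# XLM-RoBERTa-based models use <mask> as the mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill_mask("yeh movie bahut <mask> hai"):
    print(pred["token_str"], round(pred["score"], 3))
```

Since the model was trained on both scripts, the same pipeline can be given Devanagari or mixed-script input without any extra preprocessing.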
|
|
|
Other models from the HingBERT family: <br>
|
<a href="https://huggingface.co/l3cube-pune/hing-bert"> HingBERT </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-mbert"> HingMBERT </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-mbert-mixed"> HingBERT-Mixed </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-roberta"> HingRoBERTa </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-roberta-mixed"> HingRoBERTa-Mixed </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-gpt"> HingGPT </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-gpt-devanagari"> HingGPT-Devanagari </a> <br> |
|
<a href="https://huggingface.co/l3cube-pune/hing-bert-lid"> HingBERT-LID </a> <br> |
|
|
|
|
|
```bibtex
|
@inproceedings{nayak-joshi-2022-l3cube, |
|
title = "{L}3{C}ube-{H}ing{C}orpus and {H}ing{BERT}: A Code Mixed {H}indi-{E}nglish Dataset and {BERT} Language Models", |
|
author = "Nayak, Ravindra and Joshi, Raviraj", |
|
booktitle = "Proceedings of the WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference", |
|
month = jun, |
|
year = "2022", |
|
address = "Marseille, France", |
|
publisher = "European Language Resources Association", |
|
url = "https://aclanthology.org/2022.wildre-1.2", |
|
pages = "7--12", |
|
} |
|
``` |