---
language:
- hi
- en
- multilingual
license: cc-by-4.0
tags:
- hi
- en
- codemix
datasets:
- L3Cube-HingCorpus
- L3Cube-HingLID
---

## HingBERT-LID
HingBERT-LID is a Hindi-English code-mixed language identification BERT model. It is a HingBERT model fine-tuned on the L3Cube-HingLID dataset.
<br>
[Dataset link](https://github.com/l3cube-pune/code-mixed-nlp)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.08398).
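A minimal usage sketch with the Hugging Face `transformers` token-classification pipeline is shown below. The exact label names emitted by the model (e.g., Hindi vs. English tags) are an assumption here and should be checked against the model's `config.json`:

```python
# Hedged sketch: token-level language identification with HingBERT-LID.
# Assumes the model is a token-classification head; label strings
# (e.g., "HIN" / "EN") are illustrative, not confirmed by this card.
from transformers import pipeline

lid = pipeline(
    "token-classification",
    model="l3cube-pune/hing-bert-lid",
)

# A code-mixed Hindi-English sentence in Roman script.
tokens = lid("mujhe yeh movie bahut achhi lagi")
for t in tokens:
    print(t["word"], t["entity"])
```

Each returned entry carries the sub-word token, its predicted language tag, and a confidence score, so word-level labels can be recovered by merging sub-word pieces.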

Other models from the HingBERT family: <br>
<a href="https://huggingface.co/l3cube-pune/hing-bert"> HingBERT </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-mbert"> HingMBERT </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-mbert-mixed"> HingBERT-Mixed </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-mbert-mixed-v2"> HingBERT-Mixed-v2 </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-roberta"> HingRoBERTa </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-roberta-mixed"> HingRoBERTa-Mixed </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-gpt"> HingGPT </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-gpt-devanagari"> HingGPT-Devanagari </a> <br>
<a href="https://huggingface.co/l3cube-pune/hing-bert-lid"> HingBERT-LID </a> <br>

```
@inproceedings{nayak-joshi-2022-l3cube,
    title = "{L}3{C}ube-{H}ing{C}orpus and {H}ing{BERT}: A Code Mixed {H}indi-{E}nglish Dataset and {BERT} Language Models",
    author = "Nayak, Ravindra  and Joshi, Raviraj",
    booktitle = "Proceedings of the WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.wildre-1.2",
    pages = "7--12",
}
```