# TUNiB-Electra
  
We release several new versions of the [ELECTRA](https://arxiv.org/abs/2003.10555) model, which we name TUNiB-Electra. There are two motivations. First, the existing pre-trained Korean encoder models are monolingual, i.e., they only have knowledge of Korean. Our bilingual models are trained on balanced corpora of Korean and English. Second, we wanted new off-the-shelf models trained on much more text. To this end, we collected a large amount of Korean text from various sources such as blog posts, comments, news, and web novels, amounting to 100 GB in total.


## How to use
  
You can use this model directly with the [transformers](https://github.com/huggingface/transformers) library:
  
```python
from transformers import AutoModel, AutoTokenizer

# Base Model (Korean-English bilingual model)
tokenizer = AutoTokenizer.from_pretrained('tunib/electra-ko-en-base')
model = AutoModel.from_pretrained('tunib/electra-ko-en-base')
```

### Tokenizer example

```python
>>> from transformers import AutoTokenizer
>>> tokenizer = AutoTokenizer.from_pretrained('tunib/electra-ko-en-base')
>>> tokenizer.tokenize("tunib is a natural language processing tech startup.")
['tun', '##ib', 'is', 'a', 'natural', 'language', 'processing', 'tech', 'startup', '.']
>>> tokenizer.tokenize("νŠœλ‹™μ€ μžμ—°μ–΄μ²˜λ¦¬ ν…Œν¬ μŠ€νƒ€νŠΈμ—…μž…λ‹ˆλ‹€.")
['튜', '##λ‹™', '##은', 'μžμ—°', '##μ–΄', '##처리', 'ν…Œν¬', 'μŠ€νƒ€νŠΈμ—…', '##μž…λ‹ˆλ‹€', '.']
```
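### Model example

To sanity-check the checkpoint end to end, the snippet below runs a single forward pass and inspects the contextual embeddings. This is a minimal sketch using the standard `transformers` API; the example sentence and the printed shape are illustrative only.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('tunib/electra-ko-en-base')
model = AutoModel.from_pretrained('tunib/electra-ko-en-base')

# Encode a mixed Korean-English sentence and run one forward pass.
inputs = tokenizer("νŠœλ‹™μ€ natural language processing μŠ€νƒ€νŠΈμ—…μž…λ‹ˆλ‹€.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```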
  
## Results on Korean downstream tasks

  
|                       |**# Params** |**Avg.**| **NSMC**<br/>(acc) | **Naver NER**<br/>(F1) | **PAWS**<br/>(acc) | **KorNLI**<br/>(acc) | **KorSTS**<br/>(spearman) | **Question Pair**<br/>(acc) | **KorQuAD (Dev)**<br/>(EM/F1) |**Korean-Hate-Speech (Dev)**<br/>(F1)|
|  :----------------:| :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: | :---------------------------: | :---------------------------: | :----------------: |
|***TUNiB-Electra-ko-base*** |  110M | **85.99** |  90.95 |    87.63         |   **84.65**   | **82.27**   |    85.00   |  95.77 |   64.01 / 90.32   |71.40 |
|***TUNiB-Electra-ko-en-base*** |  133M |84.74 	|90.15      |        86.93         |    83.05      |  79.70    |  82.23 | 95.64  | 83.61 / 92.37     |67.86 |
| [KoELECTRA-base-v3](https://github.com/monologg/KoELECTRA)    |  110M | 85.92   |90.63   |      **88.11**	     |    84.45    |    82.24    |       **85.53**      |     95.25      | **84.83 / 93.45**	     |  67.61 |
| [KcELECTRA-base](https://github.com/Beomi/KcELECTRA) | 124M|  84.75     |**91.71**      |         86.90          |       74.80        |        81.65         |           82.65           |          **95.78**          |         70.60 / 90.11         | **74.49** |
| [KoBERT-base](https://github.com/SKTBrain/KoBERT)        |  90M  |   81.92       |  89.63        |         86.11          |       80.65        |        79.00         |           79.64           |            93.93            |         52.81 / 80.27         | 66.21 |
| [KcBERT-base](https://github.com/Beomi/KcBERT)         |   110M    |   79.79    | 89.62        |         84.34          |       66.95        |        74.85         |           75.57           |            93.93            |         60.25 / 84.39         |  68.77 |
| [XLM-Roberta-base](https://github.com/pytorch/fairseq/tree/master/examples/xlmr)   | 280M  | 83.03    |89.49        |         86.26          |       82.95        |        79.92         |           79.09           |            93.53            |         64.70 / 88.94         |  64.06  |
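
Scores like the ones above are typically obtained by fine-tuning the encoder with a task-specific head. As a rough illustration of how such a run can be set up with `transformers`, the sketch below attaches a classification head for a binary sentiment task such as NSMC; the dataset id, hyperparameters, and training setup are illustrative assumptions, not the exact configuration behind the table.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = 'tunib/electra-ko-en-base'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# NSMC (Naver movie reviews) as hosted on the Hugging Face Hub;
# adjust the dataset id / loading code to your environment.
dataset = load_dataset('nsmc')

def tokenize(batch):
    # NSMC stores the review text in the 'document' field.
    return tokenizer(batch['document'], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

# Illustrative hyperparameters; the reported scores may use a different setup.
args = TrainingArguments(
    output_dir='electra-ko-en-nsmc',
    per_device_train_batch_size=32,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset['train'],
    eval_dataset=dataset['test'],
    tokenizer=tokenizer,
)
trainer.train()
```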



  
## Results on English downstream tasks
 
 
|                       |**# Params** | **Avg.** |**CoLA**<br/>(MCC) | **SST**<br/>(Acc) | **MRPC**<br/>(Acc) | **STS**<br/>(Spearman) | **QQP**<br/>(Acc) | **MNLI**<br/>(Acc) | **QNLI**<br/>(Acc) | **RTE**<br/>(Acc) |
|  :----------------:| :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: | :---------------------------: | :---------------------------: | :---------------------------: |
|***TUNiB-Electra-ko-en-base***  | 133M |	 85.2| **66.29** |  91.86      |    **89.95**     | 89.67     |  **90.75** | 84.72  |    91.40 |**76.90**| 
|[ELECTRA-base](https://github.com/google-research/electra) | 110M |   **85.7** |	64.6     |     **96.0**           | 88.1|  **90.2**     |    89.5   |  **88.5**  |  **93.1**      |  75.2    | 
|[BERT-base](https://github.com/google-research/bert) | 110M |   80.8| 	52.1      |      93.5           |  84.8|    85.8     |  89.2   | 84.6        |   90.5       |  66.4    |