---
language: ko
license: cc-by-4.0
---
# pko-t5-base
[Source Code](https://github.com/paust-team/pko-t5)
pko-t5 is a [t5 v1.1 model](https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/released_checkpoints.md) trained exclusively on Korean data.
To tokenize Korean text it uses BBPE, which has no out-of-vocabulary (OOV) problem, instead of sentencepiece. Pre-training used only unsupervised learning on Korean corpora (Namuwiki, Wikipedia, the Modu corpus, etc.) with T5's span corruption task.
When using pko-t5, please fine-tune it on your target task.
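The span corruption objective mentioned above masks contiguous token spans with sentinel tokens and asks the model to reconstruct them. Below is a minimal sketch of how such input/target pairs are built; it is a simplified illustration, not the exact pko-t5 preprocessing pipeline.

```python
# Simplified illustration of T5-style span corruption: each masked span
# in the input is replaced by a sentinel token (<extra_id_0>, <extra_id_1>, ...),
# and the target lists the dropped spans, each preceded by its sentinel.

def span_corrupt(tokens, spans):
    """tokens: list of word strings; spans: sorted (start, end) index pairs to mask."""
    inp, tgt = [], []
    cursor = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inp.extend(tokens[cursor:start])  # keep unmasked tokens
        inp.append(sentinel)              # replace the span with a sentinel
        tgt.append(sentinel)              # target: sentinel followed by the span
        tgt.extend(tokens[start:end])
        cursor = end
    inp.extend(tokens[cursor:])
    return " ".join(inp), " ".join(tgt)

source = "Thank you for inviting me to your party last week".split()
inp, tgt = span_corrupt(source, [(1, 3), (6, 7)])
print(inp)  # Thank <extra_id_0> inviting me to <extra_id_1> party last week
print(tgt)  # <extra_id_0> you for <extra_id_1> your
```

In actual T5 pre-training the spans are sampled randomly over subword tokens; here they are passed in explicitly to keep the example deterministic.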
## Usage
The model is accessible through the transformers API. When tokenizing, use `T5TokenizerFast` rather than `T5Tokenizer`. For the model, `T5ForConditionalGeneration` can be used as-is.
### Example
```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration
tokenizer = T5TokenizerFast.from_pretrained('paust/pko-t5-base')
model = T5ForConditionalGeneration.from_pretrained('paust/pko-t5-base')
input_ids = tokenizer(["qa question: 당신의 이름은 무엇인가요?"], return_tensors="pt").input_ids
labels = tokenizer(["T5 입니다."], return_tensors="pt").input_ids
outputs = model(input_ids=input_ids, labels=labels)
print(f"loss={outputs.loss} logits={outputs.logits}")
```
## KLUE Evaluation (dev)
| | Model | ynat (macro F1) | sts (pearsonr/F1) | nli (acc) | ner (entity-level F1) | re (micro F1) | dp (LAS) | mrc (EM/F1) |
|-----|------------------------------------------------------------------|-----------------|-------------------|-----------|-----------------------|---------------|-----------|-------------|
| | Baseline | **87.30** | **93.20/86.13** | **89.50** | 86.06 | 71.06 | 87.93 | **75.26/-** |
| FT | [pko-t5-small](https://huggingface.co/paust/pko-t5-small) (77M) | 86.21 | 77.99/77.01 | 69.20 | 82.60 | 66.46 | 93.15 | 43.81/46.58 |
| FT | [pko-t5-base](https://huggingface.co/paust/pko-t5-base) (250M) | 87.29 | 90.25/83.43 | 79.73 | 87.80 | 67.23 | 97.28 | 61.53/64.74 |
| FT | [pko-t5-large](https://huggingface.co/paust/pko-t5-large) (800M) | 87.12 | 92.05/85.24 | 84.96 | **88.18** | **75.17** | **97.60** | 68.01/71.44 |
| MT  | pko-t5-small                                                     | 84.54           | 68.50/72.02       | 51.16     | 74.69                 | 66.11         | 80.40     | 43.60/46.28 |
| MT | pko-t5-base | 86.89 | 83.96/80.30 | 72.03 | 85.27 | 66.59 | 95.05 | 61.11/63.94 |
| MT | pko-t5-large | 87.57 | 91.93/86.29 | 83.63 | 87.41 | 71.34 | 96.99 | 70.70/73.72 |
- FT: single-task fine-tuning / MT: multi-task fine-tuning
- [Baseline](https://arxiv.org/abs/2105.09680): SOTA scores on the dev set as reported in the KLUE paper
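The usage example above already shows the text-to-text pattern: the input carries a short task prefix (`qa question: `). In the multi-task (MT) setting, one model is fine-tuned on all KLUE tasks at once using such prefixed inputs. A minimal sketch of this formatting follows; the template strings are illustrative assumptions, not the exact prompts used for pko-t5:

```python
# Hypothetical task templates for casting KLUE tasks as text-to-text.
# The actual prompt templates used to fine-tune pko-t5 may differ.
TASK_TEMPLATES = {
    "ynat": "ynat title: {title}",
    "nli": "nli premise: {premise} hypothesis: {hypothesis}",
    "sts": "sts sentence1: {s1} sentence2: {s2}",
}

def format_example(task: str, **fields) -> str:
    """Build a prefixed text-to-text input for the given KLUE task."""
    return TASK_TEMPLATES[task].format(**fields)

print(format_example("ynat", title="새 반도체 공장 착공"))
# ynat title: 새 반도체 공장 착공
```

Because every task is reduced to string-to-string generation, the same `T5ForConditionalGeneration` checkpoint and loss can be shared across all of them; only the prefix tells the model which task it is solving.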
## License
pko-t5, built by [PAUST](https://paust.io), is released under the [MIT license](https://github.com/paust-team/pko-t5/blob/main/LICENSE).