---
language: ko
license: apache-2.0
---
# hyunwoo3235/t5-v1_1-base-ko
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) Version 1.1 trained on a Korean corpus.

t5-v1_1-base-ko is a T5 v1.1 model pretrained on Korean text. To avoid OOV issues, it uses a byte-level BPE (BBPE) tokenizer, and, following HyperCLOVA's observation that morphological analysis helps improve performance, MeCab was applied during tokenizer training so that morphemes are not tokenized in unnatural ways.
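The exact tokenizer-training pipeline is not published in this card. As a rough sketch of the idea, one could pre-segment the corpus with MeCab and then train a byte-level BPE tokenizer on the segmented text; the file names and vocabulary size below are placeholders, not the values used for this model.

```python
# Illustrative only: MeCab pre-segmentation followed by BBPE training.
# "corpus.txt", "corpus_segmented.txt", and vocab_size are hypothetical.
import MeCab
from tokenizers import ByteLevelBPETokenizer

tagger = MeCab.Tagger("-Owakati")  # emit space-separated morphemes

def presegment(line: str) -> str:
    # insert spaces at the morpheme boundaries found by MeCab
    return tagger.parse(line).strip()

with open("corpus.txt", encoding="utf-8") as src, \
     open("corpus_segmented.txt", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(presegment(line) + "\n")

# train a byte-level BPE tokenizer on the pre-segmented corpus
bbpe = ByteLevelBPETokenizer()
bbpe.train(files=["corpus_segmented.txt"], vocab_size=32000)
```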
## Usage
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration
# load the tokenizer and pretrained checkpoint from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained('hyunwoo3235/t5-v1_1-base-ko')
model = T5ForConditionalGeneration.from_pretrained('hyunwoo3235/t5-v1_1-base-ko')
```
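Since T5 v1.1 checkpoints are pretrained with span corruption only, the model needs fine-tuning for downstream tasks. A minimal infilling example, assuming the tokenizer keeps T5-style `<extra_id_*>` sentinel tokens (the Korean sentence is illustrative):

```python
# fill a masked span; assumes <extra_id_0> exists in this tokenizer's vocab
input_ids = tokenizer(
    "ν•œκ΅­μ–΄λŠ” <extra_id_0> μ–Έμ–΄μž…λ‹ˆλ‹€.", return_tensors="pt"
).input_ids

outputs = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```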