---
language: ko
---
|
|
|
# Pretrained BART in Korean
|
|
|
This is a BART model pretrained on multiple Korean datasets.
|
|
|
I used multiple datasets so that the model generalizes to both colloquial and written text.
|
|
|
Training was supported by the [TPU Research Cloud](https://sites.research.google/trc/) program.
|
|
|
The script used to pretrain the model is [here](https://github.com/cosmoquester/transformers-bart-pretrain).
|
|
|
When you use the inference API, you must wrap the sentence with `[BOS]` and `[EOS]` tokens, as in the example below.
|
|
|
```
[BOS] 안녕하세요? 반가워요~~ [EOS]
```
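The wrapping step can be sketched as a small helper; the function name is ours, and the commented checkpoint id is a placeholder, since the card does not name one:

```python
def wrap_sentence(text: str) -> str:
    """Wrap raw text with the [BOS]/[EOS] markers this model expects."""
    return f"[BOS] {text} [EOS]"


# With the transformers library, usage would look roughly like this
# (checkpoint id is a placeholder -- substitute the actual model id):
#   from transformers import AutoTokenizer
#   tokenizer = AutoTokenizer.from_pretrained("<checkpoint-id>")
#   inputs = tokenizer(wrap_sentence("안녕하세요? 반가워요~~"), return_tensors="pt")

print(wrap_sentence("안녕하세요? 반가워요~~"))
```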
|
|
|
## Used Datasets
|
|
|
### [모두의 말뭉치 (NIKL Modu Corpus)](https://corpus.korean.go.kr/)
|
- 일상 대화 말뭉치 2020 (Everyday Conversation Corpus 2020)
- 구어 말뭉치 (Spoken Corpus)
- 문어 말뭉치 (Written Corpus)
- 신문 말뭉치 (Newspaper Corpus)
|
|
|
### AIhub
- [개방데이터 전문분야말뭉치](https://aihub.or.kr/aidata/30717) (specialized-domain corpus)
- [개방데이터 한국어대화요약](https://aihub.or.kr/aidata/30714) (Korean dialogue summarization)
- [개방데이터 감성 대화 말뭉치](https://aihub.or.kr/aidata/7978) (emotional dialogue corpus)
- [개방데이터 한국어 음성](https://aihub.or.kr/aidata/105) (Korean speech)
- [개방데이터 한국어 SNS](https://aihub.or.kr/aidata/30718) (Korean SNS)
|
|
|
### [세종 말뭉치 (Sejong Corpus)](https://ithub.korean.go.kr/)