# KoGPT2-Transformers
KoGPT2 on Huggingface Transformers
### KoGPT2-Transformers
- Makes [KoGPT2 (ver 1.0), released by SKT-AI](https://github.com/SKT-AI/KoGPT2), usable with [Transformers](https://github.com/huggingface/transformers).
- **SKT-AI has released KoGPT2 2.0: https://huggingface.co/skt/kogpt2-base-v2/**
### Demo
- Everyday conversation chatbot: http://demo.tmkor.com:36200/dialo
- Cosmetics review generation: http://demo.tmkor.com:36200/ctrl
### Example
```python
from transformers import GPT2LMHeadModel, PreTrainedTokenizerFast

model = GPT2LMHeadModel.from_pretrained("taeminlee/kogpt2")
tokenizer = PreTrainedTokenizerFast.from_pretrained("taeminlee/kogpt2")

# Encode a Korean prompt ("안녕" = "Hello") without special tokens
input_ids = tokenizer.encode("안녕", add_special_tokens=False, return_tensors="pt")

# Sample three continuations of up to 100 tokens each
output_sequences = model.generate(input_ids=input_ids, do_sample=True, max_length=100, num_return_sequences=3)

for generated_sequence in output_sequences:
    generated_sequence = generated_sequence.tolist()
    print("GENERATED SEQUENCE : {0}".format(tokenizer.decode(generated_sequence, clean_up_tokenization_spaces=True)))
```