from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="heegyu/ajoublue-gpt2-medium-dialog"
)

generation_args = dict(
    max_new_tokens=32,
    do_sample=True,
    top_p=0.8
)

print(generator(
    [
        "0 : **λŠ” κ²Œμž„ μ’‹μ•„ν•˜λ‹ˆ</s>1 :",
        "0 : μ–΄μ œ κ°•λ‚¨μ—μ„œ 살인사건 λ‚¬λŒ€ γ…œγ…œ λ„ˆλ¬΄ λ¬΄μ„œμ›Œ</s>1 : 헐 μ™œ? 무슨 일 μžˆμ—ˆμ–΄?</s>0 : μ‚¬μ§„λ³΄λ‹ˆκΉŒ 막 ν”Όν˜λ¦¬λŠ” μ‚¬λžŒμžˆκ³  경찰듀이 λ– μ„œ μ œμ••ν•˜κ³  λ‚œλ¦¬λ„ μ•„λ‹ˆμ—ˆλ‹€λ˜λ°??</s>1 :",
        "0 : μžκΈ°μ•Ό μ–΄μ œλŠ” λ‚˜ν•œν…Œ μ™œ κ·Έλž¬μ–΄?</s>1 : λ­” 일 μžˆμ—ˆμ–΄?</s>0 : μ–΄λ–»κ²Œ λ‚˜ν•œν…Œ 말도 없이 그럴 수 μžˆμ–΄? λ‚˜ μ§„μ§œ μ‹€λ§ν–ˆμ–΄</s>1 : "
    ],
    **generation_args
))

[[{'generated_text': '0 : **λŠ” κ²Œμž„ μ’‹μ•„ν•˜λ‹ˆ</s>1 : κ²Œμž„ μ’‹μ•„ν•˜μ£ '}],
 [{'generated_text': '0 : μ–΄μ œ κ°•λ‚¨μ—μ„œ 살인사건 λ‚¬λŒ€ γ…œγ…œ λ„ˆλ¬΄ λ¬΄μ„œμ›Œ</s>1 : 헐 μ™œ? 무슨 일 μžˆμ—ˆμ–΄?</s>0 : μ‚¬μ§„λ³΄λ‹ˆκΉŒ 막 ν”Όν˜λ¦¬λŠ” μ‚¬λžŒμžˆκ³  경찰듀이 λ– μ„œ μ œμ••ν•˜κ³  λ‚œλ¦¬λ„ μ•„λ‹ˆμ—ˆλ‹€λ˜λ°??</s>1 : 헐 μ§„μ§œ 무섭닀 γ…œγ…œ μ™œ κ·Έλž¬λŒ€?'}],
 [{'generated_text': '0 : μžκΈ°μ•Ό μ–΄μ œλŠ” λ‚˜ν•œν…Œ μ™œ κ·Έλž¬μ–΄?</s>1 : λ­” 일 μžˆμ—ˆμ–΄?</s>0 : μ–΄λ–»κ²Œ λ‚˜ν•œν…Œ 말도 없이 그럴 수 μžˆμ–΄? λ‚˜ μ§„μ§œ μ‹€λ§ν–ˆμ–΄</s>1 :  μ•Œμ•˜μ–΄. κ·Έλƒ₯ μ„œμš΄ν•œκ±° 뿐이야.'}]]
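
The prompts interleave speaker tags "0 :" and "1 :" separated by </s>, leaving the next speaker's tag open so the model completes that turn. Below is a minimal sketch of wrapping this format in helper functions; the names build_prompt and generate_reply are illustrative only (not part of this model), and return_full_text is the standard pipeline option for returning just the newly generated text.

from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="heegyu/ajoublue-gpt2-medium-dialog"
)

def build_prompt(turns):
    # Alternate speaker tags "0 :" and "1 :", join turns with </s>,
    # and leave the next speaker's tag open for the model to complete.
    history = "</s>".join(f"{i % 2} : {turn}" for i, turn in enumerate(turns))
    return f"{history}</s>{len(turns) % 2} :"

def generate_reply(turns):
    prompt = build_prompt(turns)
    out = generator(
        prompt,
        max_new_tokens=32,
        do_sample=True,
        top_p=0.8,
        return_full_text=False  # return only the generated continuation
    )
    # Cut the reply at the next turn separator, if the model produced one.
    return out[0]["generated_text"].split("</s>")[0].strip()

print(generate_reply([
    "μ–΄μ œ κ°•λ‚¨μ—μ„œ 살인사건 λ‚¬λŒ€ γ…œγ…œ λ„ˆλ¬΄ λ¬΄μ„œμ›Œ",
    "헐 μ™œ? 무슨 일 μžˆμ—ˆμ–΄?",
    "μ‚¬μ§„λ³΄λ‹ˆκΉŒ 막 ν”Όν˜λ¦¬λŠ” μ‚¬λžŒμžˆκ³  경찰듀이 λ– μ„œ μ œμ••ν•˜κ³  λ‚œλ¦¬λ„ μ•„λ‹ˆμ—ˆλ‹€λ˜λ°??"
]))

With three turns in the history, len(turns) % 2 is 1, so the model is asked to continue speaker 1's turn, matching the examples above.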