This model is a further fine-tuned version of the tzem-instruct model.
## Prompt template

```
{system}
**사용자:** {prompt}
**인공지능:**
```
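The template above can be filled in with a small helper. This is a sketch, not part of the model's code; the name `build_prompt` and the example strings are assumptions.

```python
def build_prompt(system: str, prompt: str) -> str:
    # Fill the tzem-instruct template:
    #   {system}\n**사용자:** {prompt}\n**인공지능:**
    # strip() removes the leading newline when no system message is given.
    return f"{system}\n**사용자:** {prompt}\n**인공지능:**".strip()

print(build_prompt("You are a helpful assistant.", "Hello"))
```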
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "blueapple8259/tzem-instruct-v1.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

system = "Enter your system message here"
prompt = "Enter your prompt here"

# Assemble the prompt template: {system}\n**사용자:** {prompt}\n**인공지능:**
text = f"{system}\n**사용자:** {prompt}\n**인공지능:**".strip()

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    pad_token_id=tokenizer.pad_token_id,
    eos_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_p=0.2,
)
output = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(output)
```
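Note that for decoder-only models, `model.generate` returns the prompt tokens followed by the continuation, so decoding `outputs[0]` echoes the prompt back. A minimal sketch of slicing off the prompt first, using made-up token ids in place of real tokenizer output:

```python
# Hypothetical token ids standing in for inputs["input_ids"][0] and outputs[0].
prompt_ids = [101, 9023, 3231]
full_sequence = [101, 9023, 3231, 650, 880, 2]

# generate() output begins with the prompt, so keep only the newly
# generated ids before passing them to tokenizer.decode.
new_ids = full_sequence[len(prompt_ids):]
print(new_ids)
```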
## Example
**사용자:** Tell me three tips for staying healthy.
**인공지능:** 1. Eat a balanced diet: eat a variety of fruits, vegetables, whole grains, lean protein, and low-fat dairy. Avoid processed foods, sugary drinks, and excessive amounts of saturated and trans fats.
2. Exercise regularly: do at least 30 minutes of moderate exercise five days a week. You should exercise at least three days a week, at least four days a week, at least one day a week.
3. Get enough sleep: most adults need 7–8 hours of sleep per night. Aim for 7–8 hours of sleep every day.
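Because the decoded output contains the full template, the assistant's answer can also be recovered at the string level by splitting on the `**인공지능:**` marker. A sketch, assuming the decoded string follows the format of the example above:

```python
# Hypothetical decoded output following this card's prompt template.
decoded = (
    "You are a helpful assistant.\n"
    "**사용자:** Give me three health tips.\n"
    "**인공지능:** 1. Eat a balanced diet."
)

# Everything after the assistant marker is the model's answer.
answer = decoded.split("**인공지능:**", 1)[-1].strip()
print(answer)
```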
## Dataset