BaseModel

Model Generation

from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("AIdenU/LLAMA-2-13b-koen-Y24_v1.0", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("AIdenU/LLAMA-2-13b-koen-Y24_v1.0", use_fast=True)

systemPrompt = "당신은 유능한 AIμž…λ‹ˆλ‹€."  # "You are a capable AI."
prompt = "지렁이도 밟으면 κΏˆν‹€ν•˜λ‚˜μš”?"  # "Does even a worm flinch when stepped on?"
outputs = model.generate(
  **tokenizer(
    f"[INST] <<SYS>>\n{systemPrompt}\n<</SYS>>\n\n{prompt} [/INST] ",
    return_tensors='pt'
  ).to('cuda'),
  max_new_tokens=256,
  temperature=0.2,
  top_p=1,
  do_sample=True
)
print(tokenizer.decode(outputs[0]))
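
The decode call above prints the full sequence, prompt and [INST] wrapper included. A minimal sketch for printing only the newly generated answer, reusing the model, tokenizer, systemPrompt, and prompt defined above (the slicing step is not part of the original card):

inputs = tokenizer(
  f"[INST] <<SYS>>\n{systemPrompt}\n<</SYS>>\n\n{prompt} [/INST] ",
  return_tensors='pt'
).to(model.device)

outputs = model.generate(
  **inputs,
  max_new_tokens=256,
  temperature=0.2,
  top_p=1,
  do_sample=True
)

# outputs[0] holds the prompt tokens followed by the generated tokens,
# so slice off the prompt before decoding.
generated = outputs[0][inputs['input_ids'].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))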