
μ£Όμ‹νšŒμ‚¬ ν•œμ†”λ°μ½”μ˜ 곡개 도메인 데이터셋을 토큰화 및 dpo ν•™μŠ΅ν•œ ν›„, moeλ₯Ό μ μš©ν•˜μ˜€μŠ΅λ‹ˆλ‹€.

  1. davidkim205/komt-mistral-7b-v1
  2. sosoai/hansoldeco-mistral-dpov1
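A quick way to check the MoE structure of the merged checkpoint is to inspect its configuration. The snippet below is a minimal sketch, assuming the merge produced a Mixtral-style config with local experts (attribute names may differ for other MoE architectures); the repo id is the one used in the usage example below.

```python
from transformers import AutoConfig

# Minimal sketch: inspect the merged checkpoint's configuration.
# Assumes a Mixtral-style MoE config; getattr() guards against
# attributes that may not exist for other architectures.
config = AutoConfig.from_pretrained("sosoai/hansoldeco-mistral-dpo-v1")
print(config.model_type)                             # e.g. "mixtral" for a Mistral-based MoE
print(getattr(config, "num_local_experts", None))    # number of expert FFN blocks
print(getattr(config, "num_experts_per_tok", None))  # experts routed per token
```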

Usage example

from transformers import AutoTokenizer, AutoModelForCausalLM
from transformers import TextStreamer, GenerationConfig

model_name='sosoai/hansoldeco-mistral-dpo-v1'
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)
streamer = TextStreamer(tokenizer)

# gen() wraps the prompt in Mistral's [INST] ... [/INST] format, streams tokens
# to stdout while generating, and returns the text that follows [/INST].
def gen(x):
    generation_config = GenerationConfig(
        temperature=0.1,
        top_p=0.8,
        top_k=100,
        max_new_tokens=256,
        early_stopping=True,
        do_sample=True,
        repetition_penalty=1.2,
    )
    q = f"[INST]{x} [/INST]"
    gened = model.generate(
        **tokenizer(
            q,
            return_tensors='pt',
            return_token_type_ids=False
        ).to(model.device),
        generation_config=generation_config,
        pad_token_id=tokenizer.eos_token_id,
        eos_token_id=tokenizer.eos_token_id,
        streamer=streamer,
    )
    result_str = tokenizer.decode(gened[0])

    # The prompt uses the Mistral [INST] ... [/INST] format, so the generated
    # answer begins after the closing [/INST] tag.
    start_tag = "[/INST]"
    start_index = result_str.find(start_tag)

    if start_index != -1:
        result_str = result_str[start_index + len(start_tag):].strip()
    return result_str

print(gen('λ§ˆκ°ν•˜μžλŠ” μ–΄λ–€ μ’…λ₯˜κ°€ μžˆλ‚˜μš”?'))
Model size: 12.9B params Β· Tensor type: BF16 (Safetensors)
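Since the weights are stored in BF16, memory use can be reduced by loading the model in that dtype explicitly. A minimal sketch, assuming `torch` is installed alongside `transformers`:

```python
import torch
from transformers import AutoModelForCausalLM

# Load the checkpoint in its native BF16 precision instead of the default fp32.
model = AutoModelForCausalLM.from_pretrained(
    "sosoai/hansoldeco-mistral-dpo-v1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```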