# Synatra-kiqu-10.7B

## Join our discord

Server Link

## License

cc-by-sa-4.0

## Model Details

### Base Model
maywell/Synatra-10.7B-v0.4

### Trained On
A100 80GB * 8

This model was trained with GPU resources provided by Sionic AI.

## Instruction format

It follows the Alpaca instruction format.
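
For reference, below is a minimal sketch of the generic Alpaca-style prompt layout. This is an assumption for illustration only; the exact template bundled with this model's tokenizer (and applied by `apply_chat_template` in the code further down) is the authoritative source and may differ in details.

```python
# Sketch of the generic Alpaca-style prompt (assumed layout, not the model's
# exact chat_template).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(instruction="λ°”λ‚˜λ‚˜λŠ” μ›λž˜ ν•˜μ–€μƒ‰μ΄μ•Ό?")
```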

## Model Benchmark

TBD

## Implementation Code

Since the chat_template already contains the instruction format above, you can use the code below.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("maywell/Synatra-kiqu-10.7B")
tokenizer = AutoTokenizer.from_pretrained("maywell/Synatra-kiqu-10.7B")

messages = [
    {"role": "user", "content": "λ°”λ‚˜λ‚˜λŠ” μ›λž˜ ν•˜μ–€μƒ‰μ΄μ•Ό?"},  # "Are bananas originally white?"
]

# The tokenizer's chat_template applies the instruction format for you.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
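
Note that the decoded text above includes the prompt and any special tokens, since `generate` returns the full sequence for a causal LM. A minimal optional sketch of decoding only the newly generated reply (not part of the original example):

```python
# Optional: drop the prompt tokens and skip special tokens before decoding,
# so only the model's reply is printed.
response_ids = generated_ids[:, model_inputs.shape[-1]:]
print(tokenizer.batch_decode(response_ids, skip_special_tokens=True)[0])
```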