
llama-3-youko-70b-instruct-exl2

EXL2 quantizations of rinna/llama-3-youko-70b-instruct, provided at the following bits per weight (bpw):

2.2bpw (significant quality loss; intended only for testing within 24 GB of VRAM)
4.0bpw
6.0bpw
8.0bpw
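
Each bpw variant is typically stored in its own branch named after the quantization level, as is common for EXL2 repositories; the branch name used below is an assumption based on the list above, so check the repository's branch list before relying on it. A minimal sketch of fetching one variant with huggingface_hub:

# Minimal sketch: download one EXL2 variant with huggingface_hub.
# The branch name "4.0bpw" is assumed from the list above, not confirmed by this card.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="RioShiina/llama-3-youko-70b-instruct-exl2",
    revision="4.0bpw",  # branch expected to hold the 4.0 bpw weights
    local_dir="llama-3-youko-70b-instruct-exl2-4.0bpw",
)
print(local_dir)

EXL2 weights are not loadable with plain transformers; they require an ExLlamaV2-based backend such as text-generation-webui or TabbyAPI.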

Prompt template

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

あなたは誠実で優秀なアシスタントです。どうか、簡潔かつ正直に答えてください。<|eot_id|><|start_header_id|>user<|end_header_id|>

西田幾多郎とはどんな人物ですか?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

(In English, the system prompt reads "You are a sincere and capable assistant. Please answer concisely and honestly." and the user asks "What kind of person is Kitaro Nishida?")
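
A minimal sketch of assembling this template in Python; build_prompt is an illustrative helper, not part of any library:

# Minimal sketch: build a Llama 3 style prompt string matching the template above.
# build_prompt is an illustrative name, not a library function.
def build_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "あなたは誠実で優秀なアシスタントです。どうか、簡潔かつ正直に答えてください。",
    "西田幾多郎とはどんな人物ですか?",
)

The same string can also be produced with tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True), using the tokenizer from the base rinna/llama-3-youko-70b-instruct repository, assuming it ships the standard Llama 3 chat template.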

Cite

@misc{rinna-llama-3-youko-70b-instruct,
    title = {rinna/llama-3-youko-70b-instruct},
    author = {Mitsuda, Koh and Chen, Xinqi and Wakatsuki, Toshiaki and Sawada, Kei},
    url = {https://huggingface.co/rinna/llama-3-youko-70b-instruct}
}

@inproceedings{sawada2024release,
    title = {Release of Pre-Trained Models for the {J}apanese Language},
    author = {Sawada, Kei and Zhao, Tianyu and Shing, Makoto and Mitsui, Kentaro and Kaga, Akio and Hono, Yukiya and Wakatsuki, Toshiaki and Mitsuda, Koh},
    booktitle = {Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)},
    month = {5},
    year = {2024},
    pages = {13898--13905},
    url = {https://aclanthology.org/2024.lrec-main.1213},
    note = {\url{https://arxiv.org/abs/2404.01657}}
}


References

@article{llama3modelcard,
    title = {Llama 3 Model Card},
    author = {AI@Meta},
    year = {2024},
    url = {https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}

@article{huang2023chat,
    title = {Chat Vector: A Simple Approach to Equip LLMs with Instruction Following and Model Alignment in New Languages},
    author = {Huang, Shih-Cheng and Li, Pin-Zu and Hsu, Yu-Chi and Chen, Kuang-Ming and Lin, Yu Tung and Hsiao, Shih-Kai and Tzong-Han Tsai, Richard and Lee, Hung-yi},
    year = {2023},
    url = {https://arxiv.org/abs/2310.04799}
}


License

Meta Llama 3 Community License
