---
language:
- en
- zh
license: llama3
library_name: transformers
base_model: unsloth/llama-3-8b-bnb-4bit
datasets:
- erhwenkuo/alpaca-data-gpt4-chinese-zhtw
pipeline_tag: text-generation
tags:
- llama-3
prompt_template: >-
{{ if .System }}<|start_header_id|>system<|end_header_id|> {{ .System
}}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>
---
# Llama 3 8B, fine-tuned to output Traditional Chinese
## ✨ Recommended: use LM Studio for this model
I tried running it with Ollama, but the output became very incoherent,
so stick with LM Studio for now :)
Performance is not great, but the model can answer some basic questions. Sometimes it just acts really dumb :(