LLAMA 3 8B capable of outputting Traditional Chinese
✨ Recommend using LMStudio for this model
I tried running it with Ollama, but it became quite delulu, so for now I'm sticking with LMStudio :)

The performance isn't actually that great, but it's capable of answering some basic questions. Sometimes it just acts really dumb though :(
LLAMA 3.1 can actually output Chinese pretty well, so this repo can be ignored.
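If you want to poke at the model locally, here's a minimal sketch of querying it through LMStudio's OpenAI-compatible local server. It assumes the default endpoint `http://localhost:1234/v1` and that the model is already loaded in LMStudio; the model name and the prompts are just examples, not anything this repo requires.

```python
import json
import urllib.request

# Example chat request for LMStudio's OpenAI-compatible endpoint.
# The model identifier is an assumption -- use whatever name
# LMStudio displays after you load the model.
payload = {
    "model": "suko/Meta-Llama-3-8B-CHT",
    "messages": [
        # System prompt: "Please answer in Traditional Chinese."
        {"role": "system", "content": "請用繁體中文回答。"},
        # User prompt: "What are some fun sights in Taipei?"
        {"role": "user", "content": "台北有什麼好玩的景點？"},
    ],
    "temperature": 0.7,
}

def ask(url="http://localhost:1234/v1/chat/completions"):
    """Send the chat request and return the assistant's reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

Calling `ask()` while the LMStudio server is running should return the model's Traditional Chinese answer as a string.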
Model tree for suko/Meta-Llama-3-8B-CHT
- Base model: meta-llama/Meta-Llama-3-8B
- Quantized from: unsloth/llama-3-8b-bnb-4bit