---
base_model: PY007/TinyLlama-1.1B-Chat-v0.3
datasets:
- cerebras/SlimPajama-627B
- bigcode/starcoderdata
- OpenAssistant/oasst_top1_2023-08-25
inference: false
language:
- en
license: apache-2.0
model_creator: Zhang Peiyuan
model_name: TinyLlama 1.1B Chat v0.3
model_type: tinyllama
prompt_template: '<|im_start|>system

  {system_message}<|im_end|>

  <|im_start|>user

  {prompt}<|im_end|>

  <|im_start|>assistant

  '
quantized_by: TheBloke
---

# TinyLlama 1.1B Chat v0.3 - GGUF

- Model creator: [Zhang Peiyuan](https://huggingface.co/PY007)
- Original model: [TinyLlama 1.1B Chat v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3)
- TheBloke quant: [TinyLlama-1.1B-Chat-v0.3-GGUF](https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v0.3-GGUF)

## Support for `calm`

These models support the [calm](https://github.com/iandennismiller/calm) language model runner. The quants in this repo were selected for use with calm, which automatically applies the right prompt template, context size, and other settings for each model.
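
## Example usage

A minimal sketch of loading one of these GGUF quants with [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) and applying the ChatML prompt template from the metadata above. The quant filename, context size, and generation settings are assumptions; adjust them to the file you actually download.

```python
from llama_cpp import Llama

# Assumed filename for the Q4_K_M quant from this repo; substitute the one you downloaded.
llm = Llama(
    model_path="tinyllama-1.1b-chat-v0.3.Q4_K_M.gguf",
    n_ctx=2048,  # assumed context size
)

system_message = "You are a helpful assistant."
prompt = "Write a haiku about quantization."

# Fill in the prompt template declared in the YAML front matter.
full_prompt = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{prompt}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)

output = llm(full_prompt, max_tokens=256, stop=["<|im_end|>"])
print(output["choices"][0]["text"])
```

Tools such as calm handle this templating automatically; the sketch above only shows what the template expands to when you drive the model directly.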