---
license: apache-2.0
---
## Overview

The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. This is the chat variant, finetuned on a diverse range of synthetic dialogues generated by ChatGPT.
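The TinyLlama chat finetune follows the Zephyr-style chat template (per the upstream model card). Chat endpoints apply this template for you, but if you send raw prompts to a plain completion endpoint you need to format them yourself. A minimal sketch, assuming the standard Zephyr role tokens; the helper name is illustrative:

```python
def format_chat(messages):
    """Render a list of {role, content} dicts into the Zephyr-style
    prompt format used by TinyLlama chat models."""
    parts = []
    for m in messages:
        # Each turn is tagged with its role and terminated with </s>.
        parts.append(f"<|{m['role']}|>\n{m['content']}</s>\n")
    # Open an assistant turn to cue the model to generate its reply.
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = format_chat([
    {"role": "system", "content": "You are a friendly chatbot."},
    {"role": "user", "content": "What is the capital of France?"},
])
print(prompt)
```

Sending an unformatted string instead tends to produce completions rather than conversational answers, since the model was finetuned only on dialogues in this format.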
## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | 1b-gguf | `cortex run tinyllama:1b-gguf` |
## Use it with Jan (UI)

1. Install Jan using Quickstart
2. Use in Jan model Hub: `cortexhub/tinyllama`
## Use it with Cortex (CLI)

1. Install Cortex using Quickstart
2. Run the model with the command: `cortex run tinyllama`
## Credits

- Author: TinyLlama team
- Converter: Homebrew
- Original License: Apache-2.0
- Papers: TinyLlama: An Open-Source Small Language Model