---
license: mit
---

This is the float32, 15M-parameter Llama 2 architecture model trained on the TinyStories dataset. It was converted from karpathy/tinyllamas; see the llama2.c project for more details.
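
A minimal loading sketch, assuming the converted weights are published in Hugging Face Transformers format under the repo id `nickypro/tinyllama-15M-fp32` (an assumption based on this model card's title; if the repo only contains the raw llama2.c `.bin` checkpoint, use the llama2.c `run` program instead):

```python
# Sketch: load the 15M-parameter TinyStories model and generate a short story.
# Repo id and generation settings are illustrative assumptions, not confirmed by the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "nickypro/tinyllama-15M-fp32"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```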