---
license: mit
---
This is a float32, 15M-parameter Llama 2 architecture model trained on the TinyStories dataset.
The weights were converted from
[karpathy/tinyllamas](https://huggingface.co/karpathy/tinyllamas).
See the [llama2.c](https://github.com/karpathy/llama2.c) project for more details.