
I cut my TinyLlama 1.1B Cinder v2 down from 22 layers to 14. At 14 layers the model produced no coherent text, but there were emerging ideas of a response. I then trained it on the Reason with Cinder dataset and pruned it again to 11 layers, at which point it again produced only emerging responses. I then trained on a subset of Open Orca, ShareGPT, Cinder (again), and Tiny Textbooks. I am releasing it as a base model that may need further work. If you continue training it, please let me know on the TinyLlama Discord; I have some interesting plans for this model. I use the Zephyr chat format, shown below.
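For anyone reproducing the pruning step, here is a minimal sketch of how decoder layers can be dropped from a Llama-architecture checkpoint with `transformers`. The repo path is a placeholder (not the actual Cinder repo id), and keeping the first N layers is an assumption; the card does not say which layers were removed.

```python
import torch
from transformers import AutoModelForCausalLM

# Load the full 22-layer checkpoint. The path below is a placeholder.
model = AutoModelForCausalLM.from_pretrained(
    "path/to/tinyllama-1.1b-cinder-v2",
    torch_dtype=torch.bfloat16,
)

keep = 14  # 14 for the first pruning pass, 11 for the second
# Keep the first `keep` decoder layers. This is one possible selection
# strategy; the original card does not specify which layers were dropped.
model.model.layers = torch.nn.ModuleList(list(model.model.layers)[:keep])
model.config.num_hidden_layers = keep

model.save_pretrained("tinyllama-cinder-pruned")
```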
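For reference, a sketch of the Zephyr chat format as it is commonly laid out (the system and user text here is purely illustrative):

```python
# Zephyr-style prompt: role headers followed by </s> turn separators.
prompt = (
    "<|system|>\n"
    "You are a helpful assistant.</s>\n"
    "<|user|>\n"
    "What is layer pruning?</s>\n"
    "<|assistant|>\n"
)
```

If the tokenizer ships with a chat template, `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` should produce the same layout.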

Model size: 616M params · Tensor type: BF16 · Format: Safetensors