I pruned my TinyLlama 1.1B Cinder v2 down from 22 layers to 14. At 14 layers the model produced no coherent text, but the beginnings of a response were emerging. I then trained on the Reason with Cinder dataset and pruned the model again to 11 layers, which again left only emerging responses. After that I trained on subsets of OpenOrca, ShareGPT, Cinder (again), and Tiny Textbooks. I am releasing this as a base model that may need further work. If you continue training it, please let me know on the TinyLlama Discord; I have some interesting plans for this model. The model uses the Zephyr chat format.
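
Below is a minimal sketch of the two pieces described above: a Zephyr-format generation call and a layer-pruning pass. It assumes a standard Llama-architecture checkpoint loaded with Hugging Face transformers; the `keep_layers` value, the prompt, and the output path are illustrative, not the exact script or data used for this model.

```python
# Minimal sketch, assuming a standard Llama-architecture checkpoint and
# Hugging Face transformers; keep_layers and the prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Josephgflowers/Tinyllama-616M-Cinder"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Inference with the Zephyr chat format used by this model.
prompt = (
    "<|system|>\nYou are a helpful assistant.</s>\n"
    "<|user|>\nWhat is layer pruning?</s>\n"
    "<|assistant|>\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Layer pruning in the style described above: keep only the first N
# transformer blocks, then continue training to recover coherence.
keep_layers = 11  # hypothetical value; the card describes going 22 -> 14 -> 11
model.model.layers = torch.nn.ModuleList(model.model.layers[:keep_layers])
model.config.num_hidden_layers = keep_layers
model.save_pretrained("tinyllama-cinder-pruned")  # illustrative output path
```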

Model size: 616M parameters
Tensor type: FP16 (Safetensors)