---
license: apache-2.0
---
# Neuron v0.2 1B
This is my fine-tune of Granite 4.0 1B. I would say it is a solid improvement over my last Neuron release (at least for a model of only 1B params). Like its predecessor, it is trained for natural chat.
## Training data
The data is similar to Neuron v0: I modified and scaled up my own dataset a bit and added new training data from other sources. I also now keep a much simpler local compilation of the data I am using.
- Private Dataset
- PIPPA
- anthracite-org/stheno-filtered-v1.1
- jondurbin/gutenberg-dpo-v0.1
- NousResearch/Hermes-3-Dataset