model-index:
- name: cp2025

Model Description

This is a small language model that uses the Llama 2 architecture and has only about 30 million parameters. It was trained on approximately 8 billion tokens of diverse web data, drawn from the first 4,000,000 rows of the uncleaned English split of the C4 dataset. The model has a context length of 2048 tokens.
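
Below is a minimal usage sketch with the Hugging Face transformers library. The repository id cpayne1303/cp2025 is taken from this card; the prompt and generation settings (sampling, temperature, token budget) are illustrative assumptions, not recommendations from the model author.

```python
# Minimal sketch: load cp2025 with transformers and generate text.
# Assumes the checkpoint at cpayne1303/cp2025 (per this card) loads via the
# standard causal-LM auto classes, as is typical for llama2-architecture models.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cpayne1303/cp2025"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

prompt = "The quick brown fox"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,   # stays well under the 2048-token context length
    do_sample=True,      # sampling settings here are illustrative
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```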

Model size: 31.2M params (Safetensors, tensor type F32)

Dataset used to train cpayne1303/cp2025: C4 (English, uncleaned)