Layers 14, 15, and 16 were fine-tuned.

The dataset consisted of 13,862,816 tokens.

GPU used for fine-tuning: Tesla A100.
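Restricting training to those three decoder blocks can be sketched as a name filter over the model's parameters. The helper below is a minimal illustration, assuming the standard Hugging Face Llama parameter naming scheme (`model.layers.<idx>....`); it is not taken from the card itself.

```python
import re

# Decoder blocks that stay trainable (from the card: layers 14, 15, 16).
TRAINABLE_LAYERS = {14, 15, 16}

def should_train(param_name: str, trainable=TRAINABLE_LAYERS) -> bool:
    """True if the parameter belongs to one of the trainable decoder blocks.

    Assumes the usual Hugging Face Llama naming convention,
    e.g. 'model.layers.14.self_attn.q_proj.weight'.
    """
    match = re.match(r"model\.layers\.(\d+)\.", param_name)
    return bool(match) and int(match.group(1)) in trainable

# Applied to a loaded model, this would look like:
# for name, param in model.named_parameters():
#     param.requires_grad = should_train(name)
```

Everything outside the selected blocks (embeddings, the other layers, the LM head) keeps `requires_grad = False`, so only the three blocks receive gradient updates.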

Training

{'loss': 0.8521, 'grad_norm': 0.5644629001617432, 'learning_rate': 2.9148375768217733e-05, 'epoch': 1.29}
{'loss': 0.6742, 'grad_norm': 0.5370610952377319, 'learning_rate': 7.199297629499562e-06, 'epoch': 2.58}
{'train_runtime': 5708.2175, 'train_samples_per_second': 22.869, 'train_steps_per_second': 0.204, 'train_loss': 0.7442483934749853, 'epoch': 3.0}
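The final log entry above lets the run's rough totals be backed out by multiplying the reported rates by the runtime. The values below are derived approximations, not figures stated on the card:

```python
# Final training log entry, copied from the card.
final = {
    'train_runtime': 5708.2175,           # seconds (~1.6 hours)
    'train_samples_per_second': 22.869,
    'train_steps_per_second': 0.204,
    'epoch': 3.0,
}

# Approximate totals across all 3 epochs.
total_samples = final['train_runtime'] * final['train_samples_per_second']
total_steps = final['train_runtime'] * final['train_steps_per_second']
```

This works out to roughly 130 thousand samples and about 1,160 optimizer steps over the whole run.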
Model size: 1.24B params (Safetensors, F32).

