---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: rerun-09-19-2024-experiment-distill-tree-babylm2024-360-2
  results: []
---

# rerun-09-19-2024-experiment-distill-tree-babylm2024-360-2

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7367

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.00025
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 200
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 5.146         | 1.0   | 2065  | 5.6306          |
| 3.1038        | 2.0   | 4130  | 3.4614          |
| 2.4109        | 3.0   | 6195  | 2.7661          |
| 2.091         | 4.0   | 8260  | 2.3748          |
| 1.8375        | 5.0   | 10325 | 2.1678          |
| 1.7081        | 6.0   | 12390 | 1.9763          |
| 1.5419        | 7.0   | 14455 | 1.8331          |
| 1.4752        | 8.0   | 16520 | 1.7660          |
| 1.4168        | 9.0   | 18585 | 1.7420          |
| 1.4489        | 10.0  | 20650 | 1.7367          |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.1+cu121
- Tokenizers 0.19.1
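
For reference, a minimal sketch of how the hyperparameters listed above could be expressed as `TrainingArguments`. This is not the training script used for this run: the `output_dir` is a placeholder, the Adam betas/epsilon shown match the values listed above (which are also the `Trainer` defaults), and the model/dataset wiring is omitted because the card does not specify it.

```python
# Sketch only: maps the hyperparameters from "Training hyperparameters"
# onto TrainingArguments. output_dir is a placeholder, not the real path.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rerun-09-19-2024-experiment-distill-tree-babylm2024-360-2",
    learning_rate=2.5e-4,              # learning_rate: 0.00025
    per_device_train_batch_size=64,    # train_batch_size: 64
    per_device_eval_batch_size=8,      # eval_batch_size: 8
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                 # epsilon=1e-08
    lr_scheduler_type="cosine",
    warmup_steps=200,                  # lr_scheduler_warmup_steps: 200
    num_train_epochs=10,
    fp16=True,                         # "Native AMP"; requires a CUDA device
)
```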
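
A hedged usage sketch, assuming the checkpoint is a causal language model published under the model name above; the card does not state the architecture, task, or hub repo id, so all of these are assumptions.

```python
# Assumes a causal LM checkpoint under this repo id; neither is confirmed
# by the card, so adjust the Auto class and id to the actual model.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "rerun-09-19-2024-experiment-distill-tree-babylm2024-360-2"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)

inputs = tokenizer("The child picked up the", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```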