---
base_model: roneneldan/TinyStories-33M
library_name: Distily
tags:
- generated_from_trainer
model-index:
- name: distily_bench_obj_cross_v2.7
  results: []
---

# distily_bench_obj_cross_v2.7

This student model is distilled from the teacher model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) using the dataset (unspecified).

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 167.9577
- eval_frwikippl: 36527.4492
- eval_zhwikippl: 179398.4219
- eval_tinystoriesppl: 10.7634
- eval_loss: 1.3170
- eval_runtime: 6.482
- eval_samples_per_second: 77.137
- eval_steps_per_second: 9.719

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 0.004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0

### Resource Usage
Peak GPU Memory: 8.0557 GB

### Eval-Phase Metrics
| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
| 0 | 0 | 21812.5137 | 63073.9414 | 6.0880 | 6.5014 | 76.907 | 9.69 | 11544.7471 | 48756.5938 |
| 500 | 0.0808 | 207.5609 | 68082.6719 | 1.4006 | 6.4908 | 77.032 | 9.706 | 11.2631 | 330400.9375 |
| 1000 | 0.1616 | 175.8123 | 42073.3594 | 1.3250 | 6.5141 | 76.756 | 9.671 | 10.6938 | 234691.875 |
| 1500 | 0.2424 | 169.7431 | 37693.0312 | 1.3185 | 6.4776 | 77.19 | 9.726 | 10.7252 | 183317.4531 |
| 2000 | 0.3232 | 168.7467 | 36424.6914 | 1.3180 | 6.4817 | 77.14 | 9.72 | 10.8563 | 177257.4688 |
| 2500 | 0.4040 | 168.3811 | 36910.1914 | 1.3173 | 6.4798 | 77.163 | 9.723 | 10.8075 | 184790.6562 |
| 3000 | 0.4848 | 166.7585 | 36342.6992 | 1.3171 | 6.4837 | 77.116 | 9.717 | 10.6836 | 179207.0781 |
| 3500 | 0.5656 | 168.1269 | 36651.1523 | 1.3173 | 6.4924 | 77.014 | 9.704 | 10.7683 | 179589.9688 |
| 4000 | 0.6464 | 168.4202 | 36404.2031 | 1.3173 | 6.4897 | 77.045 | 9.708 | 10.7906 | 180599.0 |
| 4500 | 0.7272 | 167.8602 | 36342.6992 | 1.3173 | 6.5499 | 76.338 | 9.619 | 10.7915 | 178348.4531 |
| 5000 | 0.8080 | 167.6392 | 36548.0430 | 1.3173 | 6.5041 | 76.875 | 9.686 | 10.7287 | 180261.9219 |
| 5500 | 0.8888 | 168.0618 | 36527.4492 | 1.3174 | 6.5137 | 76.761 | 9.672 | 10.7701 | 179302.6406 |
| 6000 | 0.9696 | 168.0098 | 36527.4492 | 1.3170 | 6.4907 | 77.034 | 9.706 | 10.7678 | 179398.4219 |
| 6188 | 1.0 | 167.9577 | 36527.4492 | 1.3170 | 6.482 | 77.137 | 9.719 | 10.7634 | 179398.4219 |

### Framework versions
- Distily 0.2.0
- Transformers 4.44.0
- Pytorch 2.3.0
- Datasets 2.21.0
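
### Distillation objective (illustrative sketch)

With `weight=1` on the logits component and `weight=0` on the hidden-state and attention components, the objective above reduces to a pure KL divergence between the teacher's and student's next-token distributions. The following is a minimal PyTorch sketch of such a loss; it is not Distily's actual implementation, which may differ in temperature scaling, masking, or reduction details:

```python
import torch
import torch.nn.functional as F


def kl_logits_loss(student_logits: torch.Tensor, teacher_logits: torch.Tensor) -> torch.Tensor:
    """Forward KL(teacher || student), averaged per token position.

    Illustrative only; LossComponent(loss_fn=kl) in Distily may differ.
    """
    vocab = student_logits.size(-1)
    # Flatten (batch, seq, vocab) -> (batch*seq, vocab) so that the
    # `batchmean` reduction averages over every token position.
    s = F.log_softmax(student_logits.reshape(-1, vocab), dim=-1)
    t = F.log_softmax(teacher_logits.reshape(-1, vocab), dim=-1)
    # kl_div(input=log q, target=log p, log_target=True) computes KL(p || q).
    return F.kl_div(s, t, log_target=True, reduction="batchmean")
```

In a training step this would be applied as `loss = kl_logits_loss(student(input_ids).logits, teacher(input_ids).logits)`, with the teacher's forward pass wrapped in `torch.no_grad()`.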
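
### Perplexity metrics (illustrative sketch)

The `*ppl` columns in the table above are perplexities on held-out corpora (English, French, and Chinese Wikipedia, plus TinyStories). As a rough illustration of what such a metric measures, perplexity over a text is the exponential of the mean next-token cross-entropy; this generic sketch is not the exact evaluation harness Distily uses:

```python
import torch


@torch.no_grad()
def perplexity(model, tokenizer, text: str) -> float:
    """exp(mean next-token cross-entropy) over a single text sample."""
    enc = tokenizer(text, return_tensors="pt")
    # For transformers causal LMs, passing labels=input_ids returns the
    # mean next-token cross-entropy (the shift is handled internally).
    out = model(**enc, labels=enc["input_ids"])
    return torch.exp(out.loss).item()
```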
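
### Example usage

The card does not include a usage snippet. Assuming the student is published on the Hugging Face Hub as a standard `transformers` causal LM (the repo id below is a placeholder), loading and sampling from it would look like:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the actual Hub path of this student model.
model_id = "distily_bench_obj_cross_v2.7"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```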