---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: tiny-mlm-imdb
  results: []
---

# tiny-mlm-imdb

This model is a fine-tuned version of [google/bert_uncased_L-2_H-128_A-2](https://huggingface.co/google/bert_uncased_L-2_H-128_A-2) on an unspecified dataset (recorded as `None` by the training script).
It achieves the following results on the evaluation set:
- Loss: 3.5540

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 200

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 4.2358        | 0.16  | 500   | 3.8225          |
| 4.1206        | 0.32  | 1000  | 3.7793          |
| 4.0857        | 0.48  | 1500  | 3.7520          |
| 4.0699        | 0.64  | 2000  | 3.7277          |
| 4.0378        | 0.8   | 2500  | 3.7125          |
| 4.0191        | 0.96  | 3000  | 3.7019          |
| 3.9747        | 1.12  | 3500  | 3.6871          |
| 3.9647        | 1.28  | 4000  | 3.6735          |
| 3.956         | 1.44  | 4500  | 3.6773          |
| 3.9574        | 1.6   | 5000  | 3.6580          |
| 3.9408        | 1.76  | 5500  | 3.6435          |
| 3.9421        | 1.92  | 6000  | 3.6419          |
| 3.9265        | 2.08  | 6500  | 3.6343          |
| 3.9198        | 2.24  | 7000  | 3.6306          |
| 3.9205        | 2.4   | 7500  | 3.6198          |
| 3.8985        | 2.56  | 8000  | 3.6158          |
| 3.9167        | 2.72  | 8500  | 3.6091          |
| 3.9111        | 2.88  | 9000  | 3.6073          |
| 3.8882        | 3.04  | 9500  | 3.5922          |
| 3.8761        | 3.2   | 10000 | 3.5908          |
| 3.8603        | 3.36  | 10500 | 3.5841          |
| 3.8621        | 3.52  | 11000 | 3.5835          |
| 3.8332        | 3.68  | 11500 | 3.5883          |
| 3.8523        | 3.84  | 12000 | 3.5798          |
| 3.8449        | 4.0   | 12500 | 3.5771          |
| 3.8284        | 4.16  | 13000 | 3.5653          |
| 3.8253        | 4.32  | 13500 | 3.5701          |
| 3.8021        | 4.48  | 14000 | 3.5681          |
| 3.8316        | 4.64  | 14500 | 3.5537          |
| 3.8318        | 4.8   | 15000 | 3.5609          |
| 3.82          | 4.96  | 15500 | 3.5579          |
| 3.8094        | 5.12  | 16000 | 3.5540          |

### Framework versions

- Transformers 4.25.1
- Pytorch 1.12.1
- Datasets 2.7.1
- Tokenizers 0.13.2
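
## How to use

A minimal inference sketch for this checkpoint as a masked language model. The path `tiny-mlm-imdb` is assumed from the model name above; on the Hub it will likely need an owner prefix, and locally it would be the trainer's output directory.

```python
# Minimal fill-mask sketch. "tiny-mlm-imdb" is an assumed repo id / local
# output directory for this checkpoint; adjust it to the actual path.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="tiny-mlm-imdb")

# BERT-style checkpoints use [MASK] as the mask token.
for pred in fill_mask("This movie was absolutely [MASK]."):
    print(f"{pred['token_str']}\t{pred['score']:.4f}")
```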
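
For reference, the sketch below wires the hyperparameters listed above into a `Trainer` setup. It is not the author's script: the card does not name the training data (it is recorded as `None`), so the `imdb` dataset is assumed here from the model name, and the tokenization details (column names, `max_length`) are illustrative.

```python
# Training-setup sketch under the assumptions stated above.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "google/bert_uncased_L-2_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

def tokenize(batch):
    # max_length=128 is illustrative; the card does not record it.
    return tokenizer(batch["text"], truncation=True, max_length=128)

# Assumption: the "imdb" dataset, inferred from the model name only.
dataset = load_dataset("imdb")
tokenized = dataset.map(tokenize, batched=True, remove_columns=["text", "label"])

# Adam betas=(0.9, 0.999) and epsilon=1e-8 from the card are the
# TrainingArguments defaults, so they are not set explicitly.
args = TrainingArguments(
    output_dir="tiny-mlm-imdb",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=200,
    evaluation_strategy="steps",  # validation loss every 500 steps,
    eval_steps=500,               # matching the results table above
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorForLanguageModeling(tokenizer),  # 15% masking by default
)
trainer.train()
```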