---
license: mit
tags:
- generated_from_trainer
model-index:
- name: Clickbait3
  results: []
---

# Clickbait3

This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0248

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 0.05  | 50   | 0.0373          |
| No log        | 0.1   | 100  | 0.0320          |
| No log        | 0.15  | 150  | 0.0295          |
| No log        | 0.21  | 200  | 0.0302          |
| No log        | 0.26  | 250  | 0.0331          |
| No log        | 0.31  | 300  | 0.0280          |
| No log        | 0.36  | 350  | 0.0277          |
| No log        | 0.41  | 400  | 0.0316          |
| No log        | 0.46  | 450  | 0.0277          |
| 0.0343        | 0.51  | 500  | 0.0276          |
| 0.0343        | 0.56  | 550  | 0.0282          |
| 0.0343        | 0.62  | 600  | 0.0280          |
| 0.0343        | 0.67  | 650  | 0.0271          |
| 0.0343        | 0.72  | 700  | 0.0264          |
| 0.0343        | 0.77  | 750  | 0.0265          |
| 0.0343        | 0.82  | 800  | 0.0260          |
| 0.0343        | 0.87  | 850  | 0.0263          |
| 0.0343        | 0.92  | 900  | 0.0259          |
| 0.0343        | 0.97  | 950  | 0.0277          |
| 0.0278        | 1.03  | 1000 | 0.0281          |
| 0.0278        | 1.08  | 1050 | 0.0294          |
| 0.0278        | 1.13  | 1100 | 0.0256          |
| 0.0278        | 1.18  | 1150 | 0.0258          |
| 0.0278        | 1.23  | 1200 | 0.0254          |
| 0.0278        | 1.28  | 1250 | 0.0265          |
| 0.0278        | 1.33  | 1300 | 0.0252          |
| 0.0278        | 1.38  | 1350 | 0.0251          |
| 0.0278        | 1.44  | 1400 | 0.0264          |
| 0.0278        | 1.49  | 1450 | 0.0262          |
| 0.023         | 1.54  | 1500 | 0.0272          |
| 0.023         | 1.59  | 1550 | 0.0278          |
| 0.023         | 1.64  | 1600 | 0.0255          |
| 0.023         | 1.69  | 1650 | 0.0258          |
| 0.023         | 1.74  | 1700 | 0.0262          |
| 0.023         | 1.79  | 1750 | 0.0250          |
| 0.023         | 1.85  | 1800 | 0.0253          |
| 0.023         | 1.9   | 1850 | 0.0271          |
| 0.023         | 1.95  | 1900 | 0.0248          |
| 0.023         | 2.0   | 1950 | 0.0258          |
| 0.0224        | 2.05  | 2000 | 0.0252          |
| 0.0224        | 2.1   | 2050 | 0.0259          |
| 0.0224        | 2.15  | 2100 | 0.0254          |
| 0.0224        | 2.21  | 2150 | 0.0260          |
| 0.0224        | 2.26  | 2200 | 0.0254          |
| 0.0224        | 2.31  | 2250 | 0.0266          |
| 0.0224        | 2.36  | 2300 | 0.0258          |
| 0.0224        | 2.41  | 2350 | 0.0258          |
| 0.0224        | 2.46  | 2400 | 0.0256          |

### Framework versions

- Transformers 4.17.0
- Pytorch 1.11.0
- Datasets 2.0.0
- Tokenizers 0.11.6
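
### Example: hyperparameters as `TrainingArguments` (sketch)

The values listed under "Training hyperparameters" map onto `transformers.TrainingArguments` roughly as shown below. This is a minimal sketch, not the original training script: the dataset, preprocessing, and task head are not documented in this card, and `output_dir` and the 50-step evaluation cadence are assumptions inferred from the results table.

```python
from transformers import TrainingArguments

# Hypothetical mapping of the hyperparameters listed above; the actual
# training script for Clickbait3 is not documented in this card.
training_args = TrainingArguments(
    output_dir="Clickbait3",            # assumption: any output directory
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                     # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    evaluation_strategy="steps",
    eval_steps=50,                      # assumption: matches the 50-step cadence in the results table
)
```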
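
### Example: loading the model (sketch)

The card does not state which task this checkpoint was fine-tuned for. The sketch below assumes a sequence-classification head (e.g. clickbait vs. not clickbait, suggested only by the model name) and uses a hypothetical repository id `your-username/Clickbait3`; adjust both to match the actual checkpoint and its label mapping.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repo id; replace with the actual location of this checkpoint.
model_id = "your-username/Clickbait3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

headline = "You won't believe what happened next!"
inputs = tokenizer(headline, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# The label mapping is not documented in this card, so the index alone
# does not tell you which class means "clickbait".
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```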