tiny-mlm-wikitext-from-scratch-custom-tokenizer-target-conll2003
This model is a fine-tuned version of muhtasham/tiny-mlm-wikitext-from-scratch-custom-tokenizer on the conll2003 dataset. It achieves the following results on the evaluation set:
- Loss: 0.3451
- Precision: 0.3914
- Recall: 0.5631
- F1: 0.4618
- Accuracy: 0.8978
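
For quick inference, the checkpoint can be loaded with the transformers token-classification pipeline. A minimal sketch, assuming the model is hosted on the Hub under the repo id shown below (inferred from the model name; verify before use):

```python
from transformers import pipeline

# Assumption: the fine-tuned checkpoint lives under this repo id.
ner = pipeline(
    "token-classification",
    model="muhtasham/tiny-mlm-wikitext-from-scratch-custom-tokenizer-target-conll2003",
    aggregation_strategy="simple",  # merge sub-token predictions into word-level entities
)

print(ner("John lives in New York and works for the United Nations."))
```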
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- training_steps: 5000
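
As a rough reproduction aid, these settings map onto transformers' TrainingArguments as sketched below. The output_dir and the 500-step evaluation cadence are assumptions (the cadence is inferred from the results table); the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# A hedged sketch of the hyperparameters above; dataset loading,
# tokenization, and the model head are not shown here.
training_args = TrainingArguments(
    output_dir="tiny-mlm-wikitext-from-scratch-custom-tokenizer-target-conll2003",  # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="constant",
    max_steps=5000,
    evaluation_strategy="steps",  # assumption: evaluate every 500 steps, matching the table below
    eval_steps=500,
)
```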
Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.0009        | 1.14  | 500  | 0.6888          | 0.1156    | 0.1160 | 0.1158 | 0.8144   |
| 0.6084        | 2.28  | 1000 | 0.5797          | 0.2110    | 0.2735 | 0.2382 | 0.8417   |
| 0.5231        | 3.42  | 1500 | 0.5113          | 0.2567    | 0.3295 | 0.2886 | 0.8560   |
| 0.4552        | 4.56  | 2000 | 0.4575          | 0.2947    | 0.4061 | 0.3415 | 0.8701   |
| 0.4           | 5.69  | 2500 | 0.4172          | 0.3182    | 0.4615 | 0.3767 | 0.8802   |
| 0.3587        | 6.83  | 3000 | 0.3915          | 0.3378    | 0.4921 | 0.4006 | 0.8871   |
| 0.3263        | 7.97  | 3500 | 0.3719          | 0.3638    | 0.5296 | 0.4313 | 0.8918   |
| 0.2975        | 9.11  | 4000 | 0.3605          | 0.3687    | 0.5411 | 0.4385 | 0.8939   |
| 0.2748        | 10.25 | 4500 | 0.3509          | 0.3868    | 0.5471 | 0.4532 | 0.8969   |
| 0.2602        | 11.39 | 5000 | 0.3451          | 0.3914    | 0.5631 | 0.4618 | 0.8978   |
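
The card does not include the evaluation code, but for CoNLL-style NER these four metrics are conventionally computed with the seqeval-backed metric from the evaluate library: entity-level precision/recall/F1 plus token-level accuracy. A hedged sketch with illustrative IOB tags:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Predictions and references are lists of per-token IOB tags, one list per sentence.
predictions = [["O", "B-PER", "I-PER", "O", "B-LOC"]]
references  = [["O", "B-PER", "I-PER", "O", "B-ORG"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```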
Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.13.1+cu116
- Datasets 2.8.1.dev0
- Tokenizers 0.13.2