---
license: apache-2.0
base_model: google-bert/bert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: SingPurcBERT-Katch-0329-v2
  results: []
---

# SingPurcBERT-Katch-0329-v2

This model is a fine-tuned version of [google-bert/bert-base-multilingual-cased](https://huggingface.co/google-bert/bert-base-multilingual-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4923
- Accuracy: 0.7693
- F1: 0.7689

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch at the end of this card):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 0.4756        | 1.0   | 2522  | 0.4923          | 0.7693   | 0.7689 |
| 0.4137        | 2.0   | 5044  | 0.5065          | 0.7908   | 0.7906 |
| 0.388         | 3.0   | 7566  | 0.5142          | 0.7985   | 0.7985 |
| 0.3482        | 4.0   | 10088 | 0.6883          | 0.7971   | 0.7971 |
| 0.349         | 5.0   | 12610 | 0.8783          | 0.7908   | 0.7904 |
| 0.3124        | 6.0   | 15132 | 0.8833          | 0.7865   | 0.7864 |
| 0.2986        | 7.0   | 17654 | 0.9290          | 0.7880   | 0.7880 |
| 0.2404        | 8.0   | 20176 | 1.1548          | 0.7847   | 0.7846 |
| 0.2243        | 9.0   | 22698 | 1.2760          | 0.7837   | 0.7836 |
| 0.1488        | 10.0  | 25220 | 1.3746          | 0.7831   | 0.7830 |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.0.0
- Datasets 2.14.5
- Tokenizers 0.14.1
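
### Training configuration sketch

The hyperparameters listed above map onto a `transformers` `Trainer` setup roughly as follows. This is a minimal sketch, not the exact training script: the training/evaluation datasets, the label set (`num_labels=2` below), and the metric function are not documented in this card and are shown only as placeholders or assumptions.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "google-bert/bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Assumption: binary sequence classification; the real label set is not documented.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hyperparameters copied from the "Training hyperparameters" list above.
training_args = TrainingArguments(
    output_dir="SingPurcBERT-Katch-0329-v2",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    evaluation_strategy="epoch",  # the results table reports metrics once per epoch
)

# A Trainer would then be built with the (undocumented) datasets and a
# compute_metrics function returning accuracy and F1, e.g.:
# from transformers import Trainer
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=...,
#                   tokenizer=tokenizer, compute_metrics=...)
# trainer.train()
```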