---
base_model: monologg/koelectra-small-v3-discriminator
tags:
- generated_from_trainer
model-index:
- name: find_tune_bert_output
  results: []
---

# find_tune_bert_output

This model is a fine-tuned version of [monologg/koelectra-small-v3-discriminator](https://huggingface.co/monologg/koelectra-small-v3-discriminator) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2110
- Overall Precision: 0.8468
- Overall Recall: 0.8561
- Overall F1: 0.8514
- Overall Accuracy: 0.9405
- Loc F1: 0.9090
- Org F1: 0.7685
- Per F1: 0.8477

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 7

### Training results

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Loc F1 | Org F1 | Per F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|:------:|:------:|:------:|
| 0.2146        | 0.8   | 1000 | 0.2903          | 0.7632            | 0.8340         | 0.7970     | 0.9175           | 0.8729 | 0.6812 | 0.7966 |
| 0.2538        | 1.6   | 2000 | 0.2374          | 0.8183            | 0.8290         | 0.8236     | 0.9299           | 0.8940 | 0.7187 | 0.8178 |
| 0.2192        | 2.4   | 3000 | 0.2265          | 0.8246            | 0.8437         | 0.8340     | 0.9340           | 0.8956 | 0.7403 | 0.8322 |
| 0.1967        | 3.2   | 4000 | 0.2206          | 0.8261            | 0.8529         | 0.8393     | 0.9354           | 0.9047 | 0.7499 | 0.8290 |
| 0.1814        | 4.0   | 5000 | 0.2169          | 0.8371            | 0.8538         | 0.8453     | 0.9379           | 0.9057 | 0.7605 | 0.8388 |
| 0.1661        | 4.8   | 6000 | 0.2169          | 0.8403            | 0.8490         | 0.8446     | 0.9382           | 0.9050 | 0.7583 | 0.8378 |
| 0.1577        | 5.6   | 7000 | 0.2116          | 0.8413            | 0.8604         | 0.8507     | 0.9401           | 0.9088 | 0.7670 | 0.8472 |
| 0.1544        | 6.4   | 8000 | 0.2110          | 0.8468            | 0.8561         | 0.8514     | 0.9405           | 0.9090 | 0.7685 | 0.8477 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
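
### Configuration sketch

For reference, the hyperparameters listed above can be expressed as a `TrainingArguments` configuration. This is a minimal sketch, not the original training script: the dataset, tokenizer, and metric computation are omitted, and the `output_dir` simply reuses the model name.

```python
# Sketch of a TrainingArguments setup matching the hyperparameters above.
# Assumption: the optimizer (AdamW with betas=(0.9, 0.999), epsilon=1e-08) and the
# linear scheduler are the Transformers defaults, so they need no explicit flags.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="find_tune_bert_output",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=7,
)
```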
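
### Usage sketch

The card does not yet document intended usage, so the snippet below is only a sketch of how a token-classification checkpoint like this one is typically loaded with Transformers. It assumes the checkpoint carries location/organization/person entity labels, as the per-entity F1 scores suggest, and that `find_tune_bert_output` is the local path or Hub id of the saved model. Since the base model is a Korean ELECTRA discriminator, inputs should be Korean text.

```python
# Minimal inference sketch (hypothetical checkpoint path; see the note above).
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

checkpoint = "find_tune_bert_output"  # local output dir or Hub repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# aggregation_strategy="simple" merges sub-word tokens into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner("이순신은 조선의 장군이다."))  # "Yi Sun-sin was a general of Joseon."
```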