---
tags:
- generated_from_trainer
model-index:
- name: icdar23-entrydetector_plaintext_breaks_indents_left_diff
  results: []
---

# icdar23-entrydetector_plaintext_breaks_indents_left_diff

This model is a fine-tuned version of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0091
- Ebegin: {'precision': 0.9909729187562688, 'recall': 0.9873417721518988, 'f1': 0.9891540130151844, 'number': 3002}
- Eend: {'precision': 0.986648865153538, 'recall': 0.9853333333333333, 'f1': 0.9859906604402935, 'number': 3000}
- Overall Precision: 0.9888
- Overall Recall: 0.9863
- Overall F1: 0.9876
- Overall Accuracy: 0.9979

## Model description

More information needed

## Intended uses & limitations

More information needed (a hedged inference sketch is given at the end of this card)

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch at the end of this card):
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 6000

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.07  | 300  | 0.0315          | 0.9611    | 0.9872 | 0.9740 | 0.9956   |
| 0.1635        | 0.14  | 600  | 0.0130          | 0.9850    | 0.9908 | 0.9879 | 0.9979   |
| 0.1635        | 0.21  | 900  | 0.0096          | 0.9818    | 0.9951 | 0.9884 | 0.9979   |
| 0.0194        | 0.29  | 1200 | 0.0074          | 0.9888    | 0.9908 | 0.9898 | 0.9982   |
| 0.0107        | 0.36  | 1500 | 0.0062          | 0.9885    | 0.9943 | 0.9914 | 0.9984   |
| 0.0107        | 0.43  | 1800 | 0.0082          | 0.9928    | 0.9870 | 0.9899 | 0.9982   |
| 0.0078        | 0.5   | 2100 | 0.0060          | 0.9860    | 0.9948 | 0.9904 | 0.9983   |
| 0.0078        | 0.57  | 2400 | 0.0064          | 0.9865    | 0.9941 | 0.9903 | 0.9983   |
| 0.0061        | 0.64  | 2700 | 0.0055          | 0.9938    | 0.9876 | 0.9907 | 0.9983   |

### Framework versions

- Transformers 4.26.0
- Pytorch 1.13.1+cu116
- Datasets 2.9.0
- Tokenizers 0.13.2
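
## How to use

The card does not yet document usage. The evaluation metrics above indicate a token-classification head whose entity types are `Ebegin` and `Eend` (entry begin/end markers), so a minimal inference sketch follows. The Hub repo id below is an assumption derived from the model name; substitute the actual repo id or a local checkpoint path, and note that the sample text is a hypothetical directory-style input.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Assumed repo id (hypothetical) -- replace with the real Hub id or a local path.
MODEL_ID = "HueyNemud/icdar23-entrydetector_plaintext_breaks_indents_left_diff"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

# Token-classification pipeline; "simple" aggregation merges sub-word pieces
# back into contiguous word-level spans.
detector = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Hypothetical directory-style input.
text = "Dupont (J.), négociant, rue de la Paix, 12. Durand (P.), avocat, bd Saint-Michel, 3."
for span in detector(text):
    print(span["entity_group"], f'{span["score"]:.3f}', text[span["start"]:span["end"]])
```

`aggregation_strategy="simple"` is used here because entry boundaries are span-level markers, and merging sub-word tokens makes it straightforward to map predictions back onto character offsets in the source text.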
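
## Reproducing the training arguments

The hyperparameters listed under "Training procedure" map directly onto `transformers.TrainingArguments`; the sketch below restates them under that assumption. `evaluation_strategy`, `eval_steps`, and `logging_steps` are inferred from the results table (evaluation every 300 steps; "No log" at step 300 with loss values repeating across adjacent rows suggests logging every 500 steps), not stated in the card. `output_dir` is a placeholder, and data loading, tokenization, and the `Trainer` call are omitted because the card does not document them.

```python
from transformers import TrainingArguments

# Restates the hyperparameters from this card (Transformers 4.26.0-era API).
training_args = TrainingArguments(
    output_dir="icdar23-entrydetector_plaintext_breaks_indents_left_diff",  # placeholder
    learning_rate=1e-4,              # learning_rate: 0.0001
    per_device_train_batch_size=2,   # train_batch_size: 2
    per_device_eval_batch_size=2,    # eval_batch_size: 2
    seed=42,
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # ... and epsilon=1e-08
    lr_scheduler_type="linear",
    max_steps=6000,                  # training_steps: 6000
    evaluation_strategy="steps",     # inferred from the results table, not stated in the card
    eval_steps=300,                  # inferred: one eval row every 300 steps
    logging_steps=500,               # inferred: "No log" at step 300, repeated loss values
)
```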