bengali_pos_v1_400000
This model is a fine-tuned version of mHossain/bengali_pos_v1_300000 on the pos_tag_100k dataset. It achieves the following results on the evaluation set:
- Loss: 0.5609
- Precision: 0.7830
- Recall: 0.7856
- F1: 0.7843
- Accuracy: 0.8402
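The card does not include usage code; below is a minimal inference sketch, assuming the checkpoint loads as a standard token-classification model through the `transformers` pipeline.

```python
# Minimal usage sketch (not from the model card): load the checkpoint for
# Bengali POS tagging with the Hugging Face token-classification pipeline.
from transformers import pipeline

pos_tagger = pipeline(
    "token-classification",
    model="mHossain/bengali_pos_v1_400000",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level tags
)

# Example Bengali sentence ("I eat rice"); the tag set depends on pos_tag_100k.
print(pos_tagger("আমি ভাত খাই"))
```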
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
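As a hedged sketch only (not the authors' training script), the listed values map onto `transformers.TrainingArguments` roughly as follows; `output_dir` and the per-epoch `evaluation_strategy` are assumptions, and data preparation plus the `Trainer` setup for pos_tag_100k are omitted.

```python
# Sketch of how the listed hyperparameters map onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bengali_pos_v1_400000",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default optimizer.
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table below
)
```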
Training results
| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.5908        | 1.0   | 22500 | 0.5513          | 0.7688    | 0.7698 | 0.7693 | 0.8289   |
| 0.4642        | 2.0   | 45000 | 0.5415          | 0.7799    | 0.7822 | 0.7810 | 0.8382   |
| 0.3773        | 3.0   | 67500 | 0.5609          | 0.7830    | 0.7856 | 0.7843 | 0.8402   |
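The card does not state how Precision, Recall, F1, and Accuracy were computed; a common choice for token-classification fine-tunes is the seqeval metric via the `evaluate` library, sketched below. The tag strings are hypothetical placeholders, not labels from pos_tag_100k.

```python
# Hedged sketch of metric computation with seqeval (assumed, not confirmed by the card).
# seqeval is chunk-oriented, so it warns when tags lack IOB prefixes but still computes
# overall precision/recall/F1 plus token-level accuracy.
import evaluate

seqeval = evaluate.load("seqeval")
predictions = [["NNC", "VM", "PU"]]  # hypothetical predicted tags for one sentence
references = [["NNC", "VF", "PU"]]   # hypothetical gold tags for the same sentence

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```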
Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0