# deabuse

## Model description
A model for blurring (masking) profane words in Russian sentences, fine-tuned from ai-forever/ruT5-base on a synthetic dataset of 40k examples. It does not blur full sentences especially well, but it handles individual words reasonably well.
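A minimal usage sketch with 🤗 Transformers is below. The generation settings are assumptions, and whether the model expects a task prefix is not documented on this card, so the raw sentence is passed as-is.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "wyluilipe/deabuse"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The input format is an assumption: the card does not specify
# whether a task prefix is required, so the sentence is fed directly.
text = "пример предложения"
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```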
## Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them to `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1
- num_epochs: 3
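For reference, the values above map onto `Seq2SeqTrainingArguments` roughly as in the sketch below. Dataset loading, preprocessing, and the `Trainer` setup are omitted, and `output_dir` is a hypothetical placeholder, not from the original card.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the reported hyperparameters;
# "deabuse-out" is a hypothetical output directory.
training_args = Seq2SeqTrainingArguments(
    output_dir="deabuse-out",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 1 * 4 = 4
    adam_beta1=0.9,                 # Adam betas/epsilon as reported
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1,
    num_train_epochs=3,
)
```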
## Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
5.7858 | 0.05 | 400 | 0.7427 |
4.3608 | 0.1 | 800 | 0.4047 |
3.2919 | 0.15 | 1200 | 0.2366 |
3.2244 | 0.2 | 1600 | 0.2164 |
3.2491 | 0.25 | 2000 | 0.1757 |
1.6803 | 0.3 | 2400 | 0.1494 |
3.2828 | 0.35 | 2800 | 0.1500 |
3.39 | 0.4 | 3200 | 0.1510 |
0.0933 | 0.45 | 3600 | 0.1524 |
3.4757 | 0.5 | 4000 | 0.1423 |
3.1424 | 0.55 | 4400 | 0.1460 |
0.9616 | 0.6 | 4800 | 0.1178 |
2.6271 | 0.65 | 5200 | 0.1178 |
1.1441 | 0.7 | 5600 | 0.1190 |
3.018 | 0.75 | 6000 | 0.1136 |
1.3421 | 0.8 | 6400 | 0.0936 |
2.3062 | 0.85 | 6800 | 0.0994 |
2.5594 | 0.9 | 7200 | 0.0945 |
2.1381 | 0.95 | 7600 | 0.1061 |
1.0893 | 1.0 | 8000 | 0.1029 |
0.7525 | 1.05 | 8400 | 0.0978 |
2.1886 | 1.1 | 8800 | 0.0840 |
1.9948 | 1.15 | 9200 | 0.0952 |
0.7933 | 1.2 | 9600 | 0.0871 |
2.0757 | 1.25 | 10000 | 0.0853 |
0.6129 | 1.31 | 10400 | 0.0857 |
0.1338 | 1.36 | 10800 | 0.0936 |
2.6454 | 1.41 | 11200 | 0.0834 |
0.4243 | 1.46 | 11600 | 0.0891 |
0.6615 | 1.51 | 12000 | 0.0885 |
0.6634 | 1.56 | 12400 | 0.0942 |
0.5665 | 1.61 | 12800 | 0.0808 |
0.6661 | 1.66 | 13200 | 0.1021 |
1.1028 | 1.71 | 13600 | 0.0820 |
1.5217 | 1.76 | 14000 | 0.0769 |
0.7644 | 1.81 | 14400 | 0.0771 |
1.3725 | 1.86 | 14800 | 0.0800 |
0.846 | 1.91 | 15200 | 0.0788 |
1.7207 | 1.96 | 15600 | 0.0806 |
0.9188 | 2.01 | 16000 | 0.0806 |
1.4303 | 2.06 | 16400 | 0.0814 |
0.1599 | 2.11 | 16800 | 0.1072 |
0.1976 | 2.16 | 17200 | 0.0823 |
0.7077 | 2.21 | 17600 | 0.0830 |
1.8896 | 2.26 | 18000 | 0.0768 |
0.6957 | 2.31 | 18400 | 0.0826 |
0.7827 | 2.36 | 18800 | 0.0802 |
1.3298 | 2.41 | 19200 | 0.0791 |
0.2254 | 2.46 | 19600 | 0.0871 |
1.041 | 2.51 | 20000 | 0.0809 |
1.5451 | 2.56 | 20400 | 0.0838 |
1.6318 | 2.61 | 20800 | 0.0801 |
1.8972 | 2.66 | 21200 | 0.0774 |
1.8895 | 2.71 | 21600 | 0.0762 |
0.7721 | 2.76 | 22000 | 0.0740 |
0.3528 | 2.81 | 22400 | 0.0781 |
1.325 | 2.86 | 22800 | 0.0770 |
0.0282 | 2.91 | 23200 | 0.0785 |
1.6303 | 2.96 | 23600 | 0.0760 |
## Framework versions
- Transformers 4.39.0.dev0
- Pytorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2
## Base model
[ai-forever/ruT5-base](https://huggingface.co/ai-forever/ruT5-base)