---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- amazon_polarity
metrics:
- accuracy
model-index:
- name: amazonPolarity_ALBERT_5E
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: amazon_polarity
      type: amazon_polarity
      config: amazon_polarity
      split: train
      args: amazon_polarity
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9533333333333334
---

# amazonPolarity_ALBERT_5E

This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the amazon_polarity dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2404
- Accuracy: 0.9533

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
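These settings map onto `transformers.TrainingArguments` roughly as in the sketch below. This is not the original training script: the output directory and the steps-based evaluation/logging cadence are assumptions (the 50-step interval is inferred from the results table below), while the Adam betas, epsilon, and linear schedule listed above are the library defaults.

```python
from transformers import TrainingArguments

# Minimal sketch mapping the reported hyperparameters onto TrainingArguments.
# output_dir and the 50-step evaluation/logging cadence are illustrative
# assumptions; the optimizer settings and linear schedule above are defaults.
training_args = TrainingArguments(
    output_dir="amazonPolarity_ALBERT_5E",  # hypothetical output path
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="steps",
    eval_steps=50,
    logging_steps=50,
)
```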
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.43 | 0.05 | 50 | 0.4090 | 0.8467 |
| 0.2597 | 0.11 | 100 | 0.3132 | 0.8933 |
| 0.2517 | 0.16 | 150 | 0.2642 | 0.9 |
| 0.2218 | 0.21 | 200 | 0.1973 | 0.9333 |
| 0.21 | 0.27 | 250 | 0.2880 | 0.88 |
| 0.2076 | 0.32 | 300 | 0.2646 | 0.8933 |
| 0.2219 | 0.37 | 350 | 0.2053 | 0.94 |
| 0.2086 | 0.43 | 400 | 0.2122 | 0.92 |
| 0.1725 | 0.48 | 450 | 0.2145 | 0.92 |
| 0.2074 | 0.53 | 500 | 0.2174 | 0.9267 |
| 0.1966 | 0.59 | 550 | 0.2013 | 0.9467 |
| 0.1777 | 0.64 | 600 | 0.2352 | 0.9133 |
| 0.1695 | 0.69 | 650 | 0.2965 | 0.9133 |
| 0.177 | 0.75 | 700 | 0.2204 | 0.94 |
| 0.187 | 0.8 | 750 | 0.2328 | 0.9133 |
| 0.1721 | 0.85 | 800 | 0.1713 | 0.9267 |
| 0.1747 | 0.91 | 850 | 0.2365 | 0.9 |
| 0.1627 | 0.96 | 900 | 0.2202 | 0.9267 |
| 0.1421 | 1.01 | 950 | 0.2681 | 0.9133 |
| 0.1516 | 1.07 | 1000 | 0.2116 | 0.9333 |
| 0.1196 | 1.12 | 1050 | 0.1885 | 0.94 |
| 0.1444 | 1.17 | 1100 | 0.2121 | 0.9267 |
| 0.1198 | 1.23 | 1150 | 0.2335 | 0.9333 |
| 0.1474 | 1.28 | 1200 | 0.2348 | 0.9067 |
| 0.125 | 1.33 | 1250 | 0.2401 | 0.9267 |
| 0.117 | 1.39 | 1300 | 0.2041 | 0.9467 |
| 0.114 | 1.44 | 1350 | 0.1985 | 0.9467 |
| 0.1293 | 1.49 | 1400 | 0.1891 | 0.9533 |
| 0.1231 | 1.55 | 1450 | 0.2168 | 0.9467 |
| 0.1306 | 1.6 | 1500 | 0.2097 | 0.94 |
| 0.1449 | 1.65 | 1550 | 0.1790 | 0.9333 |
| 0.132 | 1.71 | 1600 | 0.1838 | 0.9333 |
| 0.124 | 1.76 | 1650 | 0.1890 | 0.94 |
| 0.1419 | 1.81 | 1700 | 0.1575 | 0.9533 |
| 0.139 | 1.87 | 1750 | 0.1794 | 0.94 |
| 0.1171 | 1.92 | 1800 | 0.1981 | 0.9533 |
| 0.1343 | 1.97 | 1850 | 0.1539 | 0.96 |
| 0.0924 | 2.03 | 1900 | 0.1875 | 0.9533 |
| 0.0662 | 2.08 | 1950 | 0.2658 | 0.9467 |
| 0.1024 | 2.13 | 2000 | 0.1869 | 0.9467 |
| 0.1051 | 2.19 | 2050 | 0.1967 | 0.94 |
| 0.1047 | 2.24 | 2100 | 0.1625 | 0.9533 |
| 0.0972 | 2.29 | 2150 | 0.1754 | 0.9533 |
| 0.0885 | 2.35 | 2200 | 0.1831 | 0.94 |
| 0.0999 | 2.4 | 2250 | 0.1830 | 0.9533 |
| 0.0628 | 2.45 | 2300 | 0.1663 | 0.96 |
| 0.0957 | 2.51 | 2350 | 0.1708 | 0.9467 |
| 0.0864 | 2.56 | 2400 | 0.1977 | 0.9467 |
| 0.0752 | 2.61 | 2450 | 0.2427 | 0.9467 |
| 0.0913 | 2.67 | 2500 | 0.2325 | 0.94 |
| 0.139 | 2.72 | 2550 | 0.1470 | 0.96 |
| 0.0839 | 2.77 | 2600 | 0.2193 | 0.94 |
| 0.1045 | 2.83 | 2650 | 0.1672 | 0.9533 |
| 0.0775 | 2.88 | 2700 | 0.1782 | 0.96 |
| 0.0909 | 2.93 | 2750 | 0.2241 | 0.94 |
| 0.1182 | 2.99 | 2800 | 0.1942 | 0.9533 |
| 0.0721 | 3.04 | 2850 | 0.1774 | 0.9533 |
| 0.0562 | 3.09 | 2900 | 0.1877 | 0.9467 |
| 0.0613 | 3.14 | 2950 | 0.1576 | 0.96 |
| 0.0433 | 3.2 | 3000 | 0.2294 | 0.9467 |
| 0.0743 | 3.25 | 3050 | 0.2050 | 0.9533 |
| 0.0568 | 3.3 | 3100 | 0.1770 | 0.9667 |
| 0.0785 | 3.36 | 3150 | 0.1732 | 0.96 |
| 0.0434 | 3.41 | 3200 | 0.2130 | 0.9533 |
| 0.0534 | 3.46 | 3250 | 0.1902 | 0.9667 |
| 0.0748 | 3.52 | 3300 | 0.2082 | 0.9333 |
| 0.0691 | 3.57 | 3350 | 0.1820 | 0.96 |
| 0.0493 | 3.62 | 3400 | 0.1933 | 0.9533 |
| 0.0388 | 3.68 | 3450 | 0.2319 | 0.94 |
| 0.0649 | 3.73 | 3500 | 0.2071 | 0.94 |
| 0.0369 | 3.78 | 3550 | 0.2092 | 0.9533 |
| 0.0381 | 3.84 | 3600 | 0.2171 | 0.9533 |
| 0.0461 | 3.89 | 3650 | 0.2430 | 0.9467 |
| 0.0682 | 3.94 | 3700 | 0.2372 | 0.9467 |
| 0.0438 | 4.0 | 3750 | 0.2335 | 0.9467 |
| 0.0293 | 4.05 | 3800 | 0.2337 | 0.9533 |
| 0.0313 | 4.1 | 3850 | 0.2349 | 0.9467 |
| 0.0467 | 4.16 | 3900 | 0.2806 | 0.94 |
| 0.0243 | 4.21 | 3950 | 0.2493 | 0.94 |
| 0.0409 | 4.26 | 4000 | 0.2460 | 0.9533 |
| 0.041 | 4.32 | 4050 | 0.2550 | 0.9533 |
| 0.0319 | 4.37 | 4100 | 0.2438 | 0.9533 |
| 0.0457 | 4.42 | 4150 | 0.2469 | 0.9533 |
| 0.0343 | 4.48 | 4200 | 0.2298 | 0.9533 |
| 0.0464 | 4.53 | 4250 | 0.2555 | 0.9467 |
| 0.0289 | 4.58 | 4300 | 0.2486 | 0.9533 |
| 0.0416 | 4.64 | 4350 | 0.2539 | 0.9533 |
| 0.0422 | 4.69 | 4400 | 0.2534 | 0.9533 |
| 0.037 | 4.74 | 4450 | 0.2492 | 0.9467 |
| 0.0387 | 4.8 | 4500 | 0.2406 | 0.9533 |
| 0.0472 | 4.85 | 4550 | 0.2411 | 0.9533 |
| 0.0404 | 4.9 | 4600 | 0.2419 | 0.9533 |
| 0.0267 | 4.96 | 4650 | 0.2404 | 0.9533 |

### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.0
- Datasets 2.6.1
- Tokenizers 0.13.1
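## How to use

A minimal inference sketch, assuming the checkpoint is published under a repository id such as `<namespace>/amazonPolarity_ALBERT_5E` (placeholder) and that the labels follow the amazon_polarity convention (0 = negative, 1 = positive); check the model's `config.json` for the actual `id2label` mapping.

```python
from transformers import pipeline

# Placeholder repository id; point this at wherever the fine-tuned
# checkpoint is actually hosted, or at a local directory.
classifier = pipeline(
    "text-classification",
    model="<namespace>/amazonPolarity_ALBERT_5E",
)

print(classifier("This product exceeded my expectations."))
# -> [{'label': ..., 'score': ...}]
# The label-to-sentiment mapping (e.g. LABEL_0 = negative, LABEL_1 = positive)
# should be confirmed against id2label in config.json.
```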