---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- amazon_polarity
metrics:
- accuracy
model-index:
- name: amazonPolarity_DistilBERT_5EE
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: amazon_polarity
      type: amazon_polarity
      config: amazon_polarity
      split: train
      args: amazon_polarity
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.94
---

# amazonPolarity_DistilBERT_5EE

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the amazon_polarity dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2899
- Accuracy: 0.94

A minimal inference example is sketched at the end of this card.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

A training sketch that mirrors these settings appears after the framework versions below.

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6581 | 0.03 | 50 | 0.5315 | 0.84 |
| 0.4321 | 0.05 | 100 | 0.2897 | 0.8933 |
| 0.298 | 0.08 | 150 | 0.3165 | 0.8667 |
| 0.2902 | 0.11 | 200 | 0.2552 | 0.9067 |
| 0.2824 | 0.13 | 250 | 0.2277 | 0.9133 |
| 0.2522 | 0.16 | 300 | 0.1998 | 0.94 |
| 0.2781 | 0.19 | 350 | 0.1933 | 0.94 |
| 0.2668 | 0.21 | 400 | 0.2316 | 0.92 |
| 0.2619 | 0.24 | 450 | 0.1968 | 0.9333 |
| 0.2446 | 0.27 | 500 | 0.1846 | 0.9467 |
| 0.2677 | 0.29 | 550 | 0.1818 | 0.94 |
| 0.2026 | 0.32 | 600 | 0.2348 | 0.9133 |
| 0.2351 | 0.35 | 650 | 0.2127 | 0.92 |
| 0.2685 | 0.37 | 700 | 0.1792 | 0.94 |
| 0.2141 | 0.4 | 750 | 0.2252 | 0.9133 |
| 0.2193 | 0.43 | 800 | 0.2131 | 0.9267 |
| 0.2456 | 0.45 | 850 | 0.2205 | 0.9133 |
| 0.2548 | 0.48 | 900 | 0.1788 | 0.94 |
| 0.2353 | 0.51 | 950 | 0.1954 | 0.9267 |
| 0.2546 | 0.53 | 1000 | 0.1815 | 0.9333 |
| 0.2583 | 0.56 | 1050 | 0.1654 | 0.9333 |
| 0.219 | 0.59 | 1100 | 0.1760 | 0.9467 |
| 0.2241 | 0.61 | 1150 | 0.2107 | 0.92 |
| 0.2201 | 0.64 | 1200 | 0.2381 | 0.8933 |
| 0.1745 | 0.67 | 1250 | 0.1944 | 0.92 |
| 0.2698 | 0.69 | 1300 | 0.1971 | 0.9267 |
| 0.214 | 0.72 | 1350 | 0.1944 | 0.9333 |
| 0.2436 | 0.75 | 1400 | 0.2079 | 0.92 |
| 0.2318 | 0.77 | 1450 | 0.2088 | 0.9333 |
| 0.2206 | 0.8 | 1500 | 0.1875 | 0.94 |
| 0.2593 | 0.83 | 1550 | 0.1797 | 0.9267 |
| 0.1908 | 0.85 | 1600 | 0.1924 | 0.9333 |
| 0.2378 | 0.88 | 1650 | 0.1649 | 0.9267 |
| 0.2332 | 0.91 | 1700 | 0.1768 | 0.94 |
| 0.2125 | 0.93 | 1750 | 0.2276 | 0.92 |
| 0.2174 | 0.96 | 1800 | 0.2035 | 0.9333 |
| 0.19 | 0.99 | 1850 | 0.1805 | 0.94 |
| 0.1515 | 1.01 | 1900 | 0.1832 | 0.94 |
| 0.1671 | 1.04 | 1950 | 0.1902 | 0.94 |
| 0.171 | 1.07 | 2000 | 0.2468 | 0.9267 |
| 0.1495 | 1.09 | 2050 | 0.2276 | 0.9267 |
| 0.1535 | 1.12 | 2100 | 0.1926 | 0.94 |
| 0.2085 | 1.15 | 2150 | 0.1878 | 0.94 |
| 0.1395 | 1.17 | 2200 | 0.1795 | 0.9467 |
| 0.1556 | 1.2 | 2250 | 0.1554 | 0.9467 |
| 0.1273 | 1.23 | 2300 | 0.1707 | 0.94 |
| 0.1873 | 1.25 | 2350 | 0.1867 | 0.9467 |
| 0.1589 | 1.28 | 2400 | 0.2089 | 0.9333 |
| 0.1426 | 1.31 | 2450 | 0.1797 | 0.9467 |
| 0.149 | 1.33 | 2500 | 0.1991 | 0.9333 |
| 0.1535 | 1.36 | 2550 | 0.2116 | 0.94 |
| 0.1671 | 1.39 | 2600 | 0.1704 | 0.9467 |
| 0.1582 | 1.41 | 2650 | 0.1843 | 0.94 |
| 0.1393 | 1.44 | 2700 | 0.1831 | 0.94 |
| 0.1474 | 1.47 | 2750 | 0.1895 | 0.94 |
| 0.203 | 1.49 | 2800 | 0.1843 | 0.9467 |
| 0.1562 | 1.52 | 2850 | 0.2060 | 0.9467 |
| 0.1886 | 1.55 | 2900 | 0.1837 | 0.94 |
| 0.1332 | 1.57 | 2950 | 0.1920 | 0.9467 |
| 0.1519 | 1.6 | 3000 | 0.1789 | 0.9533 |
| 0.1354 | 1.63 | 3050 | 0.1974 | 0.9467 |
| 0.125 | 1.65 | 3100 | 0.1890 | 0.9533 |
| 0.2044 | 1.68 | 3150 | 0.1755 | 0.9533 |
| 0.1746 | 1.71 | 3200 | 0.1607 | 0.9467 |
| 0.1981 | 1.73 | 3250 | 0.1613 | 0.9533 |
| 0.1276 | 1.76 | 3300 | 0.1825 | 0.96 |
| 0.1935 | 1.79 | 3350 | 0.1707 | 0.9533 |
| 0.1848 | 1.81 | 3400 | 0.1697 | 0.96 |
| 0.1596 | 1.84 | 3450 | 0.1581 | 0.9667 |
| 0.1797 | 1.87 | 3500 | 0.1634 | 0.96 |
| 0.1493 | 1.89 | 3550 | 0.1614 | 0.9533 |
| 0.1703 | 1.92 | 3600 | 0.1673 | 0.9467 |
| 0.1951 | 1.95 | 3650 | 0.1589 | 0.9533 |
| 0.1582 | 1.97 | 3700 | 0.1761 | 0.9467 |
| 0.1974 | 2.0 | 3750 | 0.1918 | 0.94 |
| 0.1056 | 2.03 | 3800 | 0.2063 | 0.94 |
| 0.1109 | 2.05 | 3850 | 0.2031 | 0.9467 |
| 0.113 | 2.08 | 3900 | 0.2118 | 0.9467 |
| 0.0834 | 2.11 | 3950 | 0.1974 | 0.9533 |
| 0.1434 | 2.13 | 4000 | 0.2075 | 0.9533 |
| 0.0691 | 2.16 | 4050 | 0.2178 | 0.9533 |
| 0.1144 | 2.19 | 4100 | 0.2383 | 0.9467 |
| 0.1446 | 2.21 | 4150 | 0.2207 | 0.9533 |
| 0.172 | 2.24 | 4200 | 0.2034 | 0.9467 |
| 0.1026 | 2.27 | 4250 | 0.2048 | 0.9467 |
| 0.1131 | 2.29 | 4300 | 0.2334 | 0.9467 |
| 0.121 | 2.32 | 4350 | 0.2367 | 0.9333 |
| 0.1144 | 2.35 | 4400 | 0.2313 | 0.9467 |
| 0.1089 | 2.37 | 4450 | 0.2352 | 0.9533 |
| 0.1193 | 2.4 | 4500 | 0.2440 | 0.94 |
| 0.0689 | 2.43 | 4550 | 0.2379 | 0.9333 |
| 0.1799 | 2.45 | 4600 | 0.2354 | 0.9467 |
| 0.1068 | 2.48 | 4650 | 0.2158 | 0.9533 |
| 0.0974 | 2.51 | 4700 | 0.2456 | 0.94 |
| 0.0637 | 2.53 | 4750 | 0.2191 | 0.9333 |
| 0.1125 | 2.56 | 4800 | 0.2390 | 0.9467 |
| 0.1706 | 2.59 | 4850 | 0.2407 | 0.94 |
| 0.1533 | 2.61 | 4900 | 0.2242 | 0.9533 |
| 0.1357 | 2.64 | 4950 | 0.2119 | 0.9533 |
| 0.1342 | 2.67 | 5000 | 0.2268 | 0.9467 |
| 0.0796 | 2.69 | 5050 | 0.2450 | 0.9467 |
| 0.1351 | 2.72 | 5100 | 0.2499 | 0.94 |
| 0.1285 | 2.75 | 5150 | 0.2252 | 0.94 |
| 0.1563 | 2.77 | 5200 | 0.2191 | 0.94 |
| 0.1022 | 2.8 | 5250 | 0.2256 | 0.9533 |
| 0.11 | 2.83 | 5300 | 0.2365 | 0.9467 |
| 0.0926 | 2.85 | 5350 | 0.2206 | 0.9467 |
| 0.1043 | 2.88 | 5400 | 0.2018 | 0.9533 |
| 0.1041 | 2.91 | 5450 | 0.2268 | 0.9467 |
| 0.1232 | 2.93 | 5500 | 0.2164 | 0.9467 |
| 0.1537 | 2.96 | 5550 | 0.1956 | 0.9533 |
| 0.1188 | 2.99 | 5600 | 0.2126 | 0.9467 |
| 0.0749 | 3.01 | 5650 | 0.2249 | 0.9467 |
| 0.062 | 3.04 | 5700 | 0.2254 | 0.9467 |
| 0.0755 | 3.07 | 5750 | 0.2472 | 0.94 |
| 0.0866 | 3.09 | 5800 | 0.2569 | 0.94 |
| 0.0502 | 3.12 | 5850 | 0.2481 | 0.9467 |
| 0.1158 | 3.15 | 5900 | 0.2457 | 0.94 |
| 0.0413 | 3.17 | 5950 | 0.2500 | 0.94 |
| 0.0966 | 3.2 | 6000 | 0.2851 | 0.9333 |
| 0.0613 | 3.23 | 6050 | 0.2717 | 0.9467 |
| 0.1029 | 3.25 | 6100 | 0.2714 | 0.94 |
| 0.0833 | 3.28 | 6150 | 0.2683 | 0.94 |
| 0.0928 | 3.31 | 6200 | 0.2490 | 0.9467 |
| 0.0571 | 3.33 | 6250 | 0.2575 | 0.9533 |
| 0.1252 | 3.36 | 6300 | 0.2599 | 0.9467 |
| 0.0788 | 3.39 | 6350 | 0.2522 | 0.9467 |
| 0.0862 | 3.41 | 6400 | 0.2489 | 0.9533 |
| 0.112 | 3.44 | 6450 | 0.2452 | 0.9533 |
| 0.0868 | 3.47 | 6500 | 0.2438 | 0.9533 |
| 0.0979 | 3.49 | 6550 | 0.2474 | 0.94 |
| 0.0739 | 3.52 | 6600 | 0.2508 | 0.94 |
| 0.0786 | 3.55 | 6650 | 0.2621 | 0.94 |
| 0.0872 | 3.57 | 6700 | 0.2543 | 0.9333 |
| 0.0962 | 3.6 | 6750 | 0.2347 | 0.9467 |
| 0.124 | 3.63 | 6800 | 0.2319 | 0.9533 |
| 0.0747 | 3.65 | 6850 | 0.2448 | 0.9533 |
| 0.0591 | 3.68 | 6900 | 0.2379 | 0.94 |
| 0.1049 | 3.71 | 6950 | 0.2493 | 0.9333 |
| 0.0772 | 3.73 | 7000 | 0.2429 | 0.94 |
| 0.071 | 3.76 | 7050 | 0.2558 | 0.94 |
| 0.1116 | 3.79 | 7100 | 0.2600 | 0.94 |
| 0.1199 | 3.81 | 7150 | 0.2480 | 0.94 |
| 0.0819 | 3.84 | 7200 | 0.2506 | 0.94 |
| 0.1054 | 3.87 | 7250 | 0.2431 | 0.94 |
| 0.09 | 3.89 | 7300 | 0.2582 | 0.9333 |
| 0.0936 | 3.92 | 7350 | 0.2460 | 0.94 |
| 0.0469 | 3.95 | 7400 | 0.2509 | 0.94 |
| 0.1101 | 3.97 | 7450 | 0.2545 | 0.9467 |
| 0.1077 | 4.0 | 7500 | 0.2640 | 0.9467 |
| 0.0777 | 4.03 | 7550 | 0.2709 | 0.94 |
| 0.0777 | 4.05 | 7600 | 0.2842 | 0.94 |
| 0.0847 | 4.08 | 7650 | 0.2649 | 0.94 |
| 0.0462 | 4.11 | 7700 | 0.2702 | 0.9467 |
| 0.0572 | 4.13 | 7750 | 0.2628 | 0.94 |
| 0.0435 | 4.16 | 7800 | 0.2689 | 0.9467 |
| 0.0566 | 4.19 | 7850 | 0.2727 | 0.9467 |
| 0.1149 | 4.21 | 7900 | 0.2635 | 0.9467 |
| 0.0557 | 4.24 | 7950 | 0.2665 | 0.9467 |
| 0.061 | 4.27 | 8000 | 0.2680 | 0.9467 |
| 0.0664 | 4.29 | 8050 | 0.2767 | 0.9467 |
| 0.0481 | 4.32 | 8100 | 0.2662 | 0.9467 |
| 0.0893 | 4.35 | 8150 | 0.2677 | 0.9467 |
| 0.0855 | 4.37 | 8200 | 0.2733 | 0.9467 |
| 0.0552 | 4.4 | 8250 | 0.2589 | 0.94 |
| 0.0469 | 4.43 | 8300 | 0.2733 | 0.94 |
| 0.0633 | 4.45 | 8350 | 0.2799 | 0.94 |
| 0.0629 | 4.48 | 8400 | 0.2838 | 0.94 |
| 0.0854 | 4.51 | 8450 | 0.2837 | 0.94 |
| 0.0596 | 4.53 | 8500 | 0.2808 | 0.94 |
| 0.0579 | 4.56 | 8550 | 0.2839 | 0.94 |
| 0.0508 | 4.59 | 8600 | 0.2844 | 0.94 |
| 0.0557 | 4.61 | 8650 | 0.2833 | 0.94 |
| 0.0383 | 4.64 | 8700 | 0.2878 | 0.94 |
| 0.0554 | 4.67 | 8750 | 0.2924 | 0.94 |
| 0.0681 | 4.69 | 8800 | 0.2868 | 0.94 |
| 0.065 | 4.72 | 8850 | 0.2888 | 0.94 |
| 0.0731 | 4.75 | 8900 | 0.2946 | 0.94 |
| 0.0638 | 4.77 | 8950 | 0.2886 | 0.94 |
| 0.043 | 4.8 | 9000 | 0.2867 | 0.94 |
| 0.0658 | 4.83 | 9050 | 0.2872 | 0.94 |
| 0.0249 | 4.85 | 9100 | 0.2882 | 0.94 |
| 0.0612 | 4.88 | 9150 | 0.2902 | 0.94 |
| 0.0271 | 4.91 | 9200 | 0.2890 | 0.94 |
| 0.0308 | 4.93 | 9250 | 0.2897 | 0.94 |
| 0.0896 | 4.96 | 9300 | 0.2898 | 0.94 |
| 0.1172 | 4.99 | 9350 | 0.2899 | 0.94 |

### Framework versions

- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1
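
### Training sketch

The card does not include the training script, so the snippet below is only a sketch of how the hyperparameters listed above map onto `TrainingArguments` with the Hugging Face `Trainer`. The subset sizes, the use of only the `content` field as input, and the 50-step evaluation interval taken from the results table are assumptions; the card does not document how the training and evaluation splits were sampled.

```python
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# amazon_polarity has "title", "content", and "label" columns; this sketch
# assumes only the review body ("content") was used as model input.
raw = load_dataset("amazon_polarity")
train_ds = raw["train"].shuffle(seed=42).select(range(30_000))  # placeholder subset size, not documented in the card
eval_ds = raw["test"].shuffle(seed=42).select(range(150))       # placeholder subset size, not documented in the card

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["content"], truncation=True)

train_ds = train_ds.map(tokenize, batched=True)
eval_ds = eval_ds.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

def compute_metrics(eval_pred):
    # Accuracy over the evaluation set, matching the metric reported above.
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {"accuracy": (preds == eval_pred.label_ids).mean()}

args = TrainingArguments(
    output_dir="amazonPolarity_DistilBERT_5EE",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # the results table logs validation metrics every 50 steps
    eval_steps=50,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
    compute_metrics=compute_metrics,
)

trainer.train()
```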
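
## Inference example

The card does not include usage instructions, so the snippet below is a minimal sketch. It assumes the model is published on the Hub under a repo ID ending in `amazonPolarity_DistilBERT_5EE` (substitute the actual path) and that the default label names are kept, so `LABEL_0`/`LABEL_1` correspond to amazon_polarity's negative/positive classes.

```python
from transformers import pipeline

# Hypothetical repo ID -- replace with the actual Hub path of this model.
classifier = pipeline(
    "text-classification",
    model="<your-username>/amazonPolarity_DistilBERT_5EE",
)

print(classifier("This product exceeded my expectations."))
# Output shape: [{'label': 'LABEL_1', 'score': ...}]
# LABEL_0 = negative, LABEL_1 = positive (assumed; the card does not set id2label).
```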