---
license: apache-2.0
base_model: mistralai/Mistral-7B-v0.1
tags:
- generated_from_trainer
model-index:
- name: Mistral_Sparse_refined_web_50p_graceful_True
  results: []
---

# Mistral_Sparse_refined_web_50p_graceful_True

This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3270

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 0
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 5000

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.7944 | 0.0 | 25 | 2.4024 |
| 3.7251 | 0.01 | 50 | 2.3892 |
| 2.3685 | 0.01 | 75 | 2.5422 |
| 2.2521 | 0.02 | 100 | 2.4731 |
| 2.2468 | 0.02 | 125 | 2.4459 |
| 2.3204 | 0.02 | 150 | 2.4365 |
| 2.2561 | 0.03 | 175 | 2.4258 |
| 2.2388 | 0.03 | 200 | 2.4174 |
| 2.2738 | 0.04 | 225 | 2.4128 |
| 2.3089 | 0.04 | 250 | 2.4093 |
| 2.2252 | 0.04 | 275 | 2.4057 |
| 2.2394 | 0.05 | 300 | 2.3994 |
| 2.2531 | 0.05 | 325 | 2.4003 |
| 2.0899 | 0.06 | 350 | 2.4004 |
| 2.2479 | 0.06 | 375 | 2.3982 |
| 2.2875 | 0.06 | 400 | 2.3987 |
| 2.282 | 0.07 | 425 | 2.3959 |
| 2.2434 | 0.07 | 450 | 2.3920 |
| 2.1592 | 0.08 | 475 | 2.3930 |
| 2.2374 | 0.08 | 500 | 2.3915 |
| 2.2968 | 0.08 | 525 | 2.3906 |
| 2.1904 | 0.09 | 550 | 2.3883 |
| 2.3101 | 0.09 | 575 | 2.3894 |
| 2.1126 | 0.1 | 600 | 2.3899 |
| 2.2092 | 0.1 | 625 | 2.3934 |
| 2.3005 | 0.1 | 650 | 2.3903 |
| 2.2779 | 0.11 | 675 | 2.3876 |
| 2.2523 | 0.11 | 700 | 2.3886 |
| 2.2307 | 0.12 | 725 | 2.3879 |
| 2.1317 | 0.12 | 750 | 2.3832 |
| 2.1893 | 0.12 | 775 | 2.3848 |
| 2.2732 | 0.13 | 800 | 2.3822 |
| 2.2914 | 0.13 | 825 | 2.3855 |
| 2.2633 | 0.14 | 850 | 2.3859 |
| 2.1339 | 0.14 | 875 | 2.3866 |
| 2.2065 | 0.14 | 900 | 2.3862 |
| 2.1436 | 0.15 | 925 | 2.3807 |
| 2.2782 | 0.15 | 950 | 2.3822 |
| 2.2001 | 0.16 | 975 | 2.3800 |
| 2.2608 | 0.16 | 1000 | 2.3785 |
| 2.2654 | 0.16 | 1025 | 2.3817 |
| 2.2662 | 0.17 | 1050 | 2.3783 |
| 2.3061 | 0.17 | 1075 | 2.3777 |
| 2.2297 | 0.18 | 1100 | 2.3763 |
| 2.2705 | 0.18 | 1125 | 2.3775 |
| 2.2261 | 0.18 | 1150 | 2.3759 |
| 2.2812 | 0.19 | 1175 | 2.3773 |
| 2.1363 | 0.19 | 1200 | 2.3753 |
| 2.2382 | 0.2 | 1225 | 2.3756 |
| 2.2064 | 0.2 | 1250 | 2.3744 |
| 2.2559 | 0.2 | 1275 | 2.3698 |
| 2.2875 | 0.21 | 1300 | 2.3730 |
| 2.2541 | 0.21 | 1325 | 2.3737 |
| 2.1415 | 0.22 | 1350 | 2.3732 |
| 2.2529 | 0.22 | 1375 | 2.3721 |
| 2.2271 | 0.22 | 1400 | 2.3752 |
| 2.1849 | 0.23 | 1425 | 2.3738 |
| 2.1707 | 0.23 | 1450 | 2.3728 |
| 2.1363 | 0.24 | 1475 | 2.3729 |
| 2.1778 | 0.24 | 1500 | 2.3731 |
| 2.1146 | 0.24 | 1525 | 2.3795 |
| 2.1843 | 0.25 | 1550 | 2.3775 |
| 2.3094 | 0.25 | 1575 | 2.3727 |
| 2.2488 | 0.26 | 1600 | 2.3758 |
| 2.226 | 0.26 | 1625 | 2.3723 |
| 2.3067 | 0.26 | 1650 | 2.3734 |
| 2.2167 | 0.27 | 1675 | 2.3760 |
| 2.2466 | 0.27 | 1700 | 2.3750 |
| 2.2446 | 0.28 | 1725 | 2.3775 |
| 2.2268 | 0.28 | 1750 | 2.3741 |
| 2.2113 | 0.28 | 1775 | 2.3733 |
| 2.1608 | 0.29 | 1800 | 2.3762 |
| 2.2354 | 0.29 | 1825 | 2.3758 |
| 2.2433 | 0.3 | 1850 | 2.3745 |
| 2.2266 | 0.3 | 1875 | 2.3769 |
| 2.2453 | 0.3 | 1900 | 2.3726 |
| 2.3001 | 0.31 | 1925 | 2.3713 |
| 2.2447 | 0.31 | 1950 | 2.3722 |
| 2.2708 | 0.32 | 1975 | 2.3730 |
| 2.1878 | 0.32 | 2000 | 2.3743 |
| 2.2041 | 0.32 | 2025 | 2.3751 |
| 2.1935 | 0.33 | 2050 | 2.3750 |
| 2.1981 | 0.33 | 2075 | 2.3744 |
| 2.2777 | 0.34 | 2100 | 2.3720 |
| 2.3121 | 0.34 | 2125 | 2.3725 |
| 2.2294 | 0.34 | 2150 | 2.3750 |
| 2.1802 | 0.35 | 2175 | 2.3772 |
| 2.214 | 0.35 | 2200 | 2.3738 |
| 2.1631 | 0.36 | 2225 | 2.3740 |
| 2.1546 | 0.36 | 2250 | 2.3764 |
| 2.2841 | 0.36 | 2275 | 2.3743 |
| 2.271 | 0.37 | 2300 | 2.3707 |
| 2.1627 | 0.37 | 2325 | 2.3719 |
| 2.2071 | 0.38 | 2350 | 2.3678 |
| 2.2423 | 0.38 | 2375 | 2.3703 |
| 2.2554 | 0.38 | 2400 | 2.3700 |
| 2.1057 | 0.39 | 2425 | 2.3720 |
| 2.0983 | 0.39 | 2450 | 2.3690 |
| 2.1844 | 0.4 | 2475 | 2.3686 |
| 2.2797 | 0.4 | 2500 | 2.3719 |
| 2.2749 | 0.4 | 2525 | 2.3707 |
| 2.1326 | 0.41 | 2550 | 2.3728 |
| 2.1461 | 0.41 | 2575 | 2.3693 |
| 2.2324 | 0.42 | 2600 | 2.3699 |
| 2.2412 | 0.42 | 2625 | 2.3690 |
| 2.28 | 0.42 | 2650 | 2.3696 |
| 2.261 | 0.43 | 2675 | 2.3666 |
| 2.2737 | 0.43 | 2700 | 2.3674 |
| 2.2653 | 0.44 | 2725 | 2.3671 |
| 2.2269 | 0.44 | 2750 | 2.3643 |
| 2.245 | 0.44 | 2775 | 2.3641 |
| 2.3077 | 0.45 | 2800 | 2.3665 |
| 2.2143 | 0.45 | 2825 | 2.3667 |
| 2.2595 | 0.46 | 2850 | 2.3662 |
| 2.1638 | 0.46 | 2875 | 2.3661 |
| 2.1935 | 0.46 | 2900 | 2.3645 |
| 2.2063 | 0.47 | 2925 | 2.3659 |
| 2.2755 | 0.47 | 2950 | 2.3664 |
| 2.1977 | 0.48 | 2975 | 2.3649 |
| 2.2519 | 0.48 | 3000 | 2.3616 |
| 2.3353 | 0.48 | 3025 | 2.3648 |
| 2.223 | 0.49 | 3050 | 2.3629 |
| 2.2614 | 0.49 | 3075 | 2.3611 |
| 2.2983 | 0.5 | 3100 | 2.3663 |
| 2.1907 | 0.5 | 3125 | 2.3650 |
| 2.2683 | 0.5 | 3150 | 2.3629 |
| 2.1609 | 0.51 | 3175 | 2.3633 |
| 2.2316 | 0.51 | 3200 | 2.3626 |
| 2.1589 | 0.52 | 3225 | 2.3603 |
| 2.1479 | 0.52 | 3250 | 2.3598 |
| 2.2401 | 0.52 | 3275 | 2.3617 |
| 2.2073 | 0.53 | 3300 | 2.3608 |
| 2.094 | 0.53 | 3325 | 2.3609 |
| 2.2297 | 0.54 | 3350 | 2.3592 |
| 2.1305 | 0.54 | 3375 | 2.3585 |
| 2.1517 | 0.54 | 3400 | 2.3598 |
| 2.1592 | 0.55 | 3425 | 2.3626 |
| 2.0812 | 0.55 | 3450 | 2.3636 |
| 2.219 | 0.56 | 3475 | 2.3633 |
| 2.2632 | 0.56 | 3500 | 2.3625 |
| 2.2302 | 0.56 | 3525 | 2.3616 |
| 2.1926 | 0.57 | 3550 | 2.3623 |
| 2.1878 | 0.57 | 3575 | 2.3630 |
| 2.3519 | 0.58 | 3600 | 2.3609 |
| 2.1699 | 0.58 | 3625 | 2.3599 |
| 2.3576 | 0.58 | 3650 | 2.3618 |
| 2.1629 | 0.59 | 3675 | 2.3637 |
| 2.2982 | 0.59 | 3700 | 2.3606 |
| 2.1949 | 0.6 | 3725 | 2.3649 |
| 2.135 | 0.6 | 3750 | 2.3618 |
| 2.0752 | 0.6 | 3775 | 2.3637 |
| 2.2786 | 0.61 | 3800 | 2.3643 |
| 2.0974 | 0.61 | 3825 | 2.3644 |
| 2.2097 | 0.62 | 3850 | 2.3621 |
| 2.1372 | 0.62 | 3875 | 2.3642 |
| 2.2502 | 0.62 | 3900 | 2.3652 |
| 2.1817 | 0.63 | 3925 | 2.3613 |
| 2.1891 | 0.63 | 3950 | 2.3659 |
| 2.2261 | 0.64 | 3975 | 2.3630 |
| 2.2826 | 0.64 | 4000 | 2.3591 |
| 2.2308 | 0.64 | 4025 | 2.3617 |
| 2.1944 | 0.65 | 4050 | 2.3615 |
| 2.1638 | 0.65 | 4075 | 2.3644 |
| 2.1741 | 0.66 | 4100 | 2.3600 |
| 2.2092 | 0.66 | 4125 | 2.3602 |
| 2.1921 | 0.66 | 4150 | 2.3610 |
| 2.1029 | 0.67 | 4175 | 2.3604 |
| 2.2553 | 0.67 | 4200 | 2.3578 |
| 2.1924 | 0.68 | 4225 | 2.3625 |
| 2.1914 | 0.68 | 4250 | 2.3619 |
| 2.2556 | 0.68 | 4275 | 2.3583 |
| 2.188 | 0.69 | 4300 | 2.3600 |
| 2.2339 | 0.69 | 4325 | 2.3594 |
| 2.2484 | 0.7 | 4350 | 2.3593 |
| 2.2761 | 0.7 | 4375 | 2.3585 |
| 2.2305 | 0.7 | 4400 | 2.3606 |
| 2.2434 | 0.71 | 4425 | 2.3586 |
| 2.2879 | 0.71 | 4450 | 2.3586 |
| 2.1643 | 0.72 | 4475 | 2.3639 |
| 2.1697 | 0.72 | 4500 | 2.3573 |
| 2.2665 | 0.72 | 4525 | 2.3574 |
| 2.2349 | 0.73 | 4550 | 2.3593 |
| 2.1459 | 0.73 | 4575 | 2.3554 |
| 2.1619 | 0.74 | 4600 | 2.3546 |
| 2.1292 | 0.74 | 4625 | 2.3542 |
| 2.3202 | 0.74 | 4650 | 2.3591 |
| 2.2165 | 0.75 | 4675 | 2.3562 |
| 2.233 | 0.75 | 4700 | 2.3566 |
| 2.2378 | 0.76 | 4725 | 2.3574 |
| 2.1236 | 0.76 | 4750 | 2.3554 |
| 2.2054 | 0.76 | 4775 | 2.3586 |
| 2.1807 | 0.77 | 4800 | 2.3543 |
| 2.2965 | 0.77 | 4825 | 2.3570 |
| 2.3543 | 0.78 | 4850 | 2.3562 |
| 2.1981 | 0.78 | 4875 | 2.3575 |
| 2.1156 | 0.78 | 4900 | 2.3586 |
| 2.1352 | 0.79 | 4925 | 2.3560 |
| 2.3162 | 0.79 | 4950 | 2.3607 |
| 2.1469 | 0.8 | 4975 | 2.3575 |
| 2.2243 | 0.8 | 5000 | 2.3591 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
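The hyperparameters above can be mapped onto `transformers.TrainingArguments`. The sketch below is a minimal reconstruction of only the listed values; the training data, the sparsity modification implied by the model name, and the rest of the `Trainer` setup are not documented in this card, and the `output_dir` is a hypothetical placeholder.

```python
# Hedged sketch: only the hyperparameters listed in this card are reproduced here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Mistral_Sparse_refined_web_50p_graceful_True",  # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=1,   # train_batch_size: 1 (x4 GPUs x4 accumulation = 16 total)
    per_device_eval_batch_size=1,    # eval_batch_size: 1 (x4 GPUs = 4 total)
    gradient_accumulation_steps=4,
    seed=0,
    max_steps=5000,                  # training_steps: 5000
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```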
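No inference example is provided in the original card. A minimal loading sketch with the Transformers version listed above is shown below, assuming the checkpoint is published as a standard causal-LM repository; the repository id is a placeholder taken from the model name, not a confirmed path.

```python
# Minimal loading sketch -- the repo id below is an assumption, not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Mistral_Sparse_refined_web_50p_graceful_True"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # assumes bf16 weights; adjust to the published dtype
    device_map="auto",
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```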