---
license: apache-2.0
base_model: mistralai/Mistral-7B-Instruct-v0.1
tags:
- generated_from_trainer
datasets:
- openwebtext
model-index:
- name: Mistral_Sparse_pretraining_80_percent_10000
  results: []
---

# Mistral_Sparse_pretraining_80_percent_10000

This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.1 on the openwebtext dataset. It achieves the following results on the evaluation set:

- Loss: 0.6872
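
The card ships without a usage snippet, so here is a minimal loading sketch with 🤗 Transformers. The repo id below is an assumption inferred from the uploader's namespace and the model name above; replace it with the actual Hub id if it differs.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id (uploader namespace + model name); adjust if needed.
model_id = "lukeleeai/Mistral_Sparse_pretraining_80_percent_10000"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 7B model needs roughly 15 GB in bf16
    device_map="auto",
)

inputs = tokenizer("Open web text is", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```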

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
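
The card leaves this section blank, but the metadata names the openwebtext dataset. For orientation, a sketch of inspecting that corpus with 🤗 Datasets; streaming avoids downloading the full dump (tens of GB on disk):

```python
from datasets import load_dataset

# "openwebtext" is the Hub dataset id from the card's metadata; it has a
# single "train" split and a single "text" column.
dataset = load_dataset("openwebtext", split="train", streaming=True)

sample = next(iter(dataset))
print(sample["text"][:200])
```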

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 0
- distributed_type: multi-GPU
- num_devices: 6
- gradient_accumulation_steps: 2
- total_train_batch_size: 96
- total_eval_batch_size: 192
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10000
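
As a sketch only, the list above maps onto 🤗 `TrainingArguments` (Transformers 4.35.2) roughly as follows. The output directory and the evaluation cadence are assumptions (the results table logs a validation loss every 50 steps); the 6-GPU launch itself would come from something like `torchrun --nproc_per_node=6`.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Mistral_Sparse_pretraining_80_percent_10000",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,   # x 6 GPUs x 2 accumulation = 96 effective
    per_device_eval_batch_size=32,   # x 6 GPUs = 192 effective
    gradient_accumulation_steps=2,
    seed=0,
    lr_scheduler_type="linear",
    max_steps=10_000,
    evaluation_strategy="steps",     # assumed from the eval cadence in the table
    eval_steps=50,
)
```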

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.7461 | 0.05 | 50 | 1.7009 |
| 1.4034 | 0.1 | 100 | 1.3910 |
| 1.2302 | 0.15 | 150 | 1.2330 |
| 1.1363 | 0.19 | 200 | 1.1354 |
| 1.0699 | 0.24 | 250 | 1.0723 |
| 1.0316 | 0.29 | 300 | 1.0284 |
| 1.0044 | 0.34 | 350 | 0.9943 |
| 0.9719 | 0.39 | 400 | 0.9668 |
| 0.9391 | 0.44 | 450 | 0.9430 |
| 0.9194 | 0.48 | 500 | 0.9249 |
| 0.9131 | 0.53 | 550 | 0.9092 |
| 0.877 | 0.58 | 600 | 0.8953 |
| 0.8757 | 0.63 | 650 | 0.8852 |
| 0.8644 | 0.68 | 700 | 0.8749 |
| 0.8625 | 0.73 | 750 | 0.8679 |
| 0.867 | 0.78 | 800 | 0.8594 |
| 0.852 | 0.82 | 850 | 0.8529 |
| 0.8482 | 0.87 | 900 | 0.8473 |
| 0.8372 | 0.92 | 950 | 0.8421 |
| 0.8391 | 0.97 | 1000 | 0.8366 |
| 0.8209 | 1.02 | 1050 | 0.8327 |
| 0.8172 | 1.07 | 1100 | 0.8275 |
| 0.8094 | 1.11 | 1150 | 0.8247 |
| 0.8107 | 1.16 | 1200 | 0.8210 |
| 0.8137 | 1.21 | 1250 | 0.8168 |
| 0.8122 | 1.26 | 1300 | 0.8143 |
| 0.8047 | 1.31 | 1350 | 0.8115 |
| 0.804 | 1.36 | 1400 | 0.8083 |
| 0.7955 | 1.41 | 1450 | 0.8062 |
| 0.7939 | 1.45 | 1500 | 0.8040 |
| 0.7835 | 1.5 | 1550 | 0.8019 |
| 0.7983 | 1.55 | 1600 | 0.8001 |
| 0.7953 | 1.6 | 1650 | 0.7975 |
| 0.7903 | 1.65 | 1700 | 0.7945 |
| 0.7864 | 1.7 | 1750 | 0.7938 |
| 0.7972 | 1.75 | 1800 | 0.7914 |
| 0.7855 | 1.79 | 1850 | 0.7905 |
| 0.7834 | 1.84 | 1900 | 0.7878 |
| 0.7812 | 1.89 | 1950 | 0.7854 |
| 0.7865 | 1.94 | 2000 | 0.7847 |
| 0.7875 | 1.99 | 2050 | 0.7837 |
| 0.7764 | 2.04 | 2100 | 0.7815 |
| 0.7676 | 2.08 | 2150 | 0.7807 |
| 0.7716 | 2.13 | 2200 | 0.7796 |
| 0.777 | 2.18 | 2250 | 0.7781 |
| 0.7706 | 2.23 | 2300 | 0.7769 |
| 0.7669 | 2.28 | 2350 | 0.7748 |
| 0.771 | 2.33 | 2400 | 0.7742 |
| 0.7501 | 2.38 | 2450 | 0.7728 |
| 0.7653 | 2.42 | 2500 | 0.7713 |
| 0.7715 | 2.47 | 2550 | 0.7699 |
| 0.7588 | 2.52 | 2600 | 0.7694 |
| 0.7665 | 2.57 | 2650 | 0.7676 |
| 0.7616 | 2.62 | 2700 | 0.7658 |
| 0.7597 | 2.67 | 2750 | 0.7654 |
| 0.756 | 2.71 | 2800 | 0.7644 |
| 0.7517 | 2.76 | 2850 | 0.7628 |
| 0.7561 | 2.81 | 2900 | 0.7628 |
| 0.7413 | 2.86 | 2950 | 0.7620 |
| 0.7545 | 2.91 | 3000 | 0.7603 |
| 0.7442 | 2.96 | 3050 | 0.7592 |
| 0.7454 | 3.01 | 3100 | 0.7589 |
| 0.7575 | 3.05 | 3150 | 0.7583 |
| 0.739 | 3.1 | 3200 | 0.7571 |
| 0.7446 | 3.15 | 3250 | 0.7558 |
| 0.7428 | 3.2 | 3300 | 0.7557 |
| 0.737 | 3.25 | 3350 | 0.7553 |
| 0.7512 | 3.3 | 3400 | 0.7536 |
| 0.7447 | 3.34 | 3450 | 0.7525 |
| 0.7417 | 3.39 | 3500 | 0.7525 |
| 0.7403 | 3.44 | 3550 | 0.7512 |
| 0.761 | 3.49 | 3600 | 0.7502 |
| 0.7475 | 3.54 | 3650 | 0.7498 |
| 0.7535 | 3.59 | 3700 | 0.7486 |
| 0.733 | 3.64 | 3750 | 0.7483 |
| 0.7347 | 3.68 | 3800 | 0.7470 |
| 0.7439 | 3.73 | 3850 | 0.7470 |
| 0.7417 | 3.78 | 3900 | 0.7460 |
| 0.7383 | 3.83 | 3950 | 0.7460 |
| 0.7316 | 3.88 | 4000 | 0.7450 |
| 0.7273 | 3.93 | 4050 | 0.7442 |
| 0.7376 | 3.97 | 4100 | 0.7440 |
| 0.73 | 4.02 | 4150 | 0.7424 |
| 0.732 | 4.07 | 4200 | 0.7429 |
| 0.7278 | 4.12 | 4250 | 0.7419 |
| 0.721 | 4.17 | 4300 | 0.7416 |
| 0.7309 | 4.22 | 4350 | 0.7410 |
| 0.7273 | 4.27 | 4400 | 0.7400 |
| 0.7297 | 4.31 | 4450 | 0.7395 |
| 0.7321 | 4.36 | 4500 | 0.7385 |
| 0.7348 | 4.41 | 4550 | 0.7381 |
| 0.7251 | 4.46 | 4600 | 0.7371 |
| 0.7175 | 4.51 | 4650 | 0.7372 |
| 0.7356 | 4.56 | 4700 | 0.7368 |
| 0.7306 | 4.6 | 4750 | 0.7363 |
| 0.7248 | 4.65 | 4800 | 0.7359 |
| 0.7266 | 4.7 | 4850 | 0.7343 |
| 0.7243 | 4.75 | 4900 | 0.7349 |
| 0.7256 | 4.8 | 4950 | 0.7338 |
| 0.7301 | 4.85 | 5000 | 0.7335 |
| 0.7266 | 4.9 | 5050 | 0.7327 |
| 0.7229 | 4.94 | 5100 | 0.7321 |
| 0.7355 | 4.99 | 5150 | 0.7315 |
| 0.7207 | 5.04 | 5200 | 0.7317 |
| 0.7157 | 5.09 | 5250 | 0.7314 |
| 0.7214 | 5.14 | 5300 | 0.7299 |
| 0.7104 | 5.19 | 5350 | 0.7304 |
| 0.7059 | 5.24 | 5400 | 0.7296 |
| 0.7181 | 5.28 | 5450 | 0.7295 |
| 0.7226 | 5.33 | 5500 | 0.7286 |
| 0.7077 | 5.38 | 5550 | 0.7282 |
| 0.7239 | 5.43 | 5600 | 0.7276 |
| 0.7159 | 5.48 | 5650 | 0.7277 |
| 0.7169 | 5.53 | 5700 | 0.7271 |
| 0.7101 | 5.57 | 5750 | 0.7269 |
| 0.7146 | 5.62 | 5800 | 0.7262 |
| 0.7191 | 5.67 | 5850 | 0.7265 |
| 0.7124 | 5.72 | 5900 | 0.7248 |
| 0.7085 | 5.77 | 5950 | 0.7238 |
| 0.7052 | 5.82 | 6000 | 0.7235 |
| 0.7222 | 5.87 | 6050 | 0.7222 |
| 0.7089 | 5.91 | 6100 | 0.7221 |
| 0.7088 | 5.96 | 6150 | 0.7222 |
| 0.7017 | 6.01 | 6200 | 0.7218 |
| 0.7079 | 6.06 | 6250 | 0.7218 |
| 0.7209 | 6.11 | 6300 | 0.7211 |
| 0.691 | 6.16 | 6350 | 0.7210 |
| 0.7035 | 6.2 | 6400 | 0.7203 |
| 0.7075 | 6.25 | 6450 | 0.7207 |
| 0.7036 | 6.3 | 6500 | 0.7200 |
| 0.7023 | 6.35 | 6550 | 0.7189 |
| 0.7201 | 6.4 | 6600 | 0.7192 |
| 0.7021 | 6.45 | 6650 | 0.7188 |
| 0.6971 | 6.5 | 6700 | 0.7174 |
| 0.7087 | 6.54 | 6750 | 0.7184 |
| 0.7044 | 6.59 | 6800 | 0.7176 |
| 0.6921 | 6.64 | 6850 | 0.7179 |
| 0.7079 | 6.69 | 6900 | 0.7166 |
| 0.6908 | 6.74 | 6950 | 0.7158 |
| 0.687 | 6.79 | 7000 | 0.7158 |
| 0.696 | 6.83 | 7050 | 0.7148 |
| 0.6954 | 6.88 | 7100 | 0.7152 |
| 0.7103 | 6.93 | 7150 | 0.7143 |
| 0.6999 | 6.98 | 7200 | 0.7140 |
| 0.699 | 7.03 | 7250 | 0.7138 |
| 0.6959 | 7.08 | 7300 | 0.7138 |
| 0.6871 | 7.13 | 7350 | 0.7122 |
| 0.6941 | 7.17 | 7400 | 0.7131 |
| 0.6931 | 7.22 | 7450 | 0.7132 |
| 0.707 | 7.27 | 7500 | 0.7110 |
| 0.6911 | 7.32 | 7550 | 0.7122 |
| 0.7036 | 7.37 | 7600 | 0.7113 |
| 0.7105 | 7.42 | 7650 | 0.7107 |
| 0.7035 | 7.46 | 7700 | 0.7108 |
| 0.6901 | 7.51 | 7750 | 0.7113 |
| 0.6944 | 7.56 | 7800 | 0.7096 |
| 0.6927 | 7.61 | 7850 | 0.7093 |
| 0.7052 | 7.66 | 7900 | 0.7090 |
| 0.7046 | 7.71 | 7950 | 0.7082 |
| 0.6949 | 7.76 | 8000 | 0.7082 |
| 0.6888 | 7.8 | 8050 | 0.7071 |
| 0.6916 | 7.85 | 8100 | 0.7071 |
| 0.6937 | 7.9 | 8150 | 0.7067 |
| 0.7077 | 7.95 | 8200 | 0.7066 |
| 0.6847 | 8.0 | 8250 | 0.7057 |
| 0.6908 | 8.05 | 8300 | 0.7056 |
| 0.6813 | 8.1 | 8350 | 0.7060 |
| 0.6756 | 8.14 | 8400 | 0.7055 |
| 0.7006 | 8.19 | 8450 | 0.7052 |
| 0.6842 | 8.24 | 8500 | 0.7035 |
| 0.6851 | 8.29 | 8550 | 0.7044 |
| 0.6944 | 8.34 | 8600 | 0.7042 |
| 0.6929 | 8.39 | 8650 | 0.7040 |
| 0.6924 | 8.43 | 8700 | 0.7037 |
| 0.6843 | 8.48 | 8750 | 0.7037 |
| 0.7005 | 8.53 | 8800 | 0.7028 |
| 0.6795 | 8.58 | 8850 | 0.7022 |
| 0.6946 | 8.63 | 8900 | 0.7019 |
| 0.6761 | 8.68 | 8950 | 0.7016 |
| 0.6817 | 8.73 | 9000 | 0.7012 |
| 0.6838 | 8.77 | 9050 | 0.7012 |
| 0.6877 | 8.82 | 9100 | 0.7006 |
| 0.6812 | 8.87 | 9150 | 0.7004 |
| 0.6966 | 8.92 | 9200 | 0.7005 |
| 0.6778 | 8.97 | 9250 | 0.6993 |
| 0.6844 | 9.02 | 9300 | 0.6991 |
| 0.6853 | 9.06 | 9350 | 0.7000 |
| 0.6839 | 9.11 | 9400 | 0.6998 |
| 0.6813 | 9.16 | 9450 | 0.6984 |
| 0.6903 | 9.21 | 9500 | 0.6985 |
| 0.6819 | 9.26 | 9550 | 0.6987 |
| 0.6749 | 9.31 | 9600 | 0.6980 |
| 0.6782 | 9.36 | 9650 | 0.6979 |
| 0.6805 | 9.4 | 9700 | 0.6975 |
| 0.6907 | 9.45 | 9750 | 0.6974 |
| 0.6854 | 9.5 | 9800 | 0.6967 |
| 0.6803 | 9.55 | 9850 | 0.6969 |
| 0.6854 | 9.6 | 9900 | 0.6964 |
| 0.6761 | 9.65 | 9950 | 0.6966 |
| 0.6939 | 9.69 | 10000 | 0.6959 |
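
Assuming the reported loss is the Trainer's mean token-level cross-entropy in nats (the default for causal language modeling), the final validation loss converts to perplexity with a one-line exponentiation:

```python
import math

final_eval_loss = 0.6872          # final validation loss from the table above
print(math.exp(final_eval_loss))  # ~1.99 perplexity
```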

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0