ahmedabdelwahed committed
Commit 1548700
1 Parent(s): 802123b
Files changed (3)
  1. README.md +48 -48
  2. adapter_model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->
  This model is a fine-tuned version of [ahmedabdelwahed/Mojiz-sft](https://huggingface.co/ahmedabdelwahed/Mojiz-sft) on the None dataset.
  It achieves the following results on the evaluation set:
  - Loss: 0.0000
- - Rewards/chosen: 19.0228
- - Rewards/rejected: -9.2707
+ - Rewards/chosen: 22.2209
+ - Rewards/rejected: -9.2802
  - Rewards/accuracies: 1.0
- - Rewards/margins: 28.2934
- - Logps/rejected: -89.5808
- - Logps/chosen: -288.2675
- - Logits/rejected: -11.5033
- - Logits/chosen: -12.5545
+ - Rewards/margins: 31.5011
+ - Logps/rejected: -89.5998
+ - Logps/chosen: -281.8713
+ - Logits/rejected: -11.8496
+ - Logits/chosen: -12.9621

  ## Model description

@@ -43,7 +43,7 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.0001
+ - learning_rate: 0.0003
  - train_batch_size: 4
  - eval_batch_size: 8
  - seed: 42
@@ -56,46 +56,46 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
  |:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
- | 0.0017 | 0.41 | 100 | 0.0000 | 9.9359 | -3.7597 | 1.0 | 13.6956 | -78.5589 | -306.4413 | -11.4127 | -12.4541 |
- | 0.0002 | 0.82 | 200 | 0.0000 | 14.2330 | -5.8158 | 1.0 | 20.0488 | -82.6710 | -297.8470 | -11.3000 | -12.2684 |
- | 0.0035 | 1.22 | 300 | 0.0000 | 14.9349 | -6.8685 | 1.0 | 21.8033 | -84.7764 | -296.4433 | -11.2636 | -12.2227 |
- | 0.0 | 1.63 | 400 | 0.0000 | 15.6103 | -8.0363 | 1.0 | 23.6466 | -87.1120 | -295.0924 | -11.2134 | -12.1369 |
- | 0.0 | 2.04 | 500 | 0.0000 | 15.9830 | -8.2642 | 1.0 | 24.2473 | -87.5679 | -294.3470 | -11.2229 | -12.1521 |
- | 0.0 | 2.45 | 600 | 0.0000 | 16.4577 | -8.2301 | 1.0 | 24.6877 | -87.4996 | -293.3977 | -11.2626 | -12.2109 |
- | 0.0 | 2.86 | 700 | 0.0000 | 16.2790 | -8.9139 | 1.0 | 25.1929 | -88.8673 | -293.7551 | -11.2326 | -12.1416 |
- | 0.0 | 3.27 | 800 | 0.0000 | 17.2718 | -7.9491 | 1.0 | 25.2209 | -86.9377 | -291.7695 | -11.3610 | -12.3472 |
- | 0.0 | 3.67 | 900 | 0.0000 | 17.2795 | -7.9729 | 1.0 | 25.2524 | -86.9852 | -291.7540 | -11.3608 | -12.3468 |
- | 0.0 | 4.08 | 1000 | 0.0000 | 17.3648 | -8.1290 | 1.0 | 25.4939 | -87.2975 | -291.5834 | -11.3658 | -12.3537 |
- | 0.0 | 4.49 | 1100 | 0.0000 | 17.2617 | -8.8784 | 1.0 | 26.1402 | -88.7963 | -291.7896 | -11.3591 | -12.3382 |
- | 0.0 | 4.9 | 1200 | 0.0000 | 17.2793 | -8.9149 | 1.0 | 26.1942 | -88.8693 | -291.7545 | -11.3592 | -12.3377 |
- | 0.0 | 5.31 | 1300 | 0.0000 | 17.4039 | -9.2069 | 1.0 | 26.6109 | -89.4533 | -291.5052 | -11.3747 | -12.3648 |
- | 0.0 | 5.71 | 1400 | 0.0000 | 17.4994 | -9.2845 | 1.0 | 26.7839 | -89.6084 | -291.3142 | -11.3786 | -12.3691 |
- | 0.0 | 6.12 | 1500 | 0.0000 | 17.5283 | -9.3118 | 1.0 | 26.8401 | -89.6630 | -291.2565 | -11.3767 | -12.3657 |
- | 0.0001 | 6.53 | 1600 | 0.0000 | 17.6023 | -9.2860 | 1.0 | 26.8883 | -89.6115 | -291.1085 | -11.3862 | -12.3799 |
- | 0.0 | 6.94 | 1700 | 0.0000 | 17.6405 | -9.2673 | 1.0 | 26.9078 | -89.5740 | -291.0320 | -11.3904 | -12.3869 |
- | 0.0 | 7.35 | 1800 | 0.0000 | 17.6910 | -9.2988 | 1.0 | 26.9898 | -89.6371 | -290.9311 | -11.3915 | -12.3888 |
- | 0.0 | 7.76 | 1900 | 0.0000 | 17.7322 | -9.3073 | 1.0 | 27.0394 | -89.6540 | -290.8487 | -11.3933 | -12.3913 |
- | 0.0 | 8.16 | 2000 | 0.0000 | 17.9389 | -9.1620 | 1.0 | 27.1009 | -89.3634 | -290.4353 | -11.4099 | -12.4207 |
- | 0.0 | 8.57 | 2100 | 0.0000 | 17.9458 | -9.1662 | 1.0 | 27.1120 | -89.3718 | -290.4215 | -11.4100 | -12.4207 |
- | 0.0 | 8.98 | 2200 | 0.0000 | 17.9626 | -9.2031 | 1.0 | 27.1656 | -89.4456 | -290.3880 | -11.4129 | -12.4222 |
- | 0.0 | 9.39 | 2300 | 0.0000 | 18.0639 | -9.2254 | 1.0 | 27.2893 | -89.4903 | -290.1852 | -11.4131 | -12.4245 |
- | 0.0 | 9.8 | 2400 | 0.0000 | 18.0774 | -9.2552 | 1.0 | 27.3326 | -89.5498 | -290.1582 | -11.4131 | -12.4240 |
- | 0.0 | 10.2 | 2500 | 0.0000 | 18.1246 | -9.2634 | 1.0 | 27.3880 | -89.5663 | -290.0639 | -11.4165 | -12.4280 |
- | 0.0 | 10.61 | 2600 | 0.0000 | 18.2553 | -9.3136 | 1.0 | 27.5689 | -89.6667 | -289.8025 | -11.4293 | -12.4475 |
- | 0.0 | 11.02 | 2700 | 0.0000 | 18.2686 | -9.3119 | 1.0 | 27.5805 | -89.6632 | -289.7758 | -11.4301 | -12.4488 |
- | 0.0 | 11.43 | 2800 | 0.0000 | 18.5334 | -9.2508 | 1.0 | 27.7843 | -89.5411 | -289.2462 | -11.4567 | -12.4847 |
- | 0.0 | 11.84 | 2900 | 0.0000 | 18.5812 | -9.2860 | 1.0 | 27.8673 | -89.6115 | -289.1505 | -11.4642 | -12.4938 |
- | 0.0 | 12.24 | 3000 | 0.0000 | 18.5923 | -9.2987 | 1.0 | 27.8910 | -89.6368 | -289.1283 | -11.4635 | -12.4928 |
- | 0.0 | 12.65 | 3100 | 0.0000 | 18.5948 | -9.2987 | 1.0 | 27.8935 | -89.6369 | -289.1234 | -11.4642 | -12.4939 |
- | 0.0 | 13.06 | 3200 | 0.0000 | 18.6175 | -9.3283 | 1.0 | 27.9458 | -89.6961 | -289.0780 | -11.4639 | -12.4934 |
- | 0.0 | 13.47 | 3300 | 0.0000 | 18.6269 | -9.3541 | 1.0 | 27.9810 | -89.7477 | -289.0593 | -11.4642 | -12.4950 |
- | 0.0 | 13.88 | 3400 | 0.0000 | 18.6234 | -9.3861 | 1.0 | 28.0096 | -89.8117 | -289.0661 | -11.4637 | -12.4939 |
- | 0.0 | 14.29 | 3500 | 0.0000 | 18.6239 | -9.3885 | 1.0 | 28.0123 | -89.8164 | -289.0653 | -11.4637 | -12.4939 |
- | 0.0 | 14.69 | 3600 | 0.0000 | 18.9584 | -9.2126 | 1.0 | 28.1710 | -89.4647 | -288.3962 | -11.4998 | -12.5483 |
- | 0.0 | 15.1 | 3700 | 0.0000 | 18.9572 | -9.2364 | 1.0 | 28.1936 | -89.5122 | -288.3987 | -11.4993 | -12.5473 |
- | 0.0 | 15.51 | 3800 | 0.0000 | 18.9851 | -9.2408 | 1.0 | 28.2259 | -89.5209 | -288.3428 | -11.5009 | -12.5508 |
- | 0.0 | 15.92 | 3900 | 0.0000 | 18.9883 | -9.2411 | 1.0 | 28.2293 | -89.5216 | -288.3365 | -11.5013 | -12.5515 |
- | 0.0 | 16.33 | 4000 | 0.0000 | 19.0228 | -9.2707 | 1.0 | 28.2934 | -89.5808 | -288.2675 | -11.5033 | -12.5545 |
+ | 0.0001 | 0.41 | 100 | 0.0000 | 13.9027 | -6.9708 | 1.0 | 20.8735 | -84.9811 | -298.5077 | -11.2670 | -12.1892 |
+ | 0.0 | 0.82 | 200 | 0.0000 | 17.3896 | -7.8690 | 1.0 | 25.2586 | -86.7774 | -291.5338 | -11.3083 | -12.2870 |
+ | 0.0005 | 1.22 | 300 | 0.0000 | 17.6416 | -8.8830 | 1.0 | 26.5246 | -88.8054 | -291.0298 | -11.3003 | -12.3102 |
+ | 0.0 | 1.63 | 400 | 0.0000 | 17.8706 | -10.2554 | 1.0 | 28.1260 | -91.5502 | -290.5718 | -11.2508 | -12.2176 |
+ | 0.0 | 2.04 | 500 | 0.0000 | 18.1057 | -10.4493 | 1.0 | 28.5550 | -91.9381 | -290.1017 | -11.2773 | -12.2732 |
+ | 0.0 | 2.45 | 600 | 0.0000 | 18.1832 | -10.3990 | 1.0 | 28.5822 | -91.8375 | -289.9467 | -11.2881 | -12.2895 |
+ | 0.0 | 2.86 | 700 | 0.0000 | 16.7929 | -12.9826 | 1.0 | 29.7756 | -97.0047 | -292.7272 | -11.2594 | -12.0447 |
+ | 0.0 | 3.27 | 800 | 0.0000 | 21.2291 | -7.5837 | 1.0 | 28.8129 | -86.2069 | -283.8548 | -11.7781 | -12.9026 |
+ | 0.0 | 3.67 | 900 | 0.0000 | 21.2500 | -7.6403 | 1.0 | 28.8902 | -86.3200 | -283.8131 | -11.7771 | -12.9005 |
+ | 0.0 | 4.08 | 1000 | 0.0000 | 21.3141 | -8.0568 | 1.0 | 29.3709 | -87.1530 | -283.6848 | -11.7649 | -12.8725 |
+ | 0.0 | 4.49 | 1100 | 0.0000 | 21.2289 | -8.8084 | 1.0 | 30.0373 | -88.6562 | -283.8553 | -11.7555 | -12.8438 |
+ | 0.0 | 4.9 | 1200 | 0.0000 | 21.2556 | -8.8423 | 1.0 | 30.0979 | -88.7241 | -283.8019 | -11.7538 | -12.8404 |
+ | 0.0 | 5.31 | 1300 | 0.0000 | 21.3255 | -9.0397 | 1.0 | 30.3653 | -89.1189 | -283.6620 | -11.7587 | -12.8441 |
+ | 0.0 | 5.71 | 1400 | 0.0000 | 21.3586 | -9.0546 | 1.0 | 30.4132 | -89.1486 | -283.5958 | -11.7606 | -12.8458 |
+ | 0.0 | 6.12 | 1500 | 0.0000 | 21.3779 | -9.0803 | 1.0 | 30.4582 | -89.2000 | -283.5573 | -11.7600 | -12.8442 |
+ | 0.0001 | 6.53 | 1600 | 0.0000 | 21.4632 | -8.9069 | 1.0 | 30.3700 | -88.8531 | -283.3867 | -11.7795 | -12.8770 |
+ | 0.0 | 6.94 | 1700 | 0.0000 | 21.4992 | -8.8450 | 1.0 | 30.3441 | -88.7293 | -283.3147 | -11.7894 | -12.8933 |
+ | 0.0 | 7.35 | 1800 | 0.0000 | 21.5109 | -8.8690 | 1.0 | 30.3799 | -88.7774 | -283.2912 | -11.7896 | -12.8931 |
+ | 0.0 | 7.76 | 1900 | 0.0000 | 21.5322 | -8.9067 | 1.0 | 30.4390 | -88.8529 | -283.2486 | -11.7905 | -12.8917 |
+ | 0.0 | 8.16 | 2000 | 0.0000 | 21.5638 | -8.9260 | 1.0 | 30.4898 | -88.8915 | -283.1855 | -11.7910 | -12.8912 |
+ | 0.0 | 8.57 | 2100 | 0.0000 | 21.5688 | -8.9323 | 1.0 | 30.5011 | -88.9041 | -283.1755 | -11.7910 | -12.8909 |
+ | 0.0 | 8.98 | 2200 | 0.0000 | 21.5834 | -8.9761 | 1.0 | 30.5596 | -88.9917 | -283.1462 | -11.7917 | -12.8894 |
+ | 0.0 | 9.39 | 2300 | 0.0000 | 21.7552 | -9.1179 | 1.0 | 30.8731 | -89.2752 | -282.8026 | -11.7852 | -12.8848 |
+ | 0.0 | 9.8 | 2400 | 0.0000 | 21.7651 | -9.1413 | 1.0 | 30.9064 | -89.3221 | -282.7829 | -11.7850 | -12.8839 |
+ | 0.0 | 10.2 | 2500 | 0.0000 | 21.7780 | -9.1572 | 1.0 | 30.9351 | -89.3538 | -282.7572 | -11.7861 | -12.8846 |
+ | 0.0 | 10.61 | 2600 | 0.0000 | 21.8023 | -9.1708 | 1.0 | 30.9731 | -89.3810 | -282.7084 | -11.7872 | -12.8865 |
+ | 0.0 | 11.02 | 2700 | 0.0000 | 21.8155 | -9.1793 | 1.0 | 30.9948 | -89.3980 | -282.6820 | -11.7879 | -12.8875 |
+ | 0.0 | 11.43 | 2800 | 0.0000 | 21.9589 | -9.1294 | 1.0 | 31.0883 | -89.2983 | -282.3953 | -11.8192 | -12.9290 |
+ | 0.0 | 11.84 | 2900 | 0.0000 | 21.9736 | -9.1440 | 1.0 | 31.1177 | -89.3275 | -282.3658 | -11.8210 | -12.9310 |
+ | 0.0 | 12.24 | 3000 | 0.0000 | 21.9922 | -9.1667 | 1.0 | 31.1589 | -89.3729 | -282.3287 | -11.8189 | -12.9279 |
+ | 0.0 | 12.65 | 3100 | 0.0000 | 21.9939 | -9.1697 | 1.0 | 31.1636 | -89.3788 | -282.3252 | -11.8189 | -12.9278 |
+ | 0.0 | 13.06 | 3200 | 0.0000 | 22.0187 | -9.2172 | 1.0 | 31.2360 | -89.4739 | -282.2755 | -11.8188 | -12.9264 |
+ | 0.0 | 13.47 | 3300 | 0.0000 | 22.0256 | -9.2325 | 1.0 | 31.2581 | -89.5044 | -282.2618 | -11.8194 | -12.9272 |
+ | 0.0 | 13.88 | 3400 | 0.0000 | 22.0341 | -9.2495 | 1.0 | 31.2836 | -89.5385 | -282.2448 | -11.8192 | -12.9267 |
+ | 0.0 | 14.29 | 3500 | 0.0000 | 22.0343 | -9.2506 | 1.0 | 31.2849 | -89.5407 | -282.2444 | -11.8192 | -12.9267 |
+ | 0.0 | 14.69 | 3600 | 0.0000 | 22.2061 | -9.2685 | 1.0 | 31.4746 | -89.5764 | -281.9009 | -11.8502 | -12.9633 |
+ | 0.0 | 15.1 | 3700 | 0.0000 | 22.2073 | -9.2691 | 1.0 | 31.4763 | -89.5776 | -281.8986 | -11.8503 | -12.9634 |
+ | 0.0 | 15.51 | 3800 | 0.0000 | 22.2107 | -9.2755 | 1.0 | 31.4862 | -89.5905 | -281.8916 | -11.8501 | -12.9628 |
+ | 0.0 | 15.92 | 3900 | 0.0000 | 22.2207 | -9.2795 | 1.0 | 31.5003 | -89.5985 | -281.8716 | -11.8493 | -12.9618 |
+ | 0.0 | 16.33 | 4000 | 0.0000 | 22.2209 | -9.2802 | 1.0 | 31.5011 | -89.5998 | -281.8713 | -11.8496 | -12.9621 |


  ### Framework versions
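The metric names in the table above (Rewards/chosen, Rewards/rejected, Logps, Logits) are the evaluation metrics that TRL's `DPOTrainer` logs, so this commit appears to re-run the DPO pass with a higher learning rate. Below is a minimal sketch of how the four hyperparameters shown in the card (learning_rate 0.0003, train_batch_size 4, eval_batch_size 8, seed 42) could map onto such a run; the dataset columns, the LoRA settings, the output directory, and the assumption of an encoder-decoder base are illustrative only and are not part of this commit, and keyword names vary across `trl` versions (this follows the older `DPOTrainer(..., tokenizer=...)` signature).

```python
# Sketch only: everything except the four listed hyperparameters is an assumption.
from datasets import Dataset
from peft import LoraConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "ahmedabdelwahed/Mojiz-sft"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)  # assumed encoder-decoder base

# Hypothetical preference pairs; the card does not name the dataset ("None dataset").
train_dataset = Dataset.from_dict(
    {
        "prompt": ["Summarize: ..."],
        "chosen": ["preferred summary"],
        "rejected": ["dispreferred summary"],
    }
)

args = TrainingArguments(
    output_dir="Mojiz-dpo",         # hypothetical output directory
    learning_rate=3e-4,             # learning_rate: 0.0003
    per_device_train_batch_size=4,  # train_batch_size: 4
    per_device_eval_batch_size=8,   # eval_batch_size: 8
    seed=42,                        # seed: 42
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with a PEFT adapter, the frozen base model serves as the reference
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    peft_config=LoraConfig(task_type="SEQ_2_SEQ_LM"),  # adapter config is an assumption
)
trainer.train()
```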
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:892aa4183dcfc8aca59538ea9f16571402f9e0a20109e934ab7b7632222edbdd
+ oid sha256:f513e6fc34d162dd916e0e7609a14c3ac1e5eb0cf18cfddee2ce1f8552584593
  size 7098016
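The adapter weights themselves are a ~7 MB LoRA checkpoint tracked via Git LFS. A minimal sketch of loading such an adapter with PEFT on top of the base model follows; the adapter repo id is hypothetical, and the base is assumed to be an encoder-decoder model (swap in `AutoModelForCausalLM` if it is decoder-only).

```python
# Sketch only: adapter_id is a hypothetical repo id, not confirmed by this commit.
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "ahmedabdelwahed/Mojiz-sft"
adapter_id = "ahmedabdelwahed/Mojiz-dpo"  # hypothetical id of the repo this commit belongs to

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, adapter_id)  # downloads adapter_model.safetensors

inputs = tokenizer("example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```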
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b975b7366ace72275ef20bba981f0b048ef1600a04cb162c8304e887e242ba65
+ oid sha256:efa77791e82ab6c2feaf4e534134b8b0528decaaa81cac2afa283f43e4fda446
  size 4219
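`training_args.bin` is the `TrainingArguments` object that the `transformers` Trainer pickles alongside checkpoints, so the updated hyperparameters can be inspected locally after downloading the file. A minimal sketch, assuming a recent PyTorch where `weights_only=False` must be passed explicitly for pickled objects (only load files from sources you trust):

```python
# Sketch only: training_args.bin is saved with torch.save, so torch.load can
# rebuild the TrainingArguments object (transformers must be installed to unpickle it).
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.seed)
```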