Tohrumi committed on
Commit
c9faf00
1 Parent(s): 7b952ba

#1: First attempt

Files changed (1)
  1. README.md +26 -2
README.md CHANGED
@@ -2,9 +2,10 @@
  license: apache-2.0
  library_name: peft
  tags:
- - unsloth
  - trl
  - sft
+ - unsloth
+ - translation
  - generated_from_trainer
  base_model: unsloth/mistral-7b-bnb-4bit
  model-index:
@@ -17,7 +18,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  # MistralAI_iwslt15_10000
 
- This model is a fine-tuned version of [unsloth/mistral-7b-bnb-4bit](https://huggingface.co/unsloth/mistral-7b-bnb-4bit) on an unknown dataset.
+ This model is a fine-tuned version of [unsloth/mistral-7b-bnb-4bit](https://huggingface.co/unsloth/mistral-7b-bnb-4bit) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.5245
 
  ## Model description
 
@@ -48,6 +51,27 @@ The following hyperparameters were used during training:
  - num_epochs: 5
  - mixed_precision_training: Native AMP
 
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 1.1687        | 0.32  | 100  | 1.0937          |
+ | 1.0905        | 0.64  | 200  | 1.0724          |
+ | 1.0711        | 0.96  | 300  | 1.0552          |
+ | 0.9258        | 1.28  | 400  | 1.0648          |
+ | 0.8979        | 1.6   | 500  | 1.0613          |
+ | 0.8893        | 1.92  | 600  | 1.0512          |
+ | 0.7253        | 2.24  | 700  | 1.1353          |
+ | 0.6713        | 2.56  | 800  | 1.1260          |
+ | 0.6701        | 2.88  | 900  | 1.1252          |
+ | 0.5284        | 3.2   | 1000 | 1.2891          |
+ | 0.446         | 3.52  | 1100 | 1.2803          |
+ | 0.4454        | 3.84  | 1200 | 1.3040          |
+ | 0.3663        | 4.16  | 1300 | 1.5203          |
+ | 0.282         | 4.48  | 1400 | 1.5198          |
+ | 0.2798        | 4.8   | 1500 | 1.5245          |
+
+
  ### Framework versions
 
  - PEFT 0.10.0
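As a quick sanity check on the training-results table this commit adds, here is a minimal sketch (values copied from the table; the variable names are my own) that locates the checkpoint with the lowest validation loss:

```python
# Rows copied from the "Training results" table in this commit:
# (step, training_loss, validation_loss)
results = [
    (100, 1.1687, 1.0937), (200, 1.0905, 1.0724), (300, 1.0711, 1.0552),
    (400, 0.9258, 1.0648), (500, 0.8979, 1.0613), (600, 0.8893, 1.0512),
    (700, 0.7253, 1.1353), (800, 0.6713, 1.1260), (900, 0.6701, 1.1252),
    (1000, 0.5284, 1.2891), (1100, 0.4460, 1.2803), (1200, 0.4454, 1.3040),
    (1300, 0.3663, 1.5203), (1400, 0.2820, 1.5198), (1500, 0.2798, 1.5245),
]

# Checkpoint with the best (lowest) validation loss.
best_step, _, best_val = min(results, key=lambda r: r[2])
print(best_step, best_val)  # 600 1.0512
```

Validation loss bottoms out at step 600 (epoch ~1.92) while training loss keeps falling, so the `Loss: 1.5245` reported in the card is the final checkpoint's loss, not the best one.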