---
base_model: unsloth/mistral-7b-v0.3
library_name: peft
license: apache-2.0
tags:
- unsloth
- generated_from_trainer
model-index:
- name: Mistral-7B-v0.3_pct_ortho
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Mistral-7B-v0.3_pct_ortho

This model is a fine-tuned version of [unsloth/mistral-7b-v0.3](https://huggingface.co/unsloth/mistral-7b-v0.3) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 7.0729
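
Since this checkpoint is a PEFT adapter rather than a full model, it is loaded on top of the base model. A minimal sketch, assuming the adapter is published on the Hub under the id `Mistral-7B-v0.3_pct_ortho` (that repo path is an assumption; substitute the actual one):

```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Hypothetical Hub id for this adapter -- replace with the real repository path.
adapter_id = "Mistral-7B-v0.3_pct_ortho"

# AutoPeftModelForCausalLM reads the base model id (unsloth/mistral-7b-v0.3)
# from the adapter config, loads it, and attaches the trained adapter weights.
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("unsloth/mistral-7b-v0.3")

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```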

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
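
A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a placeholder, and the actual training script (including any Unsloth-specific setup) is not published:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out_dir",            # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,   # 8 per device x 8 steps = total batch size 64
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```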

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.1243        | 0.0206 | 8    | 2.2855          |
| 9.1803        | 0.0413 | 16   | 12.8133         |
| 9.137         | 0.0619 | 24   | 8.4567          |
| 8.3645        | 0.0825 | 32   | 8.2606          |
| 8.5251        | 0.1032 | 40   | 7.7564          |
| 9.4881        | 0.1238 | 48   | 9.3070          |
| 7.7111        | 0.1444 | 56   | 7.7363          |
| 7.6126        | 0.1651 | 64   | 7.5948          |
| 7.6789        | 0.1857 | 72   | 7.6147          |
| 7.7404        | 0.2063 | 80   | 7.6322          |
| 7.7173        | 0.2270 | 88   | 7.6740          |
| 7.7113        | 0.2476 | 96   | 7.6742          |
| 7.6961        | 0.2682 | 104  | 7.6422          |
| 7.6729        | 0.2888 | 112  | 7.6076          |
| 7.7225        | 0.3095 | 120  | 7.7171          |
| 7.8259        | 0.3301 | 128  | 7.7724          |
| 7.6611        | 0.3507 | 136  | 7.5974          |
| 7.5696        | 0.3714 | 144  | 7.6032          |
| 7.6786        | 0.3920 | 152  | 7.6163          |
| 7.4746        | 0.4126 | 160  | 7.4268          |
| 7.4383        | 0.4333 | 168  | 7.4069          |
| 7.4469        | 0.4539 | 176  | 7.5225          |
| 7.6465        | 0.4745 | 184  | 7.4849          |
| 7.4025        | 0.4952 | 192  | 7.3099          |
| 7.3473        | 0.5158 | 200  | 7.2623          |
| 7.2821        | 0.5364 | 208  | 7.2484          |
| 7.389         | 0.5571 | 216  | 7.7177          |
| 7.2912        | 0.5777 | 224  | 7.1141          |
| 7.1847        | 0.5983 | 232  | 7.1145          |
| 7.2121        | 0.6190 | 240  | 7.1465          |
| 7.1216        | 0.6396 | 248  | 7.1479          |
| 7.2503        | 0.6602 | 256  | 7.1105          |
| 7.1416        | 0.6809 | 264  | 7.1730          |
| 7.2288        | 0.7015 | 272  | 7.1491          |
| 7.3502        | 0.7221 | 280  | 7.1991          |
| 7.2648        | 0.7427 | 288  | 7.1406          |
| 7.1647        | 0.7634 | 296  | 7.1226          |
| 7.1678        | 0.7840 | 304  | 7.0843          |
| 7.1879        | 0.8046 | 312  | 7.1045          |
| 7.2384        | 0.8253 | 320  | 7.1137          |
| 7.2301        | 0.8459 | 328  | 7.0949          |
| 7.2897        | 0.8665 | 336  | 7.1273          |
| 7.1483        | 0.8872 | 344  | 7.1084          |
| 7.1119        | 0.9078 | 352  | 7.0984          |
| 7.2202        | 0.9284 | 360  | 7.0766          |
| 7.1149        | 0.9491 | 368  | 7.0738          |
| 7.1986        | 0.9697 | 376  | 7.0749          |
| 7.155         | 0.9903 | 384  | 7.0729          |


### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
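
To reproduce this environment, the pinned versions above can be installed and verified. A small sanity check in Python; the pip commands in the comments are assumptions about standard PyPI and PyTorch cu121 wheels:

```python
# Assumed install commands:
#   pip install peft==0.12.0 transformers==4.44.0 datasets==2.20.0 tokenizers==0.19.1
#   pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu121
import datasets
import peft
import tokenizers
import torch
import transformers

# Verify the runtime matches the versions this adapter was trained with.
expected = {
    peft: "0.12.0",
    transformers: "4.44.0",
    datasets: "2.20.0",
    tokenizers: "0.19.1",
}
for module, version in expected.items():
    assert module.__version__ == version, f"{module.__name__}: {module.__version__} != {version}"
assert torch.__version__.startswith("2.4.0"), torch.__version__
```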