---
license: llama3
base_model: tsavage68/Summary_L3_1000steps_1e7rate_SFT2
tags:
- trl
- dpo
- generated_from_trainer
model-index:
- name: Summary_L3_1000steps_1e6rate_03beta_CSFTDPO
  results: []
---


# Summary_L3_1000steps_1e6rate_03beta_CSFTDPO

This model is a fine-tuned version of [tsavage68/Summary_L3_1000steps_1e7rate_SFT2](https://huggingface.co/tsavage68/Summary_L3_1000steps_1e7rate_SFT2) on an unknown dataset.
It achieves the following results on the evaluation set (the reward metrics are defined after the list):
- Loss: 0.5961
- Rewards/chosen: 0.0294
- Rewards/rejected: -2.5656
- Rewards/accuracies: 0.1400
- Rewards/margins: 2.5950
- Logps/rejected: -23.8158
- Logps/chosen: -9.2849
- Logits/rejected: -1.1435
- Logits/chosen: -1.1436
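
For context, the DPO rewards reported above are β-scaled log-probability ratios between the trained policy and the frozen SFT reference model, with β = 0.3 inferred from the "03beta" in the model name:

$$
r(x, y) = \beta \left( \log \pi_\theta(y \mid x) - \log \pi_{\mathrm{ref}}(y \mid x) \right),
\qquad
\mathcal{L}_{\mathrm{DPO}} = -\log \sigma\big( r(x, y_w) - r(x, y_l) \big)
$$

`Rewards/margins` is `Rewards/chosen` minus `Rewards/rejected` (0.0294 − (−2.5656) ≈ 2.5950 above), and `Rewards/accuracies` is the fraction of evaluation pairs for which the chosen response receives the higher reward.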

## Model description

More information needed

## Intended uses & limitations

More information needed
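
In the absence of documented usage guidance, here is a minimal inference sketch. It assumes the checkpoint is public under the repository name above and loads as a causal LM; the prompt format is hypothetical, since the training prompts are not documented in this card.

```python
# Minimal inference sketch. Assumes the repo id below is public and the
# checkpoint loads as a causal LM; the prompt format is a guess, since
# the training prompts are not documented in this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tsavage68/Summary_L3_1000steps_1e6rate_03beta_CSFTDPO"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to(device)

prompt = "Summarize the following text:\n\n<your document here>"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```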

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged sketch mapping them onto a TRL setup follows the list):
- learning_rate: 1e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 1000
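
Assuming training used TRL's `DPOTrainer` (as the `trl` and `dpo` tags suggest), the hyperparameters above map onto roughly the following configuration. The dataset, the TRL version, and the actual script are not recorded in this card, so `your/preference-dataset` is a placeholder and `beta=0.3` is inferred from the "03beta" in the model name; treat this as a sketch, not the authors' training code.

```python
# Hedged sketch of a TRL DPO setup matching the hyperparameters above.
# Assumptions: TRL >= 0.9 (so DPOConfig exists), a preference dataset with
# "prompt"/"chosen"/"rejected" columns, and beta=0.3 taken from the model
# name; the authors' actual training script is not published.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "tsavage68/Summary_L3_1000steps_1e7rate_SFT2"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# The training data is not recorded; this repo id is a placeholder.
dataset = load_dataset("your/preference-dataset")

args = DPOConfig(
    output_dir="Summary_L3_1000steps_1e6rate_03beta_CSFTDPO",
    learning_rate=1e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=4,  # total train batch size: 4
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=1000,
    eval_strategy="steps",
    eval_steps=50,                  # matches the 50-step rows in the results table
    seed=42,
    beta=0.3,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with None, TRL clones `model` as the frozen reference
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,  # called `processing_class` in newer TRL releases
)
trainer.train()
```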

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:------:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.5553        | 0.2004 | 50   | 0.5962          | 0.0778         | -1.2696          | 0.1400             | 1.3473          | -19.4956       | -9.1236      | -1.1038         | -1.1053       |
| 0.6585        | 0.4008 | 100  | 0.5962          | 0.0854         | -1.4439          | 0.1400             | 1.5292          | -20.0766       | -9.0982      | -1.1078         | -1.1092       |
| 0.6238        | 0.6012 | 150  | 0.5961          | 0.0687         | -2.1556          | 0.1400             | 2.2243          | -22.4490       | -9.1538      | -1.1298         | -1.1306       |
| 0.6065        | 0.8016 | 200  | 0.5961          | 0.0322         | -2.5726          | 0.1400             | 2.6048          | -23.8390       | -9.2754      | -1.1437         | -1.1438       |
| 0.6238        | 1.0020 | 250  | 0.5961          | 0.0294         | -2.5678          | 0.1400             | 2.5971          | -23.8230       | -9.2849      | -1.1438         | -1.1440       |
| 0.6238        | 1.2024 | 300  | 0.5961          | 0.0279         | -2.5674          | 0.1400             | 2.5953          | -23.8219       | -9.2899      | -1.1439         | -1.1440       |
| 0.6238        | 1.4028 | 350  | 0.5961          | 0.0304         | -2.5648          | 0.1400             | 2.5952          | -23.8131       | -9.2814      | -1.1438         | -1.1439       |
| 0.5718        | 1.6032 | 400  | 0.5961          | 0.0304         | -2.5648          | 0.1400             | 2.5952          | -23.8131       | -9.2814      | -1.1438         | -1.1439       |
| 0.5892        | 1.8036 | 450  | 0.5961          | 0.0338         | -2.5715          | 0.1400             | 2.6052          | -23.8353       | -9.2702      | -1.1435         | -1.1436       |
| 0.5718        | 2.0040 | 500  | 0.5961          | 0.0279         | -2.5720          | 0.1400             | 2.5999          | -23.8372       | -9.2897      | -1.1434         | -1.1435       |
| 0.5718        | 2.2044 | 550  | 0.5961          | 0.0266         | -2.5750          | 0.1400             | 2.6016          | -23.8472       | -9.2942      | -1.1438         | -1.1440       |
| 0.5545        | 2.4048 | 600  | 0.5961          | 0.0271         | -2.5761          | 0.1400             | 2.6032          | -23.8507       | -9.2925      | -1.1438         | -1.1440       |
| 0.5199        | 2.6052 | 650  | 0.5961          | 0.0271         | -2.5761          | 0.1400             | 2.6032          | -23.8507       | -9.2925      | -1.1438         | -1.1440       |
| 0.6238        | 2.8056 | 700  | 0.5961          | 0.0270         | -2.5764          | 0.1400             | 2.6035          | -23.8519       | -9.2928      | -1.1438         | -1.1440       |
| 0.6065        | 3.0060 | 750  | 0.5961          | 0.0315         | -2.5674          | 0.1400             | 2.5989          | -23.8216       | -9.2777      | -1.1434         | -1.1436       |
| 0.6412        | 3.2064 | 800  | 0.5961          | 0.0276         | -2.5662          | 0.1400             | 2.5937          | -23.8176       | -9.2909      | -1.1434         | -1.1436       |
| 0.6585        | 3.4068 | 850  | 0.5961          | 0.0277         | -2.5666          | 0.1400             | 2.5943          | -23.8191       | -9.2903      | -1.1434         | -1.1436       |
| 0.6238        | 3.6072 | 900  | 0.5961          | 0.0281         | -2.5670          | 0.1400             | 2.5952          | -23.8205       | -9.2891      | -1.1434         | -1.1436       |
| 0.5372        | 3.8076 | 950  | 0.5961          | 0.0310         | -2.5656          | 0.1400             | 2.5966          | -23.8159       | -9.2795      | -1.1435         | -1.1436       |
| 0.6238        | 4.0080 | 1000 | 0.5961          | 0.0294         | -2.5656          | 0.1400             | 2.5950          | -23.8158       | -9.2849      | -1.1435         | -1.1436       |


### Framework versions

- Transformers 4.41.2
- PyTorch 2.0.0+cu117
- Datasets 2.20.0
- Tokenizers 0.19.1