---
library_name: transformers
tags: []
---

# Model Card for Model ID

- Summary Length PPO experiment #7
- No KL-divergence penalty term in the loss (see the sketch below)
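
Since the defining change in this experiment is dropping the KL penalty, here is a minimal sketch of a clipped PPO policy objective with no KL term added. This is illustrative only; the clip range (0.2) and function names are assumptions, not values reported by this card.

```python
import torch


def ppo_policy_loss(logprobs, old_logprobs, advantages, clip_range=0.2):
    """Clipped PPO surrogate objective with no KL-divergence penalty.

    clip_range=0.2 is an assumed default, not a value from this experiment.
    """
    ratio = torch.exp(logprobs - old_logprobs)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_range, 1.0 + clip_range) * advantages
    # Maximize the surrogate, i.e. minimize its negative; no `kl_coef * kl` term here.
    return -torch.min(unclipped, clipped).mean()
```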

## Model Details

- Dataset size: 16384
- Epochs: 1
- Batch size: 16 per GPU * 4 GPUs (effective batch size 64, as worked out below)
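
The numbers above imply the following training scale (simple arithmetic, not additional reported results):

```python
per_gpu_batch = 16
num_gpus = 4
effective_batch = per_gpu_batch * num_gpus        # 64 sequences per optimizer step
dataset_size = 16384
epochs = 1
updates = dataset_size * epochs // effective_batch  # 256 optimizer updates
```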

Optimizer args: PyTorch AdamW defaults, except
- LR = 1e-5
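
A minimal sketch of that optimizer setup, assuming a causal-LM policy loaded via `transformers` (the base checkpoint is a placeholder; the card does not name it):

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder checkpoint; the card does not state the base model.
policy = AutoModelForCausalLM.from_pretrained("<base-model-id>")

# PyTorch AdamW with default arguments, overriding only the learning rate.
optimizer = torch.optim.AdamW(policy.parameters(), lr=1e-5)
```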