---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- Anthropic/hh-rlhf
---

[Pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) supervised fine-tuned with the TRLx library on the helpful subset of the [Anthropic hh-rlhf dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf) for 1 epoch.

Checkpoints are also uploaded. 

Fully reproducible fine-tuning code is available on [GitHub](https://github.com/lauraaisling/trlx-pythia/tree/main).

Training run details are logged to [wandb](https://wandb.ai/lauraomahony999/pythia-sft/runs/ydaj2ks8).

See [Pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) for model details [(paper)](https://arxiv.org/abs/2304.01373).
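
For quick use, below is a minimal loading-and-generation sketch with 🤗 Transformers (Pythia is GPTNeoX-based, so the standard `AutoModelForCausalLM` path applies). The `Human:`/`Assistant:` turn format mirrors the hh-rlhf training data and is an assumption about prompt style, not a documented template:

```python
# Minimal usage sketch; assumes transformers and torch are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "lomahony/pythia-1.4b-helpful-sft"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# hh-rlhf-style dialogue prompt (the exact template is an assumption).
prompt = "Human: How do I brew a good cup of tea?\n\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT-NeoX tokenizers define no pad token
)
# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```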

Zero-shot benchmark results, produced with EleutherAI's [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) using these settings:

`hf (pretrained=lomahony/pythia-1.4b-helpful-sft), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: 16`
|    Tasks     |Version|Filter|n-shot|    Metric     | Value |   |Stderr|
|--------------|------:|------|-----:|---------------|------:|---|------|
|arc_challenge |      1|none  |     0|acc            | 0.2679|±  |0.0129|
|              |       |none  |     0|acc_norm       | 0.2978|±  |0.0134|
|arc_easy      |      1|none  |     0|acc            | 0.6120|±  |0.0100|
|              |       |none  |     0|acc_norm       | 0.5282|±  |0.0102|
|boolq         |      2|none  |     0|acc            | 0.6260|±  |0.0085|
|hellaswag     |      1|none  |     0|acc            | 0.4097|±  |0.0049|
|              |       |none  |     0|acc_norm       | 0.5212|±  |0.0050|
|lambada_openai|      1|none  |     0|perplexity     | 6.4836|±  |0.1838|
|              |       |none  |     0|acc            | 0.5789|±  |0.0069|
|openbookqa    |      1|none  |     0|acc            | 0.2120|±  |0.0183|
|              |       |none  |     0|acc_norm       | 0.3340|±  |0.0211|
|piqa          |      1|none  |     0|acc            | 0.7100|±  |0.0106|
|              |       |none  |     0|acc_norm       | 0.7144|±  |0.0105|
|sciq          |      1|none  |     0|acc            | 0.8540|±  |0.0112|
|              |       |none  |     0|acc_norm       | 0.7830|±  |0.0130|
|wikitext      |      2|none  |     0|word_perplexity|15.8394|±  |N/A   |
|              |       |none  |     0|byte_perplexity| 1.6763|±  |N/A   |
|              |       |none  |     0|bits_per_byte  | 0.7453|±  |N/A   |
|winogrande    |      1|none  |     0|acc            | 0.5872|±  |0.0138|

Five-shot results, same harness and settings except `num_fewshot: 5`:

`hf (pretrained=lomahony/pythia-1.4b-helpful-sft), gen_kwargs: (None), limit: None, num_fewshot: 5, batch_size: 16`
|    Tasks     |Version|Filter|n-shot|    Metric     | Value |   |Stderr|
|--------------|------:|------|-----:|---------------|------:|---|------|
|arc_challenge |      1|none  |     5|acc            | 0.2892|±  |0.0133|
|              |       |none  |     5|acc_norm       | 0.3097|±  |0.0135|
|arc_easy      |      1|none  |     5|acc            | 0.6444|±  |0.0098|
|              |       |none  |     5|acc_norm       | 0.6309|±  |0.0099|
|boolq         |      2|none  |     5|acc            | 0.6333|±  |0.0084|
|hellaswag     |      1|none  |     5|acc            | 0.4065|±  |0.0049|
|              |       |none  |     5|acc_norm       | 0.5215|±  |0.0050|
|lambada_openai|      1|none  |     5|perplexity     | 9.7040|±  |0.2887|
|              |       |none  |     5|acc            | 0.4951|±  |0.0070|
|openbookqa    |      1|none  |     5|acc            | 0.2220|±  |0.0186|
|              |       |none  |     5|acc_norm       | 0.3100|±  |0.0207|
|piqa          |      1|none  |     5|acc            | 0.7029|±  |0.0107|
|              |       |none  |     5|acc_norm       | 0.7127|±  |0.0106|
|sciq          |      1|none  |     5|acc            | 0.9170|±  |0.0087|
|              |       |none  |     5|acc_norm       | 0.9160|±  |0.0088|
|wikitext      |      2|none  |     5|word_perplexity|15.8394|±  |N/A   |
|              |       |none  |     5|byte_perplexity| 1.6763|±  |N/A   |
|              |       |none  |     5|bits_per_byte  | 0.7453|±  |N/A   |
|winogrande    |      1|none  |     5|acc            | 0.5699|±  |0.0139|
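
To re-run these evaluations, here is a sketch using the harness's Python entry point. It assumes a recent lm-eval (v0.4+, whose `simple_evaluate` API and table format match the output above); the exact harness version used originally is not stated:

```python
# Evaluation sketch; assumes `pip install lm-eval` (lm-evaluation-harness v0.4+).
import lm_eval
from lm_eval.utils import make_table

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=lomahony/pythia-1.4b-helpful-sft",
    tasks=[
        "arc_challenge", "arc_easy", "boolq", "hellaswag", "lambada_openai",
        "openbookqa", "piqa", "sciq", "wikitext", "winogrande",
    ],
    num_fewshot=0,  # change to 5 for the five-shot table
    batch_size=16,
)
print(make_table(results))  # renders a markdown table like the ones above
```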