---
language:
  - en
tags:
  - pytorch
  - causal-lm
  - pythia
license: apache-2.0
datasets:
  - Anthropic/hh-rlhf
---

Pythia-6.9b supervised fine-tuned on the Anthropic hh-rlhf dataset for 1 epoch (the SFT model), prior to DPO training (paper) on the same dataset for 1 epoch.
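As a rough illustration, the model can be loaded like any other Pythia-based causal LM with the `transformers` library. This is a minimal sketch: the repository id and the Human/Assistant prompt format below are assumptions, not taken from this card.

```python
# Minimal sketch of loading and prompting the SFT model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lomahony/pythia-6.9b-hh-sft"  # hypothetical repo id; substitute the actual one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# hh-rlhf-style dialogue prompt (assumed format)
prompt = "Human: How do I bake bread at home?\n\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```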

See the wandb log for training details.

Benchmark evaluations were performed using the lm-evaluation-harness.
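For reference, a sketch of how such evaluations can be run with a recent (>= 0.4) lm-evaluation-harness release; the repository id and task list here are illustrative assumptions, not the exact setup used for this card.

```python
# Sketch: benchmark evaluation via the lm-evaluation-harness Python API.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=lomahony/pythia-6.9b-hh-sft,dtype=float16",  # hypothetical repo id
    tasks=["arc_easy", "hellaswag"],  # illustrative task choice
    batch_size=8,
)
print(results["results"])
```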

See Pythia-6.9b for details of the original base model (paper).