---
language:
  - en
tags:
  - pytorch
  - causal-lm
  - pythia
license: apache-2.0
datasets:
  - Dahoas/synthetic-instruct-gptj-pairwise
---

This model was created by finetuning EleutherAI/pythia-1.4b-deduped on the Dahoas/synthetic-instruct-gptj-pairwise dataset for two epochs.

You can try a demo of the model hosted on Lambda Cloud.

Training took 2 hours on 8x A100 80GB GPUs. We set batch_size_per_gpu to 8 (for a global batch size of 64) and the learning rate to 0.00002, with linear decay to zero at the last training step.
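The learning-rate schedule above can be sketched as follows. This is a minimal illustration, assuming the rate falls linearly from the peak value of 2e-5 at step 0 to exactly zero at the final step; the actual training code may implement the schedule differently (e.g. with a warmup phase).

```python
def linear_decay_lr(step: int, total_steps: int, peak_lr: float = 2e-5) -> float:
    """Learning rate at `step` (0-indexed), decaying linearly to zero
    at the last step, as described in the training setup above."""
    return peak_lr * (1 - step / (total_steps - 1))
```

For example, halfway through training the rate is half the peak value, and at the last step it reaches zero.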

The Weights & Biases record of the training can be found here.