---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- Dahoas/synthetic-instruct-gptj-pairwise
---
|
|
|
This model was created by finetuning `EleutherAI/pythia-1.4b-deduped` on the `Dahoas/synthetic-instruct-gptj-pairwise` dataset for two epochs.
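The finetuned checkpoint can be loaded with the `transformers` library. The sketch below is a hypothetical usage example: the `Question:`/`Answer:` prompt template and the `model_path` default are assumptions, not part of this card, so adjust them to match your checkpoint and finetuning format.

```python
def format_prompt(question: str) -> str:
    # Hypothetical "Question:/Answer:" template; adjust to whatever
    # format was used during finetuning.
    return f"Question: {question}\n\nAnswer:"


def generate_answer(question: str, model_path: str = "./ft-pythia-1.4b") -> str:
    # model_path is a placeholder for the finetuned checkpoint location.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_path)
    model = AutoModelForCausalLM.from_pretrained(model_path)

    prompt = format_prompt(question)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    text = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # Return only the completion that follows the prompt.
    return text[len(prompt):].strip()
```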
|
|
|
You can try a [demo](https://cloud.lambdalabs.com/demos/ml/qa-14b-2000) of the model hosted on [Lambda Cloud](https://lambdalabs.com/service/gpu-cloud).
|
|
|
Training took 2 hours on 8x A100 (80 GB) GPUs. We set `batch_size_per_gpu` to `8` (a global batch size of 64) and the learning rate to `0.00002`, with linear decay to zero at the last training step.
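The linear-decay schedule described above can be sketched in plain Python (in practice a library scheduler such as `transformers.get_linear_schedule_with_warmup` with zero warmup steps would typically be used):

```python
def linear_decay_lr(step: int, total_steps: int, base_lr: float = 2e-5) -> float:
    """Learning rate that decays linearly from base_lr to zero at the last step."""
    return base_lr * (1.0 - step / total_steps)

# The rate starts at the full 2e-5, falls to half at the midpoint,
# and reaches exactly zero at the final step.
```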
|
|
|
The Weights and Biases record of the training can be found [here](https://wandb.ai/chuanli11/ft-synthetic-instruct-gptj-pairwise-pythia1.4b?workspace=user-chuanli11).
|
|
|