---
license: apache-2.0
inference: false
---
**NOTE: This "delta model" cannot be used directly.**
Users have to apply it on top of the original LLaMA weights.
See https://github.com/lm-sys/FastChat#vicuna-weights for instructions.
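The gist of applying the delta is element-wise addition of the delta parameters to the base LLaMA-7B parameters (Vicuna-style additive deltas). A minimal sketch, assuming both checkpoints load with `transformers` and using placeholder paths; the FastChat instructions linked above describe the supported command:

```python
# Sketch only: reconstruct the finetuned weights by adding the delta to the
# base LLaMA-7B checkpoint. Paths are placeholders; follow the FastChat
# instructions linked above for the supported workflow.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "path/to/llama-7b", torch_dtype=torch.float16
)
delta = AutoModelForCausalLM.from_pretrained(
    "path/to/scifi-fantasy-author-delta", torch_dtype=torch.float16
)

delta_state = delta.state_dict()
for name, param in base.state_dict().items():
    # The delta stores (finetuned - base) weights, so element-wise addition
    # recovers the finetuned parameters.
    param.data += delta_state[name]

base.save_pretrained("path/to/scifi-fantasy-author")
AutoTokenizer.from_pretrained("path/to/scifi-fantasy-author-delta").save_pretrained(
    "path/to/scifi-fantasy-author"
)
```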
# scifi-fantasy-author Model Card
`scifi-fantasy-author` is a LLaMA-7B model fine-tuned to generate narrative fiction,
particularly in the science fiction and fantasy genres.
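
Once the delta has been applied, the merged checkpoint loads like any other LLaMA model. A minimal generation sketch; the path, prompt, and sampling settings are illustrative, not a recommended configuration:

```python
# Sketch only: sample a continuation from the merged checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "path/to/scifi-fantasy-author", torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("path/to/scifi-fantasy-author")

prompt = "The colony ship had drifted dark for three centuries when the beacon woke."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs, max_new_tokens=256, do_sample=True, temperature=0.9, top_p=0.95
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```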
The following hyperparameters were used:

|Batch Size|Epochs|Context length|Learning rate|Scheduler|Weight decay|Warmup ratio|
|---------:|-----:|-------------:|------------:|--------:|-----------:|-----------:|
| 128 | 3 | 8192 | 2e-5 | Cosine | 0.0 | 0.03 |

The model reached a final training loss of 2.008; training took approximately 8 hours on 8x A100 80 GB GPUs.
The specific training script can be found [here](https://github.com/hooloovoo-ai/cyoa-backend/blob/master/backend/scripts/train.py).
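
For orientation, the hyperparameters above map roughly onto Hugging Face `TrainingArguments` as sketched below. The per-device batch size and gradient-accumulation split is an assumption (only the effective batch size of 128 is given), and the 8192-token context length is handled on the data/model side rather than here; see the linked script for the exact configuration.

```python
# Rough mapping of the table above onto TrainingArguments; not the exact
# configuration from the linked training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scifi-fantasy-author",
    num_train_epochs=3,
    per_device_train_batch_size=4,   # assumed split: 8 GPUs x 4 x 4 accumulation = 128
    gradient_accumulation_steps=4,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    weight_decay=0.0,
    warmup_ratio=0.03,
    bf16=True,                       # assumption: mixed precision on A100s
)
```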