---
datasets:
- bluuwhale/nsfwstory
- bluuwhale/nsfwstory2
base_model:
- athirdpath/LD-Zephyria-37b_HEALED
---
athirdpath/LD-Zephyria-37b_HEALED, continually pretrained on one epoch of:
- bluuwhale/nsfwstory (615 MB)
- bluuwhale/nsfwstory2 (853 MB)
- Private Iambe dataset (1.6 GB)
This model has NOT had instruction tuning after pretraining; use the -INSTRUCT version instead for chat/instruct. This model will still follow instructions (see below, Mistral format), but it is best suited to creative writing.
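A minimal loading and generation sketch, assuming the standard Hugging Face transformers API; the repo id below is the base model's path as a stand-in for this model's own repo, the [INST] wrapping follows the Mistral format mentioned above, and the sampling settings are illustrative rather than recommended values.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- swap in this model's own Hugging Face path.
model_id = "athirdpath/LD-Zephyria-37b_HEALED"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Mistral-style instruct prompt; for pure creative writing, a plain
# continuation prompt without [INST] tags also works.
prompt = "[INST] Write the opening paragraph of a gothic short story. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Illustrative sampling settings, not the author's recommendations.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.9)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```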
He named himself; I've just taken to letting my models do that 0-shot on deterministic sampling, unless the name is already taken.
Thanks to @bluuwhale for contributing nearly half of the dataset.