About
ape fiction: The "ape" stands for Algorithmic Pattern Emulation. This model is finetuned for generating fiction.
Finetuned from Mistral Nemo Base on the fullfictions-85kmax dataset.
Training used roughly the longest-context entries in the dataset that I could fit without out-of-memory (OOM) errors.
I used unsloth to do the finetuning on a rented GPU (an H100) for 2 epochs. Thanks to everybody who made this possible: the unsloth brothers, the folks behind KoboldCPP, the team behind Mistral Nemo, the organizers and volunteers at Gutenberg.org... there are probably more. Thanks, everybody.