This is an ExLlamaV2 quantized model in 4bpw of TheDrummer/Moistral-11B-v1 using the default calibration dataset.
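For context, 4.0 bits per weight puts the quantized weights of an 11B-parameter model at roughly 5.5 GB, before KV cache and activation overhead. A quick back-of-the-envelope check (the parameter count is rounded; exact on-disk size also depends on ExLlamaV2's per-layer bitrate mixing):

```python
# Rough size estimate for an ~11B-parameter model quantized to 4.0 bits per weight.
# Counts weights only; KV cache and activations add further VRAM on top.
params = 11_000_000_000
bits_per_weight = 4.0

weight_bytes = params * bits_per_weight / 8
print(f"{weight_bytes / 1e9:.1f} GB")  # -> 5.5 GB
```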
Original Model card:
Moistral 11B v1 💦
An eRP model with a rich vocabulary of moist. Finetuned by yours truly.
GGUF: https://huggingface.co/TheDrummer/Moistral-11B-v1-GGUF/tree/main
Description
Moistral 11B is an aggressive finetune of the legendary Fimbulvetr v2 model.
If you haven't heard of that model, then your poor-GPU ass is missing out! It is an amazing substitute for those heavyweight models, and it won't disappoint. (That's coming from someone who daily drives a Venus / Goliath 120B)
...and if you're looking for something more moist, then you've come to the right place.
You can expect the same sort of performance here! It's the Honda Civic of eRP models!
Pimp my ride
Turn any old story into a Moistral masterpiece. In fact, we recommend that you leave out any hints of moist in your story to let Moistral cook.
Drive it like it's hot
When in Instruct Mode, Moistral will treat you like a director of your own fantasy ride. Be a controlling navigator or a passenger princess, it won't matter - you're in for a good ride!
Moistral Sample
ASSISTANT: When the doors to the throne room finally opened, she saw him there - Dik, the sorcerer prince sitting on his throne. He looked majestic, just as Princess Lilia expected. In a robe embroidered with silver yarn and covered with magical runes. His courtiers stood on either side of him, all looking at the princess with interest. Princess Lilia felt her determination leaving her, but pulled herself together and once again reminded herself that she should not be afraid. She approached the throne intending to show the dignity of all Iliac by personal example.
USER:
She bows slightly
ASSISTANT:
Princess Lilia bowed slightly in front of Dik, the sorcerer prince. Her eyes fell to the floor out of respect, but soon rose again. She would show herself to be a dignified woman!
"I greet you, your highness," Princess Lilia said. "I am here on behalf of my father, King Edmund Ruhr, to serve as a hostage and envoy."
Dik nodded, but didn't say anything.
It's got the mileage
Moistral has been trained with many long form texts, a nice chunk of which are 8K in length. It is capable of going far and long without passing it back to you. This is not your typical chibi RP model.
Parameters
If Moistral starts to underperform and spit out broken tokens, I've noticed that lowering the sampler parameters makes it coherent again. Here's what worked for me:
temperature: 0.66
repetition_penalty: 1.1
top_p: 0.64
rp_slp: 1
I encourage you to play around with the parameters yourself to see what works for you.
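For anyone curious what those knobs actually do, here's a minimal pure-Python sketch of temperature scaling plus top-p (nucleus) filtering. This is an illustration of the sampling math only, not the model's actual inference code; repetition penalty and `rp_slp` are left out for brevity. Lower temperature sharpens the distribution, and a lower top-p trims the long tail of unlikely tokens, which is why dropping both tends to restore coherence.

```python
import math
import random

def sample_next_token(logits, temperature=0.66, top_p=0.64, rng=random):
    # Temperature scaling: dividing logits by a value < 1 sharpens the
    # distribution, making high-probability tokens even more likely.
    scaled = [l / temperature for l in logits]

    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Nucleus (top-p) filtering: keep the smallest set of tokens whose
    # cumulative probability reaches top_p, discarding the unlikely tail.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    # Sample from the renormalized nucleus.
    mass = sum(probs[i] for i in kept)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```

With a strongly peaked distribution and these settings, the nucleus collapses to a single token, so sampling becomes deterministic.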
What's next?
Moistral 11B is my first attempt at finetuning a capable model (Sorry, CreamPhi-2). It's coherent and creative enough to let me understand the impact of my dataset & training. Playing around with it has already given me a better idea on the Do's and Don'ts. I will most likely make a version 2 with some improvements:
- Remove any glitchy texts that come from my dataset. Sanitize sanitize sanitize!
- Balance out the themes in the dataset for a richer, more diverse experience.
- Consider extending the context window.
- Add a 'monologue' dataset that forces the model to keep talking without much interaction from the user.
- Maybe, just maybe, expose it to dry stuff to let Moistral cook.
I have to acknowledge that I'm standing on the shoulders of giants. Thank you Sao for sharing your finetune config along with tips on getting started. Thanks to everyone in the Finetuning channel for entertaining my every question.
license: other
license_name: freeuse
license_link: LICENSE
Model tree for mpasila/Moistral-11B-v1-exl2-4bpw
Base model: BeaverLegacy/Moistral-11B-v1