General-Stories-Mistral-7B
This model is based on my dataset General-Stories-Collection, which contains approximately 1.3 million synthetic stories written for a general audience.
After an extensive training period spanning more than 15 days, this model has been honed to deliver captivating narratives with broad appeal. Leveraging this vast synthetic dataset tailored for a diverse readership, the model has developed a deep understanding of narrative structure and themes. What sets it apart is not just its ability to generate stories, but its capacity to evoke emotion, spark imagination, and forge connections with its audience.
I am excited to introduce this powerful tool, ready to spark imagination and entertain readers worldwide with its versatile storytelling capabilities.
As we embark on this exciting journey of AI storytelling, I invite you to explore the endless possibilities my model has to offer. Whether you're a writer seeking inspiration, a reader in search of a captivating tale, or a creative mind eager to push the boundaries of storytelling, my model is here to inspire, entertain, and enrich your literary experience.
Kindly note that this is the QLoRA version.
GGUF & Exllama
GGUF: Link
Exllama v2: Link
Special thanks to Bartowski for quantizing this model.
Training
The entire dataset was trained on 4 x A100 80GB GPUs. Training for 2 epochs took more than 15 days. The Axolotl codebase was used for training. The base model is Mistral-7B-v0.1.
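For readers who want a concrete picture of what a QLoRA setup looks like, here is a minimal sketch using Hugging Face transformers and peft. It is illustrative only: the actual run used Axolotl, and the LoRA rank, alpha, dropout, and target modules below are assumed values, not the ones from the real training config.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Load the base model in 4-bit (NF4) precision, as QLoRA requires.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Attach LoRA adapters. These hyperparameters are assumptions for
# illustration; the real values live in the Axolotl config.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```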
Example Prompt:
This model uses the ChatML prompt format.
```
<|im_start|>system
You are a Helpful Assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
You can modify the above prompt as per your requirements.
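As a quick start, here is a minimal inference sketch with Hugging Face transformers that builds the ChatML prompt shown above. The repo id, story prompt, and sampling settings are placeholders to adapt, not values taken from this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "General-Stories-Mistral-7B"  # placeholder: replace with the full Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Assemble the ChatML prompt shown above.
prompt = "Write a short story about a lighthouse keeper who befriends a seagull."
chatml = (
    "<|im_start|>system\n"
    "You are a Helpful Assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    f"{prompt}<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(chatml, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

The sampling settings here are just a starting point; raise `max_new_tokens` for longer stories or adjust `temperature` for more or less varied output.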
I want to say a special thanks to the open-source community for helping and guiding me to better understand AI and model development.
Thank you for your love & support.
Example Output
Example 1
Example 2
Example 3
Example 4