---
license: apache-2.0
---
# NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
The **Mistral-7B-Instruct-v0.2-Neural-Story** model, developed by NeuralNovel and funded by Techmind, is a language model fine-tuned from Mistral-7B-Instruct-v0.2. It is designed to generate instructive and narrative text, with a specific focus on storytelling. This fine-tune has been tailored to provide detailed and creative responses in the context of narrative prompts.
### Data-set
The model was fine-tuned on the Neural-Story-v1 dataset.
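If the dataset is published on the Hugging Face Hub under the author's namespace (an assumption; adjust the ID if it differs), it can be inspected with the `datasets` library:

```python
# Sketch: loading the fine-tuning dataset for inspection.
# The hub ID "NeuralNovel/Neural-Story-v1" is an assumption based on this card.
from datasets import load_dataset

dataset = load_dataset("NeuralNovel/Neural-Story-v1", split="train")
print(dataset)     # column names and row count
print(dataset[0])  # first training example
```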
### Summary
This model can be used directly for generating creative and narrative text, making it well suited to creative-writing prompts and storytelling.
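A minimal generation sketch using the Hugging Face `transformers` library follows; the dtype, device, and sampling settings are illustrative assumptions, not part of this card:

```python
# Minimal sketch: story generation with transformers.
# Model ID is taken from this card; dtype/device settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Mistral-Instruct models use a chat template with [INST] ... [/INST] tags.
messages = [
    {"role": "user",
     "content": "Write the opening paragraph of a mystery set in a lighthouse."}
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```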
#### Out-of-Scope Use
The model may not perform well in scenarios unrelated to instructive and narrative text generation. Misuse or applications outside its designed scope may result in suboptimal outcomes.
### Bias, Risks, and Limitations
The model may exhibit biases or limitations inherent in the training data. It is essential to consider these factors when deploying the model to avoid unintended consequences.
While the Neural-Story-v1 dataset serves as an excellent starting point for testing language models, users are advised to exercise caution, as it may carry some inherent genre or writing-style bias.
### Hardware and Training
Trained using an NVIDIA Tesla T40 24 GB GPU with the following hyperparameters:
```
n_epochs = 3
n_checkpoints = 3
batch_size = 12
learning_rate = 1e-5
```
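The card does not specify the training framework. As one hedged illustration, these hyperparameters could map onto `transformers.TrainingArguments` roughly as follows; `n_checkpoints` is approximated with `save_total_limit`, and the mixed-precision flag is an assumption:

```python
# Illustrative mapping of the listed hyperparameters onto TrainingArguments.
# This is a sketch, not the authors' actual training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral-7b-neural-story",
    num_train_epochs=3,              # n_epochs = 3
    per_device_train_batch_size=12,  # batch_size = 12
    learning_rate=1e-5,              # learning_rate = 1e-5
    save_total_limit=3,              # keep ~3 checkpoints (n_checkpoints = 3)
    fp16=True,                       # assumption: mixed precision on the listed GPU
)
```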