Model Card for Mistral-7B pre-trained

Model Description

This model is Mistral-7B further pre-trained on a collection of books.

  • Language(s) (NLP): English
  • Finetuned from model: Mistral-7B
  • Dataset used for continued pre-training: fiction books from the public domain Project Gutenberg corpus

This model was further pre-trained on a collection of public domain books from Project Gutenberg. It is intended to be fine-tuned on task-specific data related to narrative content.
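As a starting point for such fine-tuning, the checkpoint can be loaded like any causal language model with the `transformers` library. A minimal sketch follows; the repository id shown is a placeholder (the base `mistralai/Mistral-7B-v0.1` repo), since this card does not state where these weights are hosted.

```python
# Minimal loading-and-generation sketch. NOTE: the model id below is a
# placeholder for the base model; substitute this card's repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # placeholder, not this card's repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample a short narrative continuation to sanity-check the weights.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

From here, task-specific fine-tuning would typically replace the generation step with a `Trainer` (or PEFT/LoRA) training loop over the narrative dataset.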

  • Model size: 7.24B params
  • Tensor type: F32
  • Weights format: Safetensors