Quantization made by Richard Erkhov.
Ministral-3b-instruct - GGUF
- Model creator: https://huggingface.co/ministral/
- Original model: https://huggingface.co/ministral/Ministral-3b-instruct/
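The GGUF files in this repo can be run with llama.cpp-based tooling. Below is a minimal sketch using llama-cpp-python; the file name `Ministral-3b-instruct.Q4_K_M.gguf` and the context size are assumptions for illustration, so substitute the quantization variant you actually downloaded.

```python
# Minimal sketch: run a GGUF quantization of Ministral-3b-instruct with llama-cpp-python.
# The model_path is an assumed example file name; use the quant file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="Ministral-3b-instruct.Q4_K_M.gguf",  # assumed local file name
    n_ctx=2048,  # context window; adjust as needed
)

output = llm(
    "Explain what quantization does to a language model.",
    max_tokens=128,
    temperature=1.0,
    top_p=0.95,
    top_k=40,
    repeat_penalty=1.2,
)
print(output["choices"][0]["text"])
```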
Original model description:
library_name: transformers
inference:
  parameters:
    temperature: 1
    top_p: 0.95
    top_k: 40
    repetition_penalty: 1.2
license: apache-2.0
language:
- en
pipeline_tag: text-generation
Model Description
Ministral is a series of language models built with the same architecture as the well-known Mistral model, but at a smaller size.
- Model type: A 3B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- Language(s) (NLP): Primarily English
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-v0.1
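For the original, non-quantized model, a minimal sketch with the transformers pipeline is shown below, using the sampling parameters declared in the metadata above; the prompt and token budget are illustrative assumptions.

```python
# Minimal sketch: run the original ministral/Ministral-3b-instruct model with transformers,
# using the sampling parameters from the model card metadata (temperature, top_p, top_k,
# repetition_penalty). Prompt and max_new_tokens are illustrative choices.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="ministral/Ministral-3b-instruct",
)

result = generator(
    "Write a short note on why smaller language models are useful.",
    max_new_tokens=128,
    do_sample=True,
    temperature=1.0,
    top_p=0.95,
    top_k=40,
    repetition_penalty=1.2,
)
print(result[0]["generated_text"])
```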