
Mgpt: A Fine-tuned Mixtral Model

Mgpt is a fine-tuned version of the Mixtral model, adapted for general natural language processing tasks. It builds on a large pretrained language model to generate coherent, high-quality text for a wide range of applications.

Overview

Mgpt is built upon Mixtral, a decoder-only Transformer language model in the same family as the GPT (Generative Pre-trained Transformer) architecture. The base model was pretrained on a diverse range of text data, and Mgpt fine-tunes it for specific downstream tasks using transfer learning.
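
Usage

The following is a minimal sketch of loading Mgpt for text generation through the Hugging Face transformers library. The repository id "your-org/mgpt" is a placeholder (the actual repo id is not stated in this card), and the BF16 load matches the tensor type listed under Model Details below.

```python
# Minimal sketch: load Mgpt and generate text with Hugging Face transformers.
# The repository id below is a placeholder; substitute this model's actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/mgpt"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type in Model Details
    device_map="auto",           # place weights on available GPU(s)
)

prompt = "Explain transfer learning in one sentence:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that device_map="auto" requires the accelerate package; drop that argument to load the model on CPU instead.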

Model Details

Model size: 7.24B parameters
Tensor type: BF16
Weight format: Safetensors
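
To sanity-check the figures above, a short sketch (assuming the model was loaded as in the Usage example):

```python
# Sketch: verify the parameter count and tensor dtype reported in Model Details.
# Assumes `model` was loaded as shown in the Usage example above.
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e9:.2f}B parameters")  # expected: ~7.24B
print(next(model.parameters()).dtype)         # expected: torch.bfloat16
```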