
Baby Med LLM


Spanish Domain Expert LLMs

This project is an ongoing initiative at Alt aimed at enhancing the Spanish-language capabilities of base LLMs. By incorporating diverse and rich Spanish corpora, our objective is to improve the model's understanding and generation of domain-specific language.

This effort is part of a broader initiative to adapt and refine large language models (LLMs) for more accurate and nuanced performance across a variety of linguistic contexts, including but not limited to medicine, law, finance, agriculture, construction, and scientific research. By integrating specialized vocabularies and knowledge from these fields into our fine-tuning process, with a specific focus on Spanish corpora, we aim to significantly enhance the model's proficiency. This approach ensures that the Mistral 7B model not only excels at general language understanding and generation but also develops deep expertise in specific domains, enabling more effective, natural, and contextually relevant interactions in Spanish, tailored to the needs of professionals across these sectors.

  • Developed by: altbrainblock
  • License: apache-2.0
  • Finetuned from model: unsloth/mistral-7b-bnb-4bit

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
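As a rough illustration of the kind of data preparation such a supervised fine-tune involves, the sketch below formats Spanish domain instruction/response pairs into a single training text field, as is typical for TRL's SFT workflow. The template and field names are assumptions for illustration only; the card does not document the actual prompt format or training script used.

```python
# Hypothetical sketch: formatting Spanish instruction data for SFT.
# The Alpaca-style template below is an assumption, not this model's
# documented format. In a real run, the formatted strings would feed a
# trainer such as TRL's SFTTrainer on an Unsloth-loaded 4-bit base model.

PROMPT_TEMPLATE = """### Instrucción:
{instruction}

### Respuesta:
{response}"""


def format_example(instruction: str, response: str) -> str:
    """Render one instruction/response pair as a single SFT text field."""
    return PROMPT_TEMPLATE.format(instruction=instruction, response=response)


if __name__ == "__main__":
    text = format_example(
        "Explica brevemente qué es la hipertensión arterial.",
        "Es la elevación sostenida de la presión arterial por encima de "
        "los valores considerados normales.",
    )
    print(text)
```

A formatting function like this would typically be mapped over a dataset (e.g. with `datasets.Dataset.map`) before training, so each row carries one complete prompt-plus-answer string.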

Model details

  • Model: altbrainblock/tiny_med_es
  • Format: GGUF (4-bit, 8-bit, and 16-bit quantizations available)
  • Model size: 7.24B params
  • Architecture: llama
  • Downloads last month: 7