Barcenas 4b

Based on google/gemma-3-4b-it and trained with mlabonne/OpenThoughts-79k-filtered data.

This multimodal (image and text) model was trained on quality data in math, code, science, and puzzles.

The goal of this model is to be a powerful LLM capable of solving many problems while remaining small enough to run locally on most computers.

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽

Model size: 4.3B parameters (Safetensors, FP16)