Barcenas 27b

Based on SillyTilly/google-gemma-2-27b-it and fine-tuned on the pinzhenchen/alpaca-cleaned-es dataset for Spanish.

The goal of this model is to offer a relatively large model optimized for Spanish, performing at the level of the first versions of GPT-4.

I am proud of this model: it is the biggest and most powerful one I have built, and without a doubt the result of my short time in the AI world so far.

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽
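
Below is a minimal inference sketch using the 🤗 Transformers library. It assumes the model exposes the standard causal-LM interface and chat template inherited from its Gemma 2 base, and loads the weights in FP16 to match the tensor type listed below; verify the details against the repository files.

```python
# Minimal usage sketch (assumes the standard Gemma 2 causal-LM API and chat template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Danielbrdz/Barcenas-27b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are stored in FP16
    device_map="auto",          # requires the accelerate package
)

# Instruct models derived from Gemma 2 expect a chat template for prompting.
messages = [{"role": "user", "content": "¿Cuál es la capital de México?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```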

Format: Safetensors
Model size: 27.2B params
Tensor type: FP16

Model tree for Danielbrdz/Barcenas-27b
Quantizations: 2 models