Barcenas-Mistral-7b is a fine-tune of teknium/CollectiveCognition-v1-Mistral-7B.

It was trained on Spanish data from lmsys/lmsys-chat-1m, provided via Danielbrdz/Barcenas-lmsys-Dataset.
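
A minimal usage sketch with the Hugging Face transformers library. The Spanish prompt and generation settings below are illustrative assumptions, not part of the original card; the prompt format is inherited from the base model and may need adjusting.

```python
# Minimal sketch: load the model and generate a Spanish response.
# Prompt wording and sampling parameters are assumptions for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Danielbrdz/Barcenas-Mistral-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "¿Cuál es la capital de Nuevo León?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```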

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽
