A newer version of this model is available: ussipan/Llama-3.2-SipanGPT-v0.5-GGUF

SipánGPT 0.1 Llama 3.2 1B GGUF

  • Pre-trained model for answering questions about the Universidad Señor de Sipán in Lambayeque, Peru.

Testing the model

[Screenshot: testing the model]

  • Because it was trained on only a small dataset (400 conversations), the model hallucinates frequently.

Uploaded model

  • Developed by: jhangmez
  • License: apache-2.0
  • Finetuned from model: unsloth/Meta-Llama-3.2-1B-Instruct

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
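The exact training script is not part of this card. The sketch below shows what an Unsloth + TRL supervised fine-tune of the base model could look like; the dataset contents, LoRA settings, and hyperparameters are illustrative assumptions, not the values used for SipánGPT. The `to_chat_text` helper renders a question/answer pair in the Llama 3 chat template.

```python
# Illustrative sketch of an Unsloth + TRL fine-tuning setup.
# Dataset contents, LoRA settings, and hyperparameters are hypothetical.

def to_chat_text(question: str, answer: str) -> str:
    """Render one Q/A pair in the Llama 3 chat template used for SFT."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{question}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
        f"{answer}<|eot_id|>"
    )

def main() -> None:
    # Heavy imports are deferred so the formatting helper can be used
    # without a GPU environment.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments
    from datasets import Dataset

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="unsloth/Meta-Llama-3.2-1B-Instruct",  # base model from this card
        max_seq_length=2048,
        load_in_4bit=True,
    )
    # Attach LoRA adapters (standard Unsloth example settings, assumed here).
    model = FastLanguageModel.get_peft_model(
        model,
        r=16,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                        "gate_proj", "up_proj", "down_proj"],
    )

    # Hypothetical conversations; the real dataset had about 400 of them.
    pairs = [("¿Dónde queda la USS?", "En Lambayeque, Perú.")]
    ds = Dataset.from_dict({"text": [to_chat_text(q, a) for q, a in pairs]})

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=ds,
        dataset_text_field="text",
        args=TrainingArguments(
            output_dir="sipangpt-sft",
            num_train_epochs=1,
            per_device_train_batch_size=2,
        ),
    )
    trainer.train()

if __name__ == "__main__":
    main()
```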



Made with ❤️ by Jhan Gómez P.
GGUF details

  • Model size: 1.24B params
  • Architecture: llama
  • Available quantizations: 4-bit, 5-bit, 8-bit, 16-bit
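The GGUF files can be run locally, for example with llama-cpp-python. A minimal sketch follows; the quantization filename is a hypothetical placeholder (check the repository's file list for the real names), and `build_messages` just wraps a question in the chat-completion message format.

```python
# Local inference sketch for the GGUF quantizations listed above.
# The quant filename below is hypothetical; use a real one from the repo.

def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-completion message format."""
    return [{"role": "user", "content": question}]

def main() -> None:
    # Deferred imports: require huggingface_hub and llama-cpp-python.
    from huggingface_hub import hf_hub_download
    from llama_cpp import Llama

    # Download one quantized file (filename is an assumption, not from the card).
    model_path = hf_hub_download(
        repo_id="ussipan/SipanGPT-0.1-Llama-3.2-1B-GGUF",
        filename="sipangpt-0.1-llama-3.2-1b.Q4_K_M.gguf",  # hypothetical name
    )

    llm = Llama(model_path=model_path, n_ctx=2048)
    out = llm.create_chat_completion(
        messages=build_messages("¿Qué carreras ofrece la Universidad Señor de Sipán?"),
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])

if __name__ == "__main__":
    main()
```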

