Lince-zero GGML version quantized to 4 bits (q4_0)

Lince-zero model card

Usage

#!/bin/bash

# License: Apache-2.0

# This script sets up and runs the lince model
# Ensure you have wget, git, cmake, and make installed before running this script.

# Set model and directory variables
MODEL_URL="https://huggingface.co/clibrain/lince-zero-f16-ggml-q4_0/resolve/main/lince-zero-f16-ggml-q4_0.bin"
MODEL_DIR="$HOME/lince-model"
MODEL_BIN="${MODEL_DIR}/lince-zero-f16-ggml-q4_0.bin"
GGML_REPO="https://github.com/mrm8488/ggml-lince.git"
BUILD_DIR="build"

# Create model directory
mkdir -p "$MODEL_DIR"

# Download the model
wget "$MODEL_URL" -O "$MODEL_BIN" || { echo "Failed to download model"; exit 1; }
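
# Optional sanity check (not part of the original card): make sure the download
# actually produced a non-empty file before building anything.
[ -s "$MODEL_BIN" ] || { echo "Downloaded model file is empty or missing"; exit 1; }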

# Clone the repository
git clone "$GGML_REPO" || { echo "Failed to clone repository"; exit 1; }

# Navigate into the repository directory
cd ggml-lince || { echo "Directory does not exist"; exit 1; }

# Create build directory and navigate into it
mkdir -p "$BUILD_DIR" && cd "$BUILD_DIR" || { echo "Failed to create or navigate to build directory"; exit 1; }

# Build project
cmake .. || { echo "cmake failed"; exit 1; }
cd examples/falcon || { echo "Directory does not exist"; exit 1; }
make || { echo "make failed"; exit 1; }
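
# The build is expected to place the compiled binary under ${BUILD_DIR}/bin
# (assumption based on ggml's default CMake layout; adjust the path in the
# next step if your build puts it elsewhere).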

# Navigate back to run the model
cd ../../bin || { echo "Directory does not exist"; exit 1; }

# Show the help message
./lince -h || { echo "./lince failed"; exit 1; }

# Run the model
./lince -m "$MODEL_BIN" \
-p "A continuación hay una instrucción que describe una tarea, junto con una entrada que \
proporciona más contexto. Escriba una respuesta que complete adecuadamente la solicitud.\n\n\
### Instrucción:\nDame una lista de sitios a visitar en España\n\n### Respuesta:" \
-n 64
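
To try several instructions without retyping the template, a small wrapper can keep the prompt in one place. The sketch below is not part of the original card: it reuses only the -m, -p, and -n options shown above, assumes it is run from the same ${BUILD_DIR}/bin directory, and the run_lince name is purely illustrative.

# Hypothetical helper: wraps the prompt template used above around a single instruction.
run_lince() {
  local instruction="$1"
  local prompt="A continuación hay una instrucción que describe una tarea, junto con una entrada que \
proporciona más contexto. Escriba una respuesta que complete adecuadamente la solicitud.\n\n\
### Instrucción:\n${instruction}\n\n### Respuesta:"
  ./lince -m "$MODEL_BIN" -p "$prompt" -n 64
}

# Example:
# run_lince "Dame una lista de sitios a visitar en España"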