---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# Midnight-Miqu-103B-v1.0 - EXL2 2.4bpw

This is a 2.4bpw EXL2 quant of [sophosympatheia/Midnight-Miqu-103B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-103B-v1.0).

Details about the model and the merge can be found on the model page linked above.

I have not extensively tested this quant beyond verifying that the model loads and that I could chat with it.

## Quant Details

This is the script used for quantization.

```bash
#!/bin/bash

# Activate the conda environment
source ~/miniconda3/etc/profile.d/conda.sh
conda activate exllamav2

# Define variables
MODEL_DIR="models/sophosympatheia_Midnight-Miqu-103B-v1.0"
OUTPUT_DIR="exl2_midnight103b"
MEASUREMENT_FILE="measurements/midnight103b.json"

BIT_PRECISION=2.4
CONVERTED_FOLDER="models/Midnight-Miqu-103B_exl2_2.4bpw"

# Create the working and output directories
mkdir -p "$OUTPUT_DIR"
mkdir -p "$CONVERTED_FOLDER"

# Pass 1: measure per-layer quantization error and save the results
python convert.py -i "$MODEL_DIR" -o "$OUTPUT_DIR" -nr -om "$MEASUREMENT_FILE"

# Pass 2: quantize to the target bitrate using the saved measurements
python convert.py -i "$MODEL_DIR" -o "$OUTPUT_DIR" -nr -m "$MEASUREMENT_FILE" -b "$BIT_PRECISION" -cf "$CONVERTED_FOLDER"
```
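As a rough sanity check on the result, the weight files of a ~103B-parameter model quantized to 2.4 bits per weight should come out somewhere near 29 GiB. The sketch below is a back-of-the-envelope estimate only: the 103e9 parameter count and a uniform 2.4 bits/weight are my assumptions, and real EXL2 files differ because the quantizer mixes precisions across layers.

```python
# Back-of-the-envelope size estimate for an EXL2 quant.
# ASSUMPTIONS: ~103e9 parameters, uniform 2.4 bits/weight;
# actual EXL2 output varies since per-layer bitrates differ.
params = 103e9          # approximate parameter count
bits_per_weight = 2.4   # target bitrate passed to convert.py via -b

size_bytes = params * bits_per_weight / 8
size_gib = size_bytes / 2**30

print(f"~{size_gib:.1f} GiB of weights")  # prints: ~28.8 GiB of weights
```

This counts weights only; KV cache and runtime overhead come on top, which is why a quant of this size still needs more VRAM than the file size suggests.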