Abliterated version of Mistral-Small-Instruct-2409, created with the refusal-direction ablation code from https://github.com/andyrdt/refusal_direction.
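The refusal-direction method estimates a single direction in activation space that separates refusing from complying, then removes that direction from the model. As a rough illustration only (this is a hypothetical sketch, not the code from the linked repository; the function names, layer choice, and tensor shapes are assumptions):

```python
import torch

def refusal_direction(harmful_acts: torch.Tensor, harmless_acts: torch.Tensor) -> torch.Tensor:
    # Both inputs: (num_prompts, hidden_size) activations collected at one layer.
    # The refusal direction is taken as the normalized difference of the means.
    direction = harmful_acts.mean(dim=0) - harmless_acts.mean(dim=0)
    return direction / direction.norm()

def ablate_direction(weight: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    # Project the refusal direction out of a weight matrix whose outputs live in
    # hidden space: W <- (I - d d^T) W, so the model can no longer write along d.
    d = direction.to(weight.dtype)
    return weight - torch.outer(d, d @ weight)
```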
Quantized with the following exllamav2 parameters. The first convert.py run is the measurement pass, which writes measurement.json; the second run uses that file to produce the 4.5 bpw quant:

```
python3 convert.py \
    -i ~/exllamav2/zetasepic_Mistral-Small-Instruct-2409-abliterated \
    -o ~/exllamav2/exl2/ \
    -om ~/exllamav2/ex2m/measurement.json \
    -l 16000 \
    -ml 16000 \
    -c erotiquant.parquet \
    -r 400 \
    -mr 50
```
```
python3 convert.py \
    -i /root/exllamav2/zetasepic_Mistral-Small-Instruct-2409-abliterated \
    -o /root/temp/exl2/ \
    -nr \
    -m /root/exllamav2/measurement.json \
    -mr 50 \
    -cf /root/4.5bpw/ \
    -c erotiquant.parquet \
    -l 16000 \
    -r 400 \
    -b 4.5
```
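The resulting 4.5 bpw EXL2 weights can be loaded directly with exllamav2. A minimal sketch, assuming a recent exllamav2 release and the output directory from the command above (the sampler settings and prompt are illustrative):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "/root/4.5bpw"  # -cf output directory from the quantization command
config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate cache while layers load
model.load_autosplit(cache)                # split the model across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

# Mistral instruct prompt format
prompt = "[INST] Write a short scene set on a rainy night. [/INST]"
print(generator.generate_simple(prompt, settings, 200))
```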
Base model: mistralai/Mistral-Small-Instruct-2409