This is a ggml-quantized version of Replit-v2-CodeInstruct-3B, quantized to 4-bit (q4_1). To run inference you can use ggml directly or ctransformers.
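A minimal sketch of loading these weights with ctransformers. The prompt template is an assumption (a common Alpaca-style instruction format); check the fine-tune's documentation for the exact format it was trained with.

```python
def build_prompt(instruction: str) -> str:
    # Alpaca-style instruction template -- an assumption, adjust if the
    # fine-tune used a different format.
    return f"### Instruction:\n{instruction}\n### Response:\n"

if __name__ == "__main__":
    # Requires `pip install ctransformers`; downloads the quantized
    # weights from the Hub on first use.
    from ctransformers import AutoModelForCausalLM

    llm = AutoModelForCausalLM.from_pretrained(
        "abacaj/Replit-v2-CodeInstruct-3B-ggml",
        model_type="replit",
    )
    prompt = build_prompt("Write a Python function that reverses a string.")
    print(llm(prompt, max_new_tokens=128, temperature=0.2))
```

Sampling parameters such as `temperature` and `max_new_tokens` are illustrative defaults, not values recommended by the model author.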
