---
license: other
---

This is a ggml-quantized version of Replit-v2-CodeInstruct-3B, quantized to 4-bit (q4_1). To run inference you can use ggml directly or ctransformers.

Memory usage of the model: ~2 GB.

Repo to run the model using ctransformers: https://github.com/abacaj/replit-3B-inference
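A minimal sketch of loading the quantized model with ctransformers. The model filename and the instruct prompt template below are assumptions, not taken from this card; check the linked inference repo for the exact values the author uses.

```python
def build_prompt(instruction: str) -> str:
    # Hypothetical Alpaca-style instruct template -- adjust to the
    # format the model was actually fine-tuned with.
    return (
        "Below is an instruction that describes a task.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

if __name__ == "__main__":
    # Imported here so the prompt helper above has no dependencies.
    from ctransformers import AutoModelForCausalLM

    llm = AutoModelForCausalLM.from_pretrained(
        "replit-v2-codeinstruct-3b.q4_1.bin",  # assumed local filename
        model_type="replit",
    )
    prompt = build_prompt("Write a Python function that reverses a string.")
    print(llm(prompt, max_new_tokens=128, temperature=0.2))
```

Generation parameters such as `max_new_tokens` and `temperature` are illustrative defaults; tune them for your use case.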