This is an 8.0bpw, h8 quantized version of xingyaoww/CodeActAgent-Mistral-7b-v0.1, produced with exllamav2.
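As a minimal sketch of local inference with the exllamav2 Python library (the model directory path, prompt, and sampling parameters below are illustrative assumptions, not part of this card):

```python
# Sketch: load an exl2-quantized model and generate text with exllamav2.
# Assumes the quantized weights have been downloaded to ./CodeActAgent-Mistral-7b-v0.1-exl2
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./CodeActAgent-Mistral-7b-v0.1-exl2"  # hypothetical local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)          # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7           # illustrative sampling values
settings.top_p = 0.9

prompt = "Write a Python function that reverses a string."
output = generator.generate_simple(prompt, settings, num_tokens=200)
print(output)
```

The same weights can also be served through front ends that embed exllamav2 (for example text-generation-webui or TabbyAPI) by pointing them at the model directory.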