Model Card for madlad400-3b-mt-4bit
🚨 This model is a 4-bit quantized version of Google's madlad400-3b-mt, produced with bitsandbytes. The unquantized madlad400-3b-mt model is available here.
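A minimal usage sketch with 🤗 Transformers. The repository id below is a placeholder for this checkpoint, and the example assumes the bitsandbytes quantization config is stored with the model so the 4-bit weights load automatically; MADLAD-400 MT models are T5-based and expect a target-language token such as `<2de>` prepended to the input text.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Placeholder repo id — substitute the actual id of this 4-bit checkpoint.
model_id = "madlad400-3b-mt-4bit"

# The quantization config saved with the checkpoint lets bitsandbytes
# restore the 4-bit weights; device_map="auto" places layers on the GPU.
model = T5ForConditionalGeneration.from_pretrained(model_id, device_map="auto")
tokenizer = T5Tokenizer.from_pretrained(model_id)

# MADLAD-400 uses a target-language token (here German) before the source text.
inputs = tokenizer("<2de> How are you today?", return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the checkpoint is already quantized, no `BitsAndBytesConfig` needs to be passed at load time.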