amd/Mistral-7B-Instruct-v0.3-awq-g128-int4-asym-fp16-onnx-hybrid
ONNX
License: apache-2.0
Mistral-7B-Instruct-v0.3-awq-g128-int4-asym-fp16-onnx-hybrid
File size: 2 Bytes · commit f8e1493
{}
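
The repository name indicates an AWQ-quantized ONNX export of Mistral-7B-Instruct-v0.3 (group size 128, int4 asymmetric weights, fp16 activations) in AMD's hybrid execution format; the file shown above is an empty JSON placeholder. Below is a minimal sketch of how such a model directory might be run for generation with onnxruntime-genai. The local path, prompt, and generation settings are assumptions, not taken from this page, and the exact API surface varies across onnxruntime-genai releases.

# Hypothetical sketch: generating text from a locally downloaded copy of the
# hybrid ONNX model directory with onnxruntime-genai. Path and settings are
# placeholders, not taken from this repository.
import onnxruntime_genai as og

model_dir = "./Mistral-7B-Instruct-v0.3-awq-g128-int4-asym-fp16-onnx-hybrid"

model = og.Model(model_dir)        # loads the model config and ONNX graph(s) from the directory
tokenizer = og.Tokenizer(model)

prompt = "[INST] What is AWQ quantization? [/INST]"
params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(tokenizer.encode(prompt))

# Decode loop: one token per step until EOS or max_length is reached.
while not generator.is_done():
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))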