Quantized model for vLLM
- tool: AutoAWQ, 4-bit
- calibration: Japanese Wikipedia
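The quantization step described above can be sketched with AutoAWQ. This is a minimal, hedged sketch, not the author's exact script: the `quant_config` values (group size 128, zero-point, GEMM kernel) are common AutoAWQ defaults assumed here, and `calib_texts` stands in for whatever Japanese Wikipedia passages were actually used for calibration. Running it requires a GPU and the full base model weights.

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

base_model = "nitky/RoguePlanet-DeepSeek-R1-Qwen-32B"
quant_path = "RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki"

# Assumed 4-bit AWQ settings; the actual group size/kernel may differ.
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

model = AutoAWQForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

# calib_texts is a placeholder: a list of Japanese Wikipedia passages
# used as calibration data instead of AutoAWQ's default English corpus.
calib_texts = ["..."]  # hypothetical; load your own Japanese Wikipedia samples

model.quantize(tokenizer, quant_config=quant_config, calib_data=calib_texts)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```

Calibrating on Japanese text rather than AutoAWQ's default English corpus is intended to better preserve quality on Japanese inputs.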
For details, see the base model card:
https://huggingface.co/nitky/RoguePlanet-DeepSeek-R1-Qwen-32B
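Since the card targets vLLM, serving the quantized checkpoint might look like the following. This is a sketch assuming a recent vLLM install with enough GPU memory for a 32B 4-bit model; vLLM can usually detect AWQ from the checkpoint config, so the explicit `--quantization awq` flag is optional.

```shell
pip install vllm

# Start an OpenAI-compatible server with the AWQ checkpoint.
vllm serve fujisan/RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki \
  --quantization awq

# Query it (hypothetical prompt, default port 8000).
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "fujisan/RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki", "prompt": "こんにちは", "max_tokens": 64}'
```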
Model tree for fujisan/RoguePlanet-DeepSeek-R1-Qwen-32B-AWQ-calib-wiki
- Base model: nitky/RoguePlanet-DeepSeek-R1-Qwen-32B