I'm only uploading float16 models: the base models are mostly bfloat16, so upcasting them to float32 adds no information.
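The reason upcasting is pointless can be sketched with plain bit manipulation (a stdlib-only illustration, not code from this repo): bfloat16 is just the top 16 bits of a float32, so converting bfloat16 to float32 only zero-pads the mantissa and the roundtrip back is exact.

```python
import struct

def f32_to_bits(x: float) -> int:
    """Reinterpret a float32 as its 32-bit integer pattern."""
    return struct.unpack(">I", struct.pack(">f", x))[0]

def bits_to_f32(b: int) -> float:
    """Reinterpret a 32-bit integer pattern as a float32."""
    return struct.unpack(">f", struct.pack(">I", b))[0]

def to_bf16_bits(x: float) -> int:
    # bfloat16 keeps only the top 16 bits of a float32
    # (truncation here; real converters usually round-to-nearest)
    return f32_to_bits(x) >> 16

def bf16_to_f32(b16: int) -> float:
    # "Upcasting" just zero-pads the low mantissa bits:
    # no new information is created.
    return bits_to_f32(b16 << 16)

x = 3.14159
b16 = to_bf16_bits(x)
y = bf16_to_f32(b16)
# bfloat16 -> float32 -> bfloat16 roundtrips exactly,
# so the float32 copy stores nothing the bfloat16 didn't.
assert to_bf16_bits(y) == b16
```

Note this is why a float32 export of bfloat16 weights doubles the file size without improving precision, while float16 at least keeps the size down (at the cost of a narrower exponent range).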