Is there a safetensors format?
GGUF is the quantized version, and I can't do SFT training with it. Even after conversion there is some loss, especially in the parameter format. Is the original safetensors version available?
Is mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022 the original safetensors?
gguf is not a quantized format; it is a container format, pretty much like safetensors, just with more features. tensors in both safetensors and gguf files can be quantized.
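a quick way to see this for yourself is to list the per-tensor types in each container. this is just a minimal sketch, assuming the `safetensors` and `gguf` python packages (plus torch) are installed and using placeholder file names:

```python
# sketch: print per-tensor dtypes in a safetensors file and in a gguf file
# "model.safetensors" / "model.gguf" are placeholder paths
from safetensors import safe_open
from gguf import GGUFReader

# safetensors: dtypes are usually F32/F16/BF16; quantized checkpoints
# (e.g. GPTQ/AWQ exports) store integer/fp8 tensors plus scale tensors
with safe_open("model.safetensors", framework="pt") as f:
    for name in f.keys():
        t = f.get_tensor(name)
        print(name, t.dtype, tuple(t.shape))

# gguf: tensor_type may be F32/F16 or a gguf-specific quant type (Q4_K, Q8_0, ...)
reader = GGUFReader("model.gguf")
for t in reader.tensors:
    print(t.name, t.tensor_type.name, list(t.shape))
```

the gguf-specific quant types in the second loop are exactly what a converter back to safetensors has to dequantize, which is where the loss the thread mentions comes from.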
as for your question, could you explain what is confusing about the model page? it answers your question in the first sentence.
In fact, I have tried converting several GGUF models before, but the results were not as expected because of the many GGUF-specific quantization types. That is why I want to find the safetensors version.
I actually wanted to confirm it, but I didn't understand why you put the safetensors version under mlfoundations-dev instead of mlfoundations until I read https://huggingface.co/mradermacher/model_requests
I think I understand the problem now: we (mradermacher) only provide gguf quants of other people's models. we didn't put anything on mlfoundations(-dev).
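so if the goal is SFT on the original weights, you would load the mlfoundations-dev repo directly rather than any gguf quant. a minimal sketch, assuming `transformers` is installed and that repo is the one you want:

```python
# sketch: load the original (unquantized) safetensors checkpoint for SFT
# repo id taken from the thread above; adjust dtype/device to your setup
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype="auto")
# from here, hand `model` and `tokenizer` to your SFT trainer of choice
```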