Base_model showing up as finetune
base_model: Replete-AI/Replete-LLM-Qwen2-7b
The Hub will infer the type of relationship from the current model to the base model ("adapter", "merge", "quantized", "finetune") but you can also set it explicitly if needed: base_model_relation: quantized for instance.
https://huggingface.co/docs/hub/model-cards#specifying-a-base-model
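Per those docs, the relation can be declared explicitly in the model card's YAML front matter. A minimal sketch (the repo name is taken from the snippet above; adjust to your actual base model):

```yaml
---
# README.md front matter for the quantized repo
base_model: Replete-AI/Replete-LLM-Qwen2-7b
# Explicitly tell the Hub this repo is a quantization of the base model,
# instead of letting it infer the relationship
base_model_relation: quantized
---
```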
I guess it wasn't able to infer the type on its own. Does it use the file types to do that, or the title?
This is all very new functionality; if it's broken, they will fix it.
Since there's no model in the main branch, it wasn't able to infer the type. I'm not sure using branches for the different quants is the best way. Maybe consider using just folders?
Folders also mess with things: people will struggle with downloading (and will likely end up downloading 5 quant sizes in one command). I think branches have served me reasonably well.