#11 "Add conversational tag" (1 reply), opened about 2 months ago by celinah
#10 "llama.cpp doesn't support this model, how can I convert safetensors model to bin and load in ollama", opened 2 months ago by shuminzhou26803586
#9 "Update chat_template.json to incorporate `generation` tag", opened 3 months ago by zjysteven
#8 "RuntimeError: Could not infer dtype of numpy.float32 when converting to PyTorch tensor" (1 reply), opened 4 months ago by Koshti10
#7 "shape mismatch error during inference with finetuned Model" (6 replies), opened 6 months ago by mdmev
#5 "Why no chat template like non-chatty has?", opened 7 months ago by pseudotensor
#4 "How to merge an adapter to the base model" (1 reply), opened 7 months ago by alielfilali01
#3 "How to deploy on inference endpoints?" (2 replies), opened 7 months ago by brianjking
#2 "Update README.md", opened 7 months ago by Alexander70
#1 "[Question] question about hyperparameter" (1 reply), opened 7 months ago by Lala-chick