A new version of Mistral is out! I'd love to see a new neural chat
#20 opened 8 months ago by rombodawg
Model not found when using OVModelForCausalLM
#17 opened 11 months ago by thirdrock
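For reference, loading this model through Optimum Intel's OpenVINO backend usually looks like the sketch below. It assumes `optimum[openvino]` is installed; `export=True` converts the PyTorch checkpoint to OpenVINO IR on the fly rather than expecting a pre-exported model in the repo.

```python
# Minimal sketch: load Intel/neural-chat-7b-v3-1 with Optimum Intel's OpenVINO backend.
# Assumes `pip install optimum[openvino]`; export=True converts the checkpoint to
# OpenVINO IR on the fly instead of requiring a pre-exported model in the repository.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```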
How to load only the PyTorch shards (not safetensors), so that just the PyTorch model is loaded onto the GPU from Hugging Face?
#16 opened 11 months ago by bilwa99
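`transformers` exposes a `use_safetensors` flag on `from_pretrained` for exactly this; a short sketch, assuming a recent transformers release with accelerate installed:

```python
# Sketch: force loading of the PyTorch .bin shards instead of the safetensors weights.
# use_safetensors=False tells transformers to ignore *.safetensors files when both
# formats are present in the repository.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Intel/neural-chat-7b-v3-1",
    use_safetensors=False,      # load the pytorch_model-*.bin shards
    torch_dtype=torch.float16,
    device_map="auto",          # place the weights on the available GPU(s)
)
```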
Add extra `metadata` in `README.md`
#15 opened 11 months ago by alvarobartt
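Model-card metadata on the Hub lives in the YAML front matter of `README.md` and can be edited programmatically; a hedged sketch using `huggingface_hub.metadata_update` (the keys shown here are illustrative, not the ones proposed in the thread):

```python
# Sketch: add extra model-card metadata to the repo's README.md front matter.
# metadata_update commits an edit to the YAML block at the top of README.md and
# requires a token with write access. The keys below are illustrative only.
from huggingface_hub import metadata_update

metadata_update(
    repo_id="Intel/neural-chat-7b-v3-1",
    metadata={
        "language": ["en"],
        "tags": ["mistral", "neural-chat"],
    },
    overwrite=False,   # only add keys that are not already present
    token="hf_...",    # placeholder write token
)
```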
Other benchmarks such as MT-Bench and/or AlpacaEval (2 comments)
#14 opened 11 months ago by alvarobartt
About DROP results within the `lm-eval-harness` (4 comments)
#13 opened 11 months ago by alvarobartt
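For anyone trying to reproduce DROP numbers, a rough sketch of running the task through the harness's Python API; both the lm-eval version (>= 0.4) and the 3-shot setting are assumptions and may differ from the setup discussed in the thread:

```python
# Rough sketch: score Intel/neural-chat-7b-v3-1 on DROP with lm-eval-harness.
# Assumes `pip install lm-eval` (>= 0.4); the 3-shot setting is an assumption.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Intel/neural-chat-7b-v3-1,dtype=bfloat16",
    tasks=["drop"],
    num_fewshot=3,
)
print(results["results"]["drop"])
```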
Request: DOI (1 comment)
#12 opened 11 months ago by Sintayew4
Potential ways to reduce inference latency on CPU cluster? (2 comments)
#11 opened 11 months ago by TheBacteria
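One commonly suggested route for CPU inference with this model family is Intel Extension for PyTorch; a hedged sketch of `ipex.optimize` with bfloat16 (an illustration, not the thread's answer):

```python
# Sketch: optimize the model for CPU inference with Intel Extension for PyTorch.
# Assumes `pip install intel-extension-for-pytorch` and a CPU with bfloat16 support.
import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model = ipex.optimize(model, dtype=torch.bfloat16)  # fuse ops, use oneDNN kernels
model.eval()

with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    inputs = tokenizer("What is Intel neural-chat?", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```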
Add base_model metadata
#9 opened 12 months ago by davanstrien
Context Length (2 comments)
#7 opened 12 months ago by mrfakename
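The context length advertised by the checkpoint can be read directly from its configuration; a small sketch (`max_position_embeddings` is the standard attribute for Mistral-style configs):

```python
# Sketch: check the context length declared in the model configuration.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("Intel/neural-chat-7b-v3-1")
print(config.max_position_embeddings)  # maximum sequence length in the config
```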
Free and ready to use neural-chat-7B-v3-1-GGUF model as OpenAI API compatible endpoint
#6 opened 12 months ago by limcheekin
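A GGUF build served behind an OpenAI-compatible endpoint can be queried with the standard openai client; a sketch in which the base URL, API key, and model name are placeholders for whatever the hosted endpoint actually exposes:

```python
# Sketch: query an OpenAI-compatible endpoint serving the GGUF build.
# base_url, api_key, and model are placeholders; use the values the endpoint publishes.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="neural-chat-7b-v3-1-gguf",
    messages=[{"role": "user", "content": "Summarize what neural-chat is."}],
)
print(response.choices[0].message.content)
```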
What is the difference between Intel/neural-chat-7b-v3-1 and Intel/neural-chat-7b-v3? (3 comments)
#3 opened 12 months ago by Ichsan2895
Prompt Template? (13 comments)
#1 opened 12 months ago by fakezeta
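For context, neural-chat-7b-v3-1 is commonly reported to use an Orca-style "### System / ### User / ### Assistant" prompt; a sketch that builds such a prompt, treating the exact template as an assumption to confirm against the model card and the thread:

```python
# Sketch of the Orca-style prompt format commonly reported for neural-chat-7b-v3-1.
# Verify the exact template against the model card before relying on it.
def build_prompt(system: str, user: str) -> str:
    return (
        f"### System:\n{system}\n"
        f"### User:\n{user}\n"
        f"### Assistant:\n"
    )

print(build_prompt("You are a helpful assistant.", "What is OpenVINO?"))
```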