A new version of Mistral is out! I'd love to see a new neural chat
#20 opened 9 months ago by rombodawg
Model not found when using OVModelForCausalLM
#17 opened about 1 year ago by thirdrock
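For the OVModelForCausalLM question above, a minimal sketch (an assumption, not the thread's actual resolution) of the usual loading path with optimum-intel: passing `export=True` converts the PyTorch checkpoint to an OpenVINO IR on the fly when the repository ships no pre-exported OpenVINO files, which is a common cause of "model not found"-style errors.

```python
# Hedged sketch: export=True asks optimum-intel to convert the PyTorch checkpoint
# to an OpenVINO IR at load time when no OpenVINO files exist in the repo.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```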
How to load the PyTorch shards only (not the safetensors), so that only the PyTorch model is loaded onto the GPU from Hugging Face?
#16 opened about 1 year ago by bilwa99
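For the question above, a hedged sketch of how `transformers` can be told to prefer the PyTorch `.bin` shards over the safetensors files via `use_safetensors=False`; the dtype and device placement shown are illustrative choices, not settings from the thread.

```python
# Hedged sketch: use_safetensors=False makes transformers load the
# pytorch_model-*.bin shards even when *.safetensors files are also present.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    use_safetensors=False,      # prefer the PyTorch shards
    torch_dtype=torch.float16,  # illustrative dtype
    device_map="auto",          # place weights on the available GPU(s)
)
```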
Add extra `metadata` in `README.md`
#15 opened about 1 year ago by alvarobartt
Other benchmarks such as MT-Bench and/or AlpacaEval (2)
#14 opened about 1 year ago by alvarobartt
About DROP results within the `lm-eval-harness` (4)
#13 opened about 1 year ago by alvarobartt
Request: DOI (1)
#12 opened about 1 year ago by Sintayew4
Potential ways to reduce inference latency on a CPU cluster? (2)
#11 opened about 1 year ago by TheBacteria
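A hedged sketch of one common CPU-latency tactic (bfloat16 weights plus explicit thread pinning); the thread may well discuss other routes such as OpenVINO or IPEX, and the thread count below is an assumed value to tune per node.

```python
# Hedged sketch: bf16 weights and an explicit per-node thread count are one
# common way to trim CPU inference latency; tune both to the actual hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_num_threads(32)  # assumed cores per node; adjust to your CPUs

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

with torch.inference_mode():
    inputs = tokenizer("Summarize: latency matters.", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```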
Add `base_model` metadata
#9 opened about 1 year ago by davanstrien
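A minimal sketch of how such metadata could be added with `huggingface_hub.metadata_update`, assuming write access to the repository; the model card names mistralai/Mistral-7B-v0.1 as the base model, but verify the value before committing.

```python
# Hedged sketch: add a base_model entry to the model card's YAML front matter.
# Requires maintainer permissions on the target repository.
from huggingface_hub import metadata_update

metadata_update(
    repo_id="Intel/neural-chat-7b-v3-1",
    metadata={"base_model": "mistralai/Mistral-7B-v0.1"},
    overwrite=False,  # do not silently replace an existing, different value
)
```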
Context Length (2)
#7 opened about 1 year ago by mrfakename
Free and ready-to-use neural-chat-7B-v3-1-GGUF model as an OpenAI-API-compatible endpoint
#6 opened about 1 year ago by limcheekin
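A hedged sketch of calling such an endpoint with the OpenAI Python client; the base URL, API key, and model name below are placeholders rather than the endpoint advertised in the thread.

```python
# Hedged sketch: query any OpenAI-compatible server hosting the GGUF build
# (placeholder base_url, api_key, and model name; not the thread's endpoint).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="neural-chat-7b-v3-1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Give me one sentence about OpenVINO."},
    ],
)
print(response.choices[0].message.content)
```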
What is the difference between Intel/neural-chat-7b-v3-1 and Intel/neural-chat-7b-v3? (3)
#3 opened about 1 year ago by Ichsan2895
Prompt Template? (13)
#1 opened about 1 year ago by fakezeta
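For the prompt-template question, a sketch that follows the `### System:` / `### User:` / `### Assistant:` layout documented on the model card; the exact newlines and the system message are taken from the card as best understood and should be checked against the thread's answers.

```python
# Hedged sketch: build the "### System / ### User / ### Assistant" prompt
# layout from the model card and generate a reply.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Intel/neural-chat-7b-v3-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

system = "You are a chatbot developed by Intel. Please answer all questions to the best of your ability."
user = "What is the capital of France?"
prompt = f"### System:\n{system}\n### User:\n{user}\n### Assistant:\n"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```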