vLLM compatible?

#10
by nickandbro - opened

Would be nice to see this in vLLM

Hello @nickandbro,

Please make sure that you've enabled vLLM in your local apps at https://huggingface.co/settings/local-apps

Afterwards, you should see the vLLM snippet under the "Use this model" dropdown on https://huggingface.co/nvidia/Llama-3_1-Nemotron-51B-Instruct

Please let me know if it works.

Thanks!
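In the meantime, here is a minimal sketch of loading the model with vLLM's offline Python API. It is not the exact snippet from the "Use this model" dropdown (that remains the canonical reference); the model ID comes from the link above, while `tensor_parallel_size`, `trust_remote_code`, and the sampling settings are assumptions you should adjust for your hardware and the model card:

```python
# Minimal sketch: offline generation with vLLM's Python API.
# The snippet under "Use this model" is the canonical version;
# the parameters below are assumptions for illustration only.
from vllm import LLM, SamplingParams

llm = LLM(
    model="nvidia/Llama-3_1-Nemotron-51B-Instruct",  # model ID from the link above
    tensor_parallel_size=4,   # assumption: split the 51B weights across 4 GPUs
    trust_remote_code=True,   # assumption: the custom architecture may require it
)

sampling = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Hello, what can you do?"], sampling)
print(outputs[0].outputs[0].text)
```

Depending on your vLLM version, you can also start an OpenAI-compatible server with `vllm serve nvidia/Llama-3_1-Nemotron-51B-Instruct` (or `python -m vllm.entrypoints.openai.api_server --model ...` on older releases).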

[Screenshot attached: Screenshot from 2024-10-25 14-24-18.png]

Hi, I am getting the error shown in the screenshot above. Could you please help me fix it?
