Something goes wrong when deploying this model to a local server.

#2 opened by Axinx

Hi! I'm facing a strange issue when deploying this model locally as an API server. When I make an API call with a long code question, the response I receive is just content='\n'. I chose this model because deepseek-coder performs well on my tasks and its 32k context fits them, but this deployment isn't working properly even when the context is short enough that deepseek-coder-33b-instruct returns a good response. What could be wrong with my deployment of this model? A minimal sketch of the kind of call I make is below.
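
Here is roughly how I call the server, assuming an OpenAI-compatible local endpoint (the base URL, port, and model id below are placeholders for my actual setup):

from openai import OpenAI

# Placeholder endpoint and model id; my real server runs locally.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

long_code_question = "..."  # placeholder for a long code-related prompt

response = client.chat.completions.create(
    model="local-openbuddy-model",  # placeholder; whatever id the local server exposes
    messages=[{"role": "user", "content": long_code_question}],
    max_tokens=1024,
)

print(response.choices[0].message)  # prints content='\n' instead of an answer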

OpenBuddy org

What prompt format are you currently using?
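
If you're not sure, one way to see the format the model expects is to render it from the tokenizer's chat template. This is a rough sketch, assuming your local model files include a chat template (the path is a placeholder):

from transformers import AutoTokenizer

# Placeholder path; point this at the local copy of the model.
tokenizer = AutoTokenizer.from_pretrained("/path/to/local/openbuddy/model")

messages = [{"role": "user", "content": "Write a quicksort in Python."}]

# Render the exact prompt string the model expects, including the assistant prefix.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)

You can compare that output with the prompt your API server actually sends to the model; a mismatch there often produces empty or near-empty completions.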
