HELP: How to host a Mixtral model as an OpenAI-compatible server

#51
by TurtleRuss - opened

I know that Mistral's hosted API is OpenAI-compatible and can be used online through openai-python. Here is the thing: I want to host a Mixtral model locally and point the client at it by changing the base_url parameter, like:

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:12345",
    api_key="EMPTY",  # local servers typically ignore the key, but the client requires one
)
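For concreteness, here is a minimal sketch of the full round trip I have in mind, assuming an OpenAI-compatible server (for example something like vLLM's openai.api_server entrypoint) is already listening on the port; the model id, port, and flags below are just my assumptions, not a confirmed setup:

# Assumes a local OpenAI-compatible server is already running, e.g. started with
# something along the lines of (my assumption, adjust to your serving stack):
#   python -m vllm.entrypoints.openai.api_server \
#       --model mistralai/Mixtral-8x7B-Instruct-v0.1 --port 12345
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:12345/v1",  # many compatible servers expose the API under /v1
    api_key="EMPTY",                       # placeholder; local servers usually ignore it
)

# Same chat.completions call shape as the hosted OpenAI / Mistral APIs
response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # placeholder model id
    messages=[{"role": "user", "content": "Hello, Mixtral!"}],
)
print(response.choices[0].message.content)

If that is roughly the right shape, my question is mainly about which server to run and what base_url/path it expects.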

Thanksss:)
