Run the Mistral model on a remote server

#94
by icemaro - opened

I am wondering if I can run this model on a remote server?
Is there a way to consume it with a UI, one like LM Studio, except that the model will run on the remote server?

Ollama + a web UI (easy to run in Docker).
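For reference, once Ollama is serving on the remote box, you can talk to it from any machine with a few lines of Python. This is just a sketch: `my-remote-server` is a placeholder hostname, 11434 is Ollama's default port, and it assumes you've already run `ollama pull mistral` on the server.

```python
import requests

# Placeholder host -- point this at your own server running `ollama serve`.
# 11434 is Ollama's default port.
OLLAMA_URL = "http://my-remote-server:11434/api/generate"

def ask_mistral(prompt: str) -> str:
    """Send a prompt to the remote Ollama instance and return its reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": "mistral",  # assumes `ollama pull mistral` was run on the server
            "prompt": prompt,
            "stream": False,     # return one JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(ask_mistral("Summarize what Mistral 7B is in one sentence."))
```

A web UI such as Open WebUI then gives you the LM Studio-style chat interface on top of the same server.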

deleted • edited Jan 15

I use ooba's text-generation-webui as a server.

It's set up to run as a service under Linux. Just pass parameters to it (e.g. --listen and --api) to make it accessible on your network.
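When it's started with `--listen --api`, text-generation-webui also exposes an OpenAI-compatible endpoint (port 5000 by default), so you can point any OpenAI client at it from another machine. A rough sketch, with `my-remote-server` as a placeholder host:

```python
from openai import OpenAI

# Placeholder host; text-generation-webui's OpenAI-compatible API
# listens on port 5000 by default when launched with --listen --api.
client = OpenAI(
    base_url="http://my-remote-server:5000/v1",
    api_key="unused",  # the server ignores the key, but the client requires one
)

completion = client.chat.completions.create(
    model="mistral",  # largely cosmetic: ooba serves whichever model is loaded
    messages=[{"role": "user", "content": "Say hello from the remote server."}],
)
print(completion.choices[0].message.content)
```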

I want to run the Mistral model

deleted

> I want to run the Mistral model

And you've got 2.5 suggestions now for doing it.

I have put together a simple implementation guide using Runpod (for GPU), Google Colab (for inference), and Gradio (for UI) here:
https://github.com/aigeek0x0/radiantloom-ai/blob/main/mixtral-8x7b-instruct-v-0.1-runpod-template.md
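To give a flavor of the Gradio part of that guide, here is a minimal sketch that puts a text box in front of a remote endpoint. The URL is a placeholder (the guide wires this up to a Runpod endpoint instead), and I'm reusing the Ollama-style API from above just for illustration:

```python
import gradio as gr
import requests

# Placeholder endpoint -- swap in whatever backend you actually expose
# (in the guide, this would be the Runpod inference URL).
API_URL = "http://my-remote-server:11434/api/generate"

def chat(prompt: str) -> str:
    """Forward the prompt to the remote model and return its completion."""
    resp = requests.post(
        API_URL,
        json={"model": "mistral", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

# One text box in, one text box out -- launch() serves the UI.
gr.Interface(fn=chat, inputs="text", outputs="text", title="Remote Mistral").launch()
```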
