OpenAI??

#2
by jostelo - opened

Is this actually a demo of your models, or are you just using OpenAI?

(screenshot attached: image.png)

Am I missing something?

The Space is not running with a GPU. They are probably just running an OpenAI-compatible backend like vLLM or TGI somewhere else (maybe on the hessianai cluster?). That's why they change the api_base to point to their own deployment server.

So no, it is not just using OpenAI's models! Without the key it wouldn't work 😉
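For anyone curious, the pattern looks roughly like this. This is just a minimal sketch using the current openai Python SDK; the server URL, model name, and key below are placeholders, not the actual deployment:

```python
from openai import OpenAI

# Hypothetical OpenAI-compatible server (e.g. a vLLM or TGI deployment).
# The URL and model name are placeholders, not the real endpoint.
client = OpenAI(
    base_url="http://my-inference-server:8000/v1",  # instead of api.openai.com
    api_key="EMPTY",  # many self-hosted backends accept any placeholder key
)

response = client.chat.completions.create(
    model="my-org/my-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```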

Thank you very much for the reply! I think I understand now. They basically mimicked the OpenAI API for their model, correct? That is what got me confused.

Yes, many inference libraries mimic the OpenAI API, which makes it possible to use the openai pip package for inference just by changing the API endpoint.
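With the older openai package (pre-1.0, which is what setting api_base suggests), the same idea looks like this. Again just a sketch with placeholder values:

```python
import openai

# Point the legacy (<1.0) openai package at a self-hosted,
# OpenAI-compatible endpoint; URL and model name are placeholders.
openai.api_base = "http://my-inference-server:8000/v1"
openai.api_key = "EMPTY"  # placeholder; the real deployment may still require a key

response = openai.ChatCompletion.create(
    model="my-org/my-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```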
