
OpenOrcaxOpenChat-Preview2-13B - Settings for Use

#3 · opened by HighlandGNU

I thought the good folk who brought us Open-Orca might be interested in how their model is used in the wild, and might be able to help with some settings.

Many people use oobabooga's text-generation-webui to load models and configure settings. There are quite a lot of options; which ones would help get the most out of this model?

https://github.com/oobabooga/text-generation-webui

[Three attached screenshots of oobabooga settings pages]

OpenOrca org

You can find instructions for proper setup in the model card under this section:
https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B#oobabooga-chat-settings
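
For reference, here is a minimal sketch of loading the model directly with the transformers library outside of oobabooga. The single-turn prompt string below is an assumption based on the OpenChat-style format this model follows; treat the "oobabooga chat settings" section linked above as the authoritative template.

```python
# Minimal sketch: running OpenOrcaxOpenChat-Preview2-13B with transformers.
# The prompt format is assumed here; see the model card for the exact template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Open-Orca/OpenOrcaxOpenChat-Preview2-13B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Assumed OpenChat-style single-turn prompt: "User: ...<|end_of_turn|>Assistant:"
prompt = "User: What is the capital of France?<|end_of_turn|>Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```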

bleysg changed discussion status to closed
