
To set up oobabooga on a Linux machine, you need to follow these steps:

Install the prerequisites for oobabooga, such as Python, CUDA, PyTorch, and git. You can use your package manager or pip to install them. You can also refer to this guide for more details: https://gist.github.com/lxe/82eb87db25fdb75b92fa18a6d494ee3c
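For example, a minimal sketch of this step assuming a conda-based setup like the one in the linked gist (the environment name `textgen` and the package choices are placeholders, not exact commands):

```bash
# Create and activate a dedicated conda environment (the name is arbitrary)
conda create -n textgen python=3.10
conda activate textgen

# Install PyTorch with CUDA support; pick the build that matches your CUDA/driver version
pip install torch torchvision torchaudio

# Install git via the system package manager, e.g. on Debian/Ubuntu
sudo apt install git
```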

Clone the oobabooga repository and the GPTQ-for-LLaMa repository using git. You can use the git clone command to copy the repositories to your machine. You can also refer to this guide for more details: https://github.com/oobabooga/text-generation-webui/wiki/Windows-installation-guide
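For example (the GPTQ-for-LLaMa URL and the repositories/ subfolder layout below follow the linked guide's convention; double-check them against the guide before running):

```bash
# Clone the web UI
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui

# Clone GPTQ-for-LLaMa into the repositories/ subfolder where the web UI looks for it
mkdir -p repositories
cd repositories
git clone https://github.com/qwopqwop200/GPTQ-for-LLaMa
cd ..
```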

Download the facebook/opt-6.7b model using the download-model.py script. You can use the python command to run the script with the model name as an argument. You can also refer to this guide for more details: https://github.com/oobabooga/text-generation-webui/wiki/Windows-installation-guide
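For example, from the text-generation-webui folder:

```bash
# Download the model weights into the models/ subfolder
python download-model.py facebook/opt-6.7b
```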

Install the GPTQ-for-LLaMa library using the setup_cuda.py script. You can use the python command to run the script with the install option. You can also refer to this guide for more details: https://github.com/oobabooga/text-generation-webui/wiki/Windows-installation-guide
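For example, from the GPTQ-for-LLaMa folder cloned in the earlier step (building the CUDA kernel requires the CUDA toolkit, i.e. nvcc, to be installed):

```bash
cd text-generation-webui/repositories/GPTQ-for-LLaMa
python setup_cuda.py install
```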

Run the oobabooga web interface using the server.py script. You can use the python command to run the script, as shown in the example below. You can also refer to this guide for more details: https://github.com/oobabooga/text-generation-webui/wiki/Windows-installation-guide
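Example, with the conda environment from the first step activated and run from the text-generation-webui folder:

```bash
conda activate textgen
python server.py --share --auto-devices --chat
```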

Open your browser and go to http://127.0.0.1:7860, or to the public link printed in the terminal (of the form https://some-code-generated.gradio.live when --share is used), to access the oobabooga web interface.
