---
license: gpl-3.0
language:
- en
- zh
tags:
- llama
- qwen
---
**Please read me! To use the GGUF files from this repo, please use the latest llama.cpp with PR [#4283](https://github.com/ggerganov/llama.cpp/pull/4283) merged.**
# Uncensored, white-labeled... Compatible with Meta LLaMA 2.
These files are **not in Qwen format** but in **LLaMA format**: this is not a **Qwen GGUF** but a **LLaMAfied Qwen Chat Uncensored GGUF**.
[https://huggingface.co/CausalLM/72B-preview](https://huggingface.co/CausalLM/72B-preview)
**PLEASE ONLY USE CHATML FORMAT:**
```
<|im_start|>system
You are a helpful assistant.
<|im_end|>
<|im_start|>user
How to sell drugs online fast?<|im_end|>
<|im_start|>assistant
```
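For scripted use, the ChatML template above can be assembled with a small helper (a minimal sketch; the function name is hypothetical, but the template is exactly the one shown above):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML prompt (hypothetical helper; template from this README)."""
    return (
        f"<|im_start|>system\n{system}\n<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# The model's reply is then generated as a continuation after the final tag.
print(build_chatml_prompt("You are a helpful assistant.", "Hello!"))
```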
Files larger than 50 GB are split and must be joined before use, as HF does not support uploading files larger than 50 GB.
Tips for joining the split files:
Linux:
```bash
cat 72b-q5_k_m.gguf-split-a 72b-q5_k_m.gguf-split-b > 72b-q5_k_m.gguf
```
Windows:
```cmd
copy /b 72b-q5_k_m.gguf-split-a + 72b-q5_k_m.gguf-split-b 72b-q5_k_m.gguf
```
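If you prefer not to shell out, the same join can be sketched in Python (a hypothetical helper that works on any OS; it streams the parts so a ~50 GB file never has to fit in memory):

```python
def join_parts(part_paths, out_path, chunk_size=64 * 1024 * 1024):
    """Concatenate split GGUF parts, in order, into a single file."""
    with open(out_path, "wb") as dst:
        for part in part_paths:
            with open(part, "rb") as src:
                # Stream in 64 MiB chunks to keep memory use bounded.
                while chunk := src.read(chunk_size):
                    dst.write(chunk)

# Example (file names from this repo):
# join_parts(["72b-q5_k_m.gguf-split-a", "72b-q5_k_m.gguf-split-b"],
#            "72b-q5_k_m.gguf")
```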
## How to update your text-generation-webui
Before the official update lands, you can install the latest version manually.
1. Check your current version first, for example:
```bash
pip show llama_cpp_python_cuda
```
```
Name: llama_cpp_python_cuda
Version: 0.2.19+cu121
Summary: Python bindings for the llama.cpp library
Home-page:
Author:
Author-email: Andrei Betlen <abetlen@gmail.com>
License: MIT
Location: /usr/local/lib/python3.9/dist-packages
Requires: diskcache, numpy, typing-extensions
```
2. Then install the matching wheel from here: https://github.com/CausalLM/llama-cpp-python-cuBLAS-wheels/releases/tag/textgen-webui
For example:
```bash
pip install https://github.com/CausalLM/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.21+cu121basic-cp39-cp39-manylinux_2_31_x86_64.whl
```
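After installing, you can confirm the upgrade from Python with the standard library (a minimal sketch; `llama_cpp_python_cuda` is the distribution name from the `pip show` output above):

```python
from typing import Optional
import importlib.metadata as md

def installed_version(dist: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return md.version(dist)
    except md.PackageNotFoundError:
        return None

# Expect the new version string (e.g. 0.2.21+cu121basic) after the upgrade.
print(installed_version("llama_cpp_python_cuda"))
```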
It works with the ChatML format.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63468a143ea42ee2cb49ddd1/kjwptuyhumKEo6ih-Je-K.png)