Update README.md
README.md CHANGED
@@ -140,6 +140,15 @@ It was created with groupsize 64 to give higher inference quality, and without `
 * Does not work with any version of GPTQ-for-LLaMa
 * Parameters: Groupsize = 64. No act-order.
 
+## Want to support my work?
+
+I've had a lot of people ask if they can contribute. I love providing models and helping people, but it is starting to rack up pretty big cloud computing bills.
+
+So if you're able and willing to contribute, it'd be most gratefully received and will help me to keep providing models, and work on various AI projects.
+
+* Patreon: coming soon! (just awaiting approval)
+* Ko-Fi: https://ko-fi.com/TheBlokeAI
+
 # ✨ Original model card: Falcon-7B-Instruct
 
 **Falcon-7B-Instruct is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and finetuned on a mixture of chat/instruct datasets. It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b-instruct/blob/main/LICENSE.txt).**
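For readers unfamiliar with the quantization settings mentioned in the unchanged context lines ("Groupsize = 64. No act-order.", and incompatibility with GPTQ-for-LLaMa), the sketch below shows how those parameters would typically map onto an AutoGPTQ load call. The repository id, the 4-bit width, and the prompt are illustrative assumptions, not taken from this diff.

```python
# Minimal sketch: loading a GPTQ checkpoint quantized with group size 64 and
# no act-order (desc_act=False) via AutoGPTQ, since the README notes the
# checkpoint does not work with GPTQ-for-LLaMa.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

model_id = "TheBloke/falcon-7b-instruct-GPTQ"  # assumed repo id, for illustration

# Mirrors "Groupsize = 64. No act-order." from the model card; bits=4 is assumed.
quantize_config = BaseQuantizeConfig(bits=4, group_size=64, desc_act=False)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    quantize_config=quantize_config,
    use_safetensors=True,
    trust_remote_code=True,  # Falcon shipped custom modelling code at release
    device="cuda:0",
)

prompt = "Write a short poem about open-source AI."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0]))
```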