maddes8cht committed
Commit: 01c3f56
Parent(s): 9a7e8ad
"Update README.md"
README.md CHANGED
@@ -12,7 +12,13 @@ I am continuously enhancing the structure of these model descriptions, and they
 
 # Note: Important Update for Falcon Models in llama.cpp Versions After October 18, 2023
 
-As noted on the [Llama.cpp](https://github.com/ggerganov/llama.cpp#hot-topics) GitHub repository, all new releases of llama.cpp will require a re-quantization due to the implementation of the new BPE tokenizer
+As noted on the [Llama.cpp](https://github.com/ggerganov/llama.cpp#hot-topics) GitHub repository, all new releases of llama.cpp will require a re-quantization due to the implementation of the new BPE tokenizer, which impacts both the original Falcon models and their derived variants.
+
+Here's what you need to know:
+
+**Original Falcon Models:** I am working diligently to provide updated quantized versions of the four original Falcon models so that they stay compatible with the new llama.cpp versions. Please keep an eye on my Hugging Face model pages for updates on their availability; downloading the updated files promptly is essential to stay compatible with the latest llama.cpp releases.
+
+**Derived Falcon Models:** The derived Falcon models cannot be re-converted without adjustments from the original model creators, so their compatibility with the new llama.cpp versions depends on those creators. So far, these models cannot be used with recent llama.cpp versions at all.
 
 **Stay Informed:** Updated releases of application software that uses the llama.cpp libraries will follow soon. Keep an eye on the release schedules of your favorite applications that rely on llama.cpp; they will likely provide instructions on how to switch to the new models.
 
@@ -67,7 +73,6 @@ With a Q6_K you should find it really hard to find a quality difference to the o
 ## Please consider supporting my work
 **Coming Soon:** I'm in the process of launching a sponsorship/crowdfunding campaign for my work. I'm evaluating Kickstarter, Patreon, or the new GitHub Sponsors platform, and I'm hoping for some support and contributions toward the continued availability of these kinds of models. Your support will enable me to provide even more valuable resources and to maintain the models you rely on. Your patience and ongoing support are greatly appreciated as I work to make this page an even more valuable resource for the community.
 
-
 <center>
 
 [![GitHub](https://maddes8cht.github.io/assets/buttons/github-io-button.png)](https://maddes8cht.github.io)
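
For anyone who maintains their own conversions instead of waiting for the updated uploads, the re-quantization this commit announces follows the usual llama.cpp convert-then-quantize flow. The sketch below is only illustrative and assumes a llama.cpp checkout newer than October 18, 2023; the converter script name `convert-falcon-hf-to-gguf.py`, its arguments, the local paths, and the quantization type are assumptions about that era's tooling, so verify them against the llama.cpp README before running.

```python
# Minimal sketch of re-quantizing an original Falcon model for post-October-18-2023 llama.cpp.
# Assumptions: a llama.cpp checkout containing the Falcon converter script and a built
# `quantize` binary; the exact script name and arguments vary between versions,
# so check your checkout before running.
import subprocess
from pathlib import Path

model_dir = Path("falcon-7b")                  # local Hugging Face snapshot of the original model (assumed path)
f16_gguf = model_dir / "ggml-model-f16.gguf"   # intermediate full-precision GGUF (assumed output name)
out_gguf = Path("falcon-7b-Q4_K_M.gguf")       # final quantized file

# Step 1: convert the Hugging Face weights to GGUF so the file carries the new BPE tokenizer.
# "1" selected f16 output in converter scripts of that era (assumption; check --help).
subprocess.run(
    ["python3", "convert-falcon-hf-to-gguf.py", str(model_dir), "1"],
    check=True,
)

# Step 2: quantize the freshly converted GGUF; previously quantized files cannot be reused.
subprocess.run(
    ["./quantize", str(f16_gguf), str(out_gguf), "Q4_K_M"],
    check=True,
)
```

Files produced this way should carry the new BPE tokenizer, which is why, per the note above, GGUF files quantized before the change cannot simply be reused.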