Updated with correct pre-tokenizer
In preparation for a full requantization with the correct pre-tokenization and the gpt2 tokenizer, see the following PRs (a quick metadata check is sketched after the list):
- [llama : improve BPE pre-processing + LLaMA 3 and Deepseek support](https://github.com/ggerganov/llama.cpp/pull/6920) - sets the correct pre-tokenizer, applied to the new fp16.gguf
- [llama3 custom regex split](https://github.com/ggerganov/llama.cpp/pull/6965) - fixes the gpt2 tokenization
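To confirm that the re-exported fp16.gguf actually records a pre-tokenizer, the `tokenizer.ggml.pre` metadata key introduced by the first PR can be read back with llama.cpp's gguf-py package. This is a minimal sketch, assuming `pip install gguf` and a local copy of the file; the field-decoding detail (last part holds the string bytes) follows gguf-py's reader and may differ between versions:

```python
# Minimal sketch: read the pre-tokenizer metadata from the updated GGUF file.
# Assumes llama.cpp's gguf-py package ("pip install gguf") and a local path to
# gorilla-openfunctions-v2.fp16.gguf; adjust the path for your setup.
from gguf import GGUFReader

reader = GGUFReader("gorilla-openfunctions-v2.fp16.gguf")

# tokenizer.ggml.pre is the metadata key added by llama.cpp PR #6920;
# GGUF files exported before that PR simply do not contain it.
field = reader.fields.get("tokenizer.ggml.pre")
if field is None:
    print("no pre-tokenizer recorded (file predates the fix)")
else:
    # For string-valued fields, the last part holds the raw UTF-8 bytes of the value.
    print("pre-tokenizer:", bytes(field.parts[-1]).decode("utf-8"))
```

The exact value depends on what the conversion script assigns for this model's tokenizer, so treat the printed string as informational rather than a pass/fail check.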
gorilla-openfunctions-v2.fp16.gguf (CHANGED)

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:83605e12a91965ee3deae19b9cbe96bf227baf4d3ac3d5d86acf505fb9ec1c41
+size 13825218848