Llama 3 bug found with GGUF losing data (part 1, part 2, part 3; GitHub ticket)
You forgot part 4:
https://www.reddit.com/r/LocalLLaMA/comments/1cn1398/part_4_theres_likely_no_llamacpp_gguf_tokenizer/
Does reconverting to GGUF with the latest llama.cpp solve this issue?
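For reference, reconversion usually means re-running llama.cpp's HF-to-GGUF converter so that the fixed tokenizer metadata gets written into the file, since that metadata is baked in at conversion time and isn't repaired by only updating the runtime. A minimal sketch of that workflow, assuming the usual llama.cpp repo layout (script and binary names have changed between versions, and the model path here is a placeholder):

```shell
# Get a current llama.cpp checkout with the tokenizer fixes
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Re-run the HF -> GGUF conversion so the corrected pre-tokenizer
# metadata is written into the new file
# ("path/to/Meta-Llama-3-8B" is a placeholder for your local HF checkpoint)
python convert-hf-to-gguf.py path/to/Meta-Llama-3-8B \
    --outfile llama-3-8b.f16.gguf

# Optionally re-quantize from the fresh f16 file rather than
# from an old, possibly-affected quantized GGUF
./quantize llama-3-8b.f16.gguf llama-3-8b.Q4_K_M.gguf Q4_K_M
```

This is only a sketch of the standard conversion path, not a confirmation that it resolves the specific bug in the thread.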