Convert to GGUF format for other languages

#1
by YanaS - opened

Hey, I'm having difficulty converting the model Photolens/llama-2-13b-langchain-chat for my language. I would really appreciate it if you could share how you did it for Chinese, so I can do the same for Bulgarian.

Sorry for the inconvenience, that is a typo in the model card, which I generate automatically. Thanks for being watchful. :)

So, does your version support all languages that are supported by the original Photolens model?

Also, you don't have the config.json and tokenizer-related files. Wouldn't that cause an issue when the model is used with LangChain?

Hello, we used an OpenAssistant-based, LangChain-formatted dataset of ours to finetune Llama 2. The quantization methods don't degrade the model's knowledge in any meaningful way, so this model is just a quantized (smaller, similar in quality) version of our model.

Also, I believe you can load the model with ctransformers and use that LLM in your LangChain code; a rough example is sketched below. You can see the documentation here: https://python.langchain.com/docs/integrations/providers/ctransformers
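
A minimal sketch of that approach, assuming the quantized GGUF file has already been downloaded locally; the file path, import path, and generation settings below are placeholders and may need adjusting for your LangChain/ctransformers versions:

```python
# Older LangChain versions import from langchain.llms instead of langchain_community.llms.
from langchain_community.llms import CTransformers

llm = CTransformers(
    model="path/to/llama-2-13b-langchain-chat.Q4_K_M.gguf",  # placeholder path to the quantized file
    model_type="llama",                                      # architecture hint for ctransformers
    config={"max_new_tokens": 256, "temperature": 0.7},      # example generation settings
)

# Call the LLM directly (newer LangChain versions also expose .invoke()).
print(llm("Explain what LangChain is in one sentence."))
```

The resulting `llm` object can then be passed into LangChain chains like any other LLM.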

Thank you. I will check it and see what I can do.

No problem, feel free to ask anything that comes to mind. :)
