
Integration with PrivateGPT

#5
by Kamnsv - opened

Can this model be integrated with the privateGPT project (https://github.com/imartinez/privateGPT)? By default it uses:

```yaml
llm:
  mode: local
  # Should be matching the selected model
  max_new_tokens: 512
  context_window: 3900
  tokenizer: mistralai/Mistral-7B-Instruct-v0.2

embedding:
  # Should be matching the value above in most cases
  mode: local
  ingest_mode: simple

local:
  prompt_style: "mistral"
  llm_hf_repo_id: TheBloke/Mistral-7B-Instruct-v0.2-GGUF
  llm_hf_model_file: mistral-7b-instruct-v0.2.Q4_K_M.gguf
  embedding_hf_model_name: BAAI/bge-small-en-v1.5
```

My settings are:

```yaml
tokenizer: TheBloke/Llama-2-13B-fp16
llm_hf_repo_id: IlyaGusev/saiga2_13b_gguf
llm_hf_model_file: model-q8_0.gguf
```
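
Putting those values into the structure of the default config above, a full override might look like the sketch below. This is an untested assumption: the `prompt_style` value is a guess (saiga2 is Llama-2-based, but it uses its own chat template, so `"llama2"` may not match it exactly), and the embedding model is simply carried over from the defaults.

```yaml
# Sketch of a settings override for saiga2 (untested assumption,
# following the default privateGPT config layout shown above).
llm:
  mode: local
  max_new_tokens: 512
  context_window: 3900
  # Tokenizer repo for the base model the GGUF was quantized from
  tokenizer: TheBloke/Llama-2-13B-fp16

embedding:
  mode: local
  ingest_mode: simple

local:
  # Guess: saiga2 has its own prompt format; "llama2" may need
  # replacing with a custom prompt style.
  prompt_style: "llama2"
  llm_hf_repo_id: IlyaGusev/saiga2_13b_gguf
  llm_hf_model_file: model-q8_0.gguf
  # Carried over from the defaults; an embedding model trained on
  # Russian text may retrieve better for Russian documents.
  embedding_hf_model_name: BAAI/bge-small-en-v1.5
```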
