# Model Card for qa-expert-7B-V1.0-GGUF

This repo contains the GGUF format model files for [khaimaitien/qa-expert-7B-V1.0](https://huggingface.co/khaimaitien/qa-expert-7B-V1.0).

You can find more information about how to **use/train** the model in this repo: https://github.com/khaimt/qa_expert

### Model Sources

- **Repository:** [https://github.com/khaimt/qa_expert](https://github.com/khaimt/qa_expert)

## How to Get Started with the Model

First, clone the repo: https://github.com/khaimt/qa_expert

Then install the requirements:

```shell
pip install -r requirements.txt
```

Then install [llama-cpp-python](https://github.com/abetlen/llama-cpp-python).

Here is an example:

```python
from qa_expert import get_inference_model, InferenceType

def retrieve(query: str) -> str:
    # You need to implement this retrieval function: it takes a query and returns
    # the retrieved context as a string. It plays the same role as the function
    # invoked in OpenAI function calling.
    return context

model_inference = get_inference_model(InferenceType.llama_cpp, "qa-expert-7B-V1.0.q4_0.gguf")
question = "..."  # your (possibly multi-hop) question
answer, messages = model_inference.generate_answer(question, retrieve)
```
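The `retrieve` function is left for you to implement. As a minimal illustration of the expected signature, here is a toy sketch that ranks a hardcoded in-memory corpus by word overlap with the query; the corpus and the scoring scheme are invented placeholders, standing in for a real retriever such as a vector store:

```python
# Toy retrieve() sketch: keyword-overlap search over a small hardcoded corpus.
# Only the signature (str -> str) matches what generate_answer expects; a real
# deployment would query a search index or vector database instead.

CORPUS = [
    "Paris is the capital of France.",
    "The Eiffel Tower is located in Paris.",
    "Mount Everest is the highest mountain on Earth.",
]

def retrieve(query: str) -> str:
    """Return the corpus passage sharing the most words with the query."""
    query_words = set(query.lower().split())

    def overlap(passage: str) -> int:
        # Count shared lowercase tokens between query and passage.
        return len(query_words & set(passage.lower().split()))

    return max(CORPUS, key=overlap)
```

For example, `retrieve("What is the capital of France?")` returns the first passage, since it shares the most tokens with the query.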