---
tags:
- llava
- bakllava
- lmm
- ggml
- llama.cpp
---

# ggml_bakllava-1

This repo contains GGUF files for running inference on [BakLLaVA-1](https://huggingface.co/SkunkworksAI/BakLLaVA-1) with [llama.cpp](https://github.com/ggerganov/llama.cpp) end-to-end, without any extra dependencies.

**Note**: The `mmproj-model-f16.gguf` file structure is experimental and may change. Always use the latest llama.cpp code.
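
As a reference, here is a minimal sketch of running the model with llama.cpp's `llava-cli` example binary. The main model filename `ggml-model-q4_k.gguf` and the image path are placeholders, not guaranteed names from this repo; substitute whichever quantized GGUF file you downloaded alongside `mmproj-model-f16.gguf`.

```sh
# Minimal sketch: run BakLLaVA-1 end-to-end with llama.cpp's llava-cli example.
# File names below are assumptions; use the GGUF files you actually downloaded.
./llava-cli \
    -m ggml-model-q4_k.gguf \
    --mmproj mmproj-model-f16.gguf \
    --image path/to/image.jpg \
    -p "Describe this image in detail." \
    --temp 0.1
```

Because the projector format is experimental, rebuild `llava-cli` from the current llama.cpp source before running it against these files.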