---
title: Openbuddy LLama Api
emoji: 🏆
colorFrom: red
colorTo: indigo
sdk: docker
pinned: true
---

This API is built using Gradio with a request queue for OpenBuddy's models. It also includes a Quart and uvicorn setup!

For example, I used https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16
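
A minimal sketch of what a Gradio app with a queue serving a GGUF model might look like. The model path, prompt format, and `llama-cpp-python` usage here are illustrative assumptions, not this Space's actual code:

```python
# Hypothetical sketch: serve a llama.cpp (GGUF) model through Gradio with a
# request queue. Assumes gradio and llama-cpp-python are installed; the model
# path and chat template are placeholders.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in a simple chat template (illustrative format)."""
    return f"User: {user_message}\nAssistant:"

def main():
    # Imports are deferred so the prompt helper stays usable without gradio.
    import gradio as gr
    from llama_cpp import Llama

    llm = Llama(model_path="model.gguf")  # placeholder path to a GGUF file

    def respond(message: str) -> str:
        out = llm(build_prompt(message), max_tokens=256)
        return out["choices"][0]["text"]

    demo = gr.Interface(fn=respond, inputs="text", outputs="text")
    demo.queue()   # serialize requests so the model is not called concurrently
    demo.launch()

if __name__ == "__main__":
    main()
```

The queue matters because a llama.cpp model instance handles one generation at a time; queuing incoming requests avoids concurrent calls into the same model.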