# babyllama-v0.6-gguf

The babyllama-v0.6-gguf LLM is quantized to 4-bit using llama.cpp.

Original model: kevin009/babyllama-v0.6
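
To try the quantized model locally, a minimal sketch using llama-cpp-python is shown below. The `.gguf` filename and the prompt are placeholders, not the exact file name in this repo; substitute the actual GGUF file you download.

```python
# Minimal sketch: load a 4-bit GGUF model with llama-cpp-python and run a completion.
# The model_path below is an assumed/placeholder filename.
from llama_cpp import Llama

llm = Llama(
    model_path="babyllama-v0.6.Q4_K_M.gguf",  # hypothetical filename; use the file from this repo
    n_ctx=2048,                               # context window size
)

# Simple text completion
output = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    stop=["Q:", "\n"],
)
print(output["choices"][0]["text"])
```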

This repo also contains an APK for chatNONET, an offline chatbot app. You can ask it any question. We used babyllama as the language model to develop this offline chatbot app.
