GGUF quantizations of tagalog-seallm-7b-v1 (primarily tested and run with KoboldCpp).
Quantizations: 4-bit, 5-bit, 8-bit
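As a minimal usage sketch (not part of the original card), the GGUF files can also be loaded with llama-cpp-python in addition to KoboldCpp; the model file name below is an assumption and should be replaced with the actual quant file downloaded from this repo.

```python
# Minimal sketch: loading a GGUF quant of tagalog-seallm-7b-v1 with llama-cpp-python.
# The file name is hypothetical -- substitute the 4-/5-/8-bit GGUF you actually downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="tagalog-seallm-7b-v1.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,        # context window; adjust to available RAM/VRAM
    n_gpu_layers=-1,   # offload all layers to GPU if available; use 0 for CPU-only
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Kumusta ka?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```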
Model tree for 922-Narra/tagalog-seallm-7b-v1-gguf
Base model: SeaLLMs/SeaLLM-7B-v2