---
license: apache-2.0
---
# SLIM-INTENT-TOOL
**slim-intent-tool** is a Q4_K_M quantized GGUF version of slim-intent, providing a small, fast inference implementation optimized for multi-model concurrent deployment.
[**slim-intent**](https://huggingface.co/llmware/slim-intent) is part of the SLIM ("**S**tructured **L**anguage **I**nstruction **M**odel") series, providing a set of small, specialized decoder-based LLMs, fine-tuned for function-calling.
To pull the model via API:
```python
from huggingface_hub import snapshot_download

snapshot_download("llmware/slim-intent-tool", local_dir="/path/on/your/machine/", local_dir_use_symlinks=False)
```
Load in your favorite GGUF inference engine, or try with llmware as follows:
```python
from llmware.models import ModelCatalog

# to load the model and make a basic inference
model = ModelCatalog().load_model("slim-intent-tool")

# any short text passage to classify -- replace with your own
text_sample = "I am really disappointed with the service and would like to cancel my account."
response = model.function_call(text_sample)

# this one line will download the model and run a series of tests
ModelCatalog().tool_test_run("slim-intent-tool", verbose=True)
```
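If you prefer a different GGUF runtime, the downloaded .gguf file can be loaded directly in any llama.cpp-compatible engine. Below is a minimal sketch using llama-cpp-python; the filename and prompt shown are assumptions, so check the repository file listing and config.json for the actual .gguf name and prompt template:

```python
from llama_cpp import Llama

# the filename below is an assumption -- check the repository file listing for the actual .gguf name
llm = Llama(model_path="/path/on/your/machine/slim-intent-tool.gguf", n_ctx=2048)

# wrap your text in the prompt template documented in config.json;
# the plain string below is only a placeholder
prompt = "The customer is very unhappy with the delivery and wants a refund."
output = llm(prompt, max_tokens=100)
print(output["choices"][0]["text"])
```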
Slim models can also be orchestrated as part of multi-model, multi-step LLMfx calls:
```python
from llmware.agents import LLMfx

llm_fx = LLMfx()
llm_fx.load_tool("intent")

# any text passage to analyze -- replace with your own
text = "I am very unhappy with the product and want a refund."
response = llm_fx.intent(text)
```
Note: please review [**config.json**](https://huggingface.co/llmware/slim-intent-tool/blob/main/config.json) in the repository for prompt wrapping information, details on the model, and the full test set.
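If helpful, that file can also be pulled and inspected programmatically; a minimal sketch with huggingface_hub (the keys printed are simply whatever config.json actually contains):

```python
import json
from huggingface_hub import hf_hub_download

# download config.json from the model repo and print its contents
config_path = hf_hub_download("llmware/slim-intent-tool", filename="config.json")
with open(config_path, "r") as f:
    config = json.load(f)

for key, value in config.items():
    print(key, ":", value)
```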
## Model Card Contact
Darren Oberst & llmware team
[Any questions? Join us on Discord](https://discord.gg/MhZn5Nc39h)