doberst committed on
Commit 4a8aec8
1 Parent(s): 453e81c

Upload README.md

Files changed (1)
  1. README.md +22 -36
README.md CHANGED
@@ -6,57 +6,43 @@ license: apache-2.0
 
  <!-- Provide a quick summary of what the model is/does. -->
 
- **slim-sentiment-tool** is part of the SLIM ("Structured Language Instruction Model") model series, providing a set of small, specialized decoder-based LLMs, fine-tuned for function-calling.
 
- slim-sentiment-tool is a 4_K_M quantized GGUF version of slim-sentiment-tool, providing a fast, small inference implementation.
-
- Load in your favorite GGUF inference engine, or try with llmware as follows:
-
-     from llmware.models import ModelCatalog
-
-     sentiment_tool = ModelCatalog().load_model("llmware/slim-sentiment-tool")
-     response = sentiment_tool.function_call(text_sample, params=["sentiment"], function="classify")
-
- Slim models can also be loaded even more simply as part of LLMfx calls:
-
-     from llmware.agents import LLMfx
-
-     llm_fx = LLMfx()
-     llm_fx.load_tool("sentiment")
-     response = llm_fx.sentiment(text)
 
 
- ### Model Description
 
- <!-- Provide a longer summary of what this model is. -->
 
- - **Developed by:** llmware
- - **Model type:** GGUF
- - **Language(s) (NLP):** English
- - **License:** Apache 2.0
- - **Quantized from model:** llmware/slim-sentiment (finetuned tiny llama)
-
- ## Uses
 
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
 
- The intended use of SLIM models is to re-imagine traditional 'hard-coded' classifiers through the use of function calls.
 
- Example:
 
-     text = "The stock market declined yesterday as investors worried increasingly about the slowing economy."
 
-     model generation - {"sentiment": ["negative"]}
 
-     keys = "sentiment"
 
- All of the SLIM models use a novel prompt instruction structured as follows:
 
-     "<human> " + text + "<classify> " + keys + "</classify>" + "/n<bot>: "
 
  ## Model Card Contact
 
- Darren Oberst & llmware team
-
 
 
  <!-- Provide a quick summary of what the model is/does. -->
 
+ **slim-sentiment-tool** is a 4_K_M quantized GGUF version of slim-sentiment, providing a small, fast inference implementation, optimized for multi-model concurrent deployment.
 
+ [**slim-sentiment**](https://huggingface.co/llmware/slim-sentiment) is part of the SLIM ("**S**tructured **L**anguage **I**nstruction **M**odel") series, providing a set of small, specialized decoder-based LLMs, fine-tuned for function-calling.
 
+ To pull the model via API:
 
+     from huggingface_hub import snapshot_download
+     snapshot_download("llmware/slim-sentiment-tool", local_dir="/path/on/your/machine/", local_dir_use_symlinks=False)
+
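
Once downloaded, the GGUF file can be run in a general-purpose GGUF engine. A minimal sketch with llama-cpp-python follows; the .gguf filename and the prompt wrapper are assumptions (the wrapper is taken from the removed README text above), so confirm both against config.json in the repository:

    # sketch only: run the downloaded GGUF with llama-cpp-python (pip install llama-cpp-python)
    # the filename "slim-sentiment-tool.gguf" and the prompt wrapper below are assumptions
    from llama_cpp import Llama

    llm = Llama(model_path="/path/on/your/machine/slim-sentiment-tool.gguf", n_ctx=2048)
    text = "The stock market declined yesterday as investors worried increasingly about the slowing economy."
    prompt = "<human> " + text + "<classify> sentiment</classify>" + "\n<bot>: "
    output = llm(prompt, max_tokens=100)
    print(output["choices"][0]["text"])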
 
+ Load in your favorite GGUF inference engine, or try with llmware as follows:
 
+     from llmware.models import ModelCatalog
+
+     # to load the model and make a basic inference
+     model = ModelCatalog().load_model("slim-sentiment-tool")
+     response = model.function_call(text_sample)
+
+     # this one line will download the model and run a series of tests
+     ModelCatalog().tool_test_run("slim-sentiment-tool", verbose=True)
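
The shape of `response` is not documented here; as a sketch, assuming it is a dict whose "llm_response" field carries the extracted values in the {"sentiment": [...]} form shown in the removed README text (both the key name and the value shape are assumptions), the output could be consumed like this:

    # sketch only: consuming the classify output
    # "llm_response" and the {"sentiment": [...]} value shape are assumptions
    text_sample = "The stock market declined yesterday as investors worried increasingly about the slowing economy."
    response = model.function_call(text_sample)

    sentiment_values = response.get("llm_response", {}).get("sentiment", [])
    if "negative" in sentiment_values:
        print("negative sentiment detected:", sentiment_values)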
 
+ Slim models can also be loaded even more simply as part of multi-model, multi-step LLMfx calls:
 
+     from llmware.agents import LLMfx
+
+     llm_fx = LLMfx()
+     llm_fx.load_tool("sentiment")
+     response = llm_fx.sentiment(text)
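
Since the loaded tool is exposed as a method on the agent (as in `llm_fx.sentiment(text)` above), a small sketch of batching several samples through the same tool follows; the sample strings are illustrative only:

    # sketch only: re-using the loaded "sentiment" tool across multiple text samples
    samples = [
        "The stock market declined yesterday as investors worried increasingly about the slowing economy.",
        "Shares rallied after the company raised its full-year revenue guidance.",  # illustrative sample
    ]
    for sample in samples:
        response = llm_fx.sentiment(sample)
        print(response)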
 
+ Note: please review [**config.json**](https://huggingface.co/llmware/slim-sentiment-tool/blob/main/config.json) in the repository for prompt wrapping information, details on the model, and full test set.
 
  ## Model Card Contact
 
+ Darren Oberst & llmware team
 
+ [Any questions? Join us on Discord](https://discord.gg/MhZn5Nc39h)