doberst committed df3337e (1 parent: 4d4e110)

Update README.md

Files changed (1): README.md
<!-- Provide a quick summary of what the model is/does. -->

**slim-emotions** is part of the SLIM ("**S**tructured **L**anguage **I**nstruction **M**odel") model series, consisting of small, specialized decoder-based models fine-tuned for function calling.

slim-emotions has been fine-tuned for **emotion analysis** function calls, generating output consisting of a Python dictionary corresponding to specified keys, e.g.:

&nbsp;&nbsp;&nbsp;&nbsp;`{"emotion": ["proud"]}`

SLIM models are designed to provide a flexible natural language generative model that can be used as part of a multi-step, multi-model LLM-based automation workflow.

Each slim model has a 'quantized tool' version, e.g., [**'slim-emotions-tool'**](https://huggingface.co/llmware/slim-emotions-tool).

## Prompt format:

`function = "classify"`
`params = "emotions"`
`prompt = "<human> " + {text} + "\n" + `
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp;`"<{function}> " + {params} + "</{function}>" + "\n<bot>:"`
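The prompt assembly above can be sketched directly in Python; the sample text here is illustrative, not from the model card:

```python
# Assemble the SLIM prompt exactly as described in the format above.
function = "classify"
params = "emotions"
text = "I am so proud of the team for shipping this release."  # illustrative sample

prompt = ("<human> " + text + "\n"
          + "<" + function + "> " + params + "</" + function + ">"
          + "\n<bot>:")
print(prompt)
```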
 
 
<details>
<summary>Transformers Script</summary>

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained("llmware/slim-emotions")
    tokenizer = AutoTokenizer.from_pretrained("llmware/slim-emotions")

    function = "classify"
    params = "emotions"

    text = "The stock market declined yesterday as investors worried increasingly about the slowing economy."
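The script above is truncated before the generation step, but whatever generation path is used, the card says the model's raw output is the string form of a Python dictionary. A minimal sketch of recovering it safely with `ast.literal_eval` (the raw string below is a hypothetical example shaped like the card's sample output):

```python
import ast

# Hypothetical raw model output, shaped like the sample in this card.
llm_response = '{"emotion": ["proud"]}'

# literal_eval parses the dictionary literal without the risks of eval().
response_dict = ast.literal_eval(llm_response)
print(response_dict["emotion"][0])  # -> proud
```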
 
 
<summary>Using as Function Call in LLMWare</summary>

    from llmware.models import ModelCatalog

    slim_model = ModelCatalog().load_model("llmware/slim-emotions")
    response = slim_model.function_call(text, params=["emotions"], function="classify")

    print("llmware - llm_response: ", response)
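To illustrate the multi-step automation workflow the SLIM series targets, here is a hypothetical downstream step that routes on the returned label; the function name, queue names, and label list are illustrative assumptions, not part of the llmware API:

```python
# Hypothetical workflow step: choose a processing queue from the emotion
# label in the model's dictionary output. Illustrative only - not llmware API.
def route_by_emotion(response_dict, negative=("angry", "sad", "afraid")):
    labels = response_dict.get("emotion", [])
    if not labels:
        return "review"     # nothing extracted: flag for human review
    if labels[0] in negative:
        return "escalate"   # negative emotion: escalate
    return "archive"        # otherwise file away

print(route_by_emotion({"emotion": ["proud"]}))  # -> archive
print(route_by_emotion({"emotion": ["angry"]}))  # -> escalate
```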