Abhaykoul committed
Commit eb5835a
1 Parent(s): 22069ae

Update README.md

Files changed (1):
  1. README.md +23 -15
README.md CHANGED
@@ -1,23 +1,31 @@
 ---
 library_name: transformers
-tags: []
 widget:
-- example_title: EMO 1
-  messages:
-  - role: system
-    content: You are a helpful and emotional assistant that will always respond in EMO style.
-  - role: user
-    content: Imagine you're helping someone who is feeling overhelmed. How do you feel in this situation?
-- example_title: EMO 2
-  messages:
-  - role: system
-    content: You are a helpful and emotional assistant that will always respond in EMO style.
-  - role: user
-    content: My best friend recently lost their parent to cancer after a long battle. They are understandably devastated and struggling with grief.
+- example_title: EMO 1
+  messages:
+  - role: system
+    content: >-
+      You are a helpful and emotional assistant that will always respond in EMO
+      style.
+  - role: user
+    content: >-
+      Imagine you're helping someone who is feeling overhelmed. How do you feel
+      in this situation?
+- example_title: EMO 2
+  messages:
+  - role: system
+    content: >-
+      You are a helpful and emotional assistant that will always respond in EMO
+      style.
+  - role: user
+    content: >-
+      My best friend recently lost their parent to cancer after a long battle.
+      They are understandably devastated and struggling with grief.
 inference:
   parameters:
     max_new_tokens: 1024
-    do_sample: True
+    do_sample: true
+license: mit
 ---
 
 # Model card comming soon
@@ -54,4 +62,4 @@ generated_ids = [
 
 response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
 print(response)
-```
+```
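
The first hunk only changes the model card's YAML front matter: the `widget` examples define the prompts shown in the Hub's hosted inference widget, `inference.parameters` sets the widget's generation defaults (`max_new_tokens: 1024`, `do_sample: true`), and `license: mit` adds license metadata. The second hunk touches only the tail of the README's usage snippet: the `generated_ids = [` context, the `batch_decode` call, and the final `print(response)`.

For orientation, a minimal transformers chat-generation flow that ends in exactly those lines might look like the sketch below. Only the last few lines come from this commit; the model repo id is a placeholder, and everything before `generated_ids` is assumed from the usual `apply_chat_template` pattern, not taken from this repository's README.

```python
# Hedged reconstruction: only the generated_ids slicing, batch_decode, and
# print lines are visible in this commit's diff. The repo id is a placeholder
# and the rest is the generic transformers chat-template pattern.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Abhaykoul/<model-repo>"  # placeholder: substitute the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prompt mirrors the widget example added in this commit's front matter.
messages = [
    {"role": "system", "content": "You are a helpful and emotional assistant that will always respond in EMO style."},
    {"role": "user", "content": "Imagine you're helping someone who is feeling overwhelmed. How do you feel in this situation?"},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
model_inputs = tokenizer([prompt], return_tensors="pt").to(model.device)

# Generation settings mirror the front matter: max_new_tokens: 1024, do_sample: true.
output_ids = model.generate(model_inputs.input_ids, max_new_tokens=1024, do_sample=True)
generated_ids = [
    out[len(inp):] for inp, out in zip(model_inputs.input_ids, output_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
```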