MicahB committed
Commit f6f4f89
1 Parent(s): c81c37a

Update README.md

Files changed (1)
  1. README.md +12 -8
README.md CHANGED
@@ -12,8 +12,11 @@ datasets:
  license: mit
  widget:
  - text: I am not having a great day.
+ library_name: transformers.js
  ---

+ # This is a Transformers.js clone of [SamLowe/roberta-base-go_emotions](https://huggingface.co/SamLowe/roberta-base-go_emotions) !
+
  #### Overview

  Model trained from [roberta-base](https://huggingface.co/roberta-base) on the [go_emotions](https://huggingface.co/datasets/go_emotions) dataset for multi-label classification.
@@ -34,16 +37,17 @@ The model was trained using `AutoModelForSequenceClassification.from_pretrained`

  There are multiple ways to use this model in Huggingface Transformers. Possibly the simplest is using a pipeline:

- ```python
- from transformers import pipeline
-
- classifier = pipeline(task="text-classification", model="SamLowe/roberta-base-go_emotions", top_k=None)
+ ```js
+ const { pipeline } = await import('@xenova/transformers');

- sentences = ["I am not having a great day"]
+ // Allocate pipeline
+ const pipe = await pipeline('text-classification', "MicahB/roberta-base-go_emotions");
+ console.log(await pipe("I love transformers!"));
+ ```

- model_outputs = classifier(sentences)
- print(model_outputs[0])
- # produces a list of dicts for each of the labels
+ ```js
+ Output:
+ [ { label: 'love', score: 0.9529242515563965 } ]
  ```

  #### Evaluation / metrics
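
The new JS snippet prints only the single best label, while the Python code it replaces requested scores for every label (`top_k=None`). Below is a minimal sketch of the multi-label equivalent in Transformers.js; it assumes the v2 `@xenova/transformers` package and its `topk` pipeline option (renamed `top_k` in later releases), so check the option name and output shape against the installed version.

```js
// Minimal sketch, not part of the commit: request scores for every label.
// Assumption: @xenova/transformers v2, whose text-classification pipeline
// accepts a `topk` option (later releases use `top_k` instead).
import { pipeline } from '@xenova/transformers';

const pipe = await pipeline('text-classification', 'MicahB/roberta-base-go_emotions');

// go_emotions defines 28 labels (27 emotions + neutral), so asking for the
// top 28 returns a score for each label rather than only the best one.
const results = await pipe('I am not having a great day', { topk: 28 });
console.log(results);
```

Run with Node as an ES module (e.g. a `.mjs` file) after `npm i @xenova/transformers`; top-level `await` is only valid in module scope.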