---
license: bigscience-bloom-rail-1.0
datasets:
- bigscience/xP3
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
widget:
- text: "A is the son of B's uncle. What is the family relationship between A and B?"
- text: "Reorder the words in this sentence: justin and name bieber years is my am I 27 old."
- text: "Task: copy but say the opposite.\n
    PSG won its match against Barca."
- text: "Is this review positive or negative? Review: Best cast iron skillet you will ever buy."
  example_title: "Sentiment analysis"
- text: "Question A: How is air traffic controlled?
    \nQuestion B: How do you become an air traffic controller?\nPick one: these questions are duplicates or not duplicates."
- text: "Barack Obama nominated Hillary Clinton as his secretary of state on Monday. He chose her because she had foreign affairs experience as a former First Lady.
    \nIn the previous sentence, decide who 'her' is referring to."
  example_title: "Coreference resolution"
- text: "Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app.\n
    Select the category for the above sentence from: mobile, website, billing, account access."
- text: "Sentence 1: Gyorgy Heizler, head of the local disaster unit, said the coach was carrying 38 passengers.\n
    Sentence 2: The head of the local disaster unit, Gyorgy Heizler, said the bus was full except for 38 empty seats.\n\n
    Do sentences 1 and 2 have the same meaning?"
  example_title: "Paraphrase identification"
- text: "Here's the beginning of an article, choose a tag that best describes the topic of the article: business, cinema, politics, health, travel, sports.\n\n
    The best and worst of 007 as 'No time to die' marks Daniel Craig's exit.\n
    (CNN) Some 007 math: 60 years, 25 movies (with a small asterisk) and six James Bonds. For a Cold War creation, Ian Fleming's suave spy has certainly gotten around, but despite different guises in the tuxedo and occasional scuba gear, when it comes to Bond ratings, there really shouldn't be much argument about who wore it best."
- text: "Max: Know any good websites to buy clothes from?\n
    Payton: Sure :) LINK 1, LINK 2, LINK 3\n
    Max: That's a lot of them!\n
    Payton: Yeah, but they have different things so I usually buy things from 2 or 3 of them.\n
    Max: I'll check them out. Thanks.\n\n
    Who or what are Payton and Max referring to when they say 'them'?"
- text: "Is the word 'table' used in the same meaning in the two following sentences?\n\n
    Sentence A: you can leave the books on the table over there.\n
    Sentence B: the tables in this book are very hard to read."
- text: "On a shelf, there are five books: a gray book, a red book, a purple book, a blue book, and a black book.\n
    The red book is to the right of the gray book. The black book is to the left of the blue book. The blue book is to the left of the gray book. The purple book is the second from the right.\n\n
    Which book is the leftmost book?"
  example_title: "Logic puzzles"
- text: "The two men running to become New York City's next mayor will face off in their first debate Wednesday night.\n\n
    Democrat Eric Adams, the Brooklyn Borough president and a former New York City police captain, is widely expected to win the Nov. 2 election against Republican Curtis Sliwa, the founder of the 1970s-era Guardian Angels anti-crime patrol.\n\n
    Who are the men running for mayor?"
  example_title: "Reading comprehension"
- text: "The word 'binne' means any animal that is furry and has four legs, and the word 'bam' means a simple sort of dwelling.\n\n
    Which of the following best characterizes binne bams?\n
    - Sentence 1: Binne bams are for pets.\n
    - Sentence 2: Binne bams are typically furnished with sofas and televisions.\n
    - Sentence 3: Binne bams are luxurious apartments.\n
    - Sentence 4: Binne bams are places where people live."
---

**Repository**: [bigscience-workshop/bloomz](https://github.com/bigscience-workshop/bloomz)

# Models

Multilingual models capable of following user instructions in a variety of languages. Together with our paper [TODO: LINK], we release the following models:

----

- [bloomz](https://huggingface.co/bigscience/bloomz): 176B parameter multitask finetuned version of [bloom](https://huggingface.co/bigscience/bloom) on [xP3](https://huggingface.co/bigscience/xP3)
- [bloomz-7b1](https://huggingface.co/bigscience/bloomz-7b1): 7.1B parameter multitask finetuned version of [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) on [xP3](https://huggingface.co/bigscience/xP3)
- [bloomz-3b](https://huggingface.co/bigscience/bloomz-3b): 3B parameter multitask finetuned version of [bloom-3b](https://huggingface.co/bigscience/bloom-3b) on [xP3](https://huggingface.co/bigscience/xP3)
- [bloomz-1b7](https://huggingface.co/bigscience/bloomz-1b7): 1.7B parameter multitask finetuned version of [bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) on [xP3](https://huggingface.co/bigscience/xP3)
- [bloomz-1b1](https://huggingface.co/bigscience/bloomz-1b1): 1.1B parameter multitask finetuned version of [bloom-1b1](https://huggingface.co/bigscience/bloom-1b1) on [xP3](https://huggingface.co/bigscience/xP3)
- [bloomz-560m](https://huggingface.co/bigscience/bloomz-560m): 560M parameter multitask finetuned version of [bloom-560m](https://huggingface.co/bigscience/bloom-560m) on [xP3](https://huggingface.co/bigscience/xP3)

----

- [bloomz-mt](https://huggingface.co/bigscience/bloomz-mt): 176B parameter multitask finetuned version of [bloom](https://huggingface.co/bigscience/bloom) on [xP3](https://huggingface.co/bigscience/xP3) & [xP3mt](https://huggingface.co/bigscience/xP3mt). **Better than [bloomz](https://huggingface.co/bigscience/bloomz) when prompting in non-English**
- [bloomz-7b1-mt](https://huggingface.co/bigscience/bloomz-7b1-mt): 7.1B parameter multitask finetuned version of [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) on [xP3](https://huggingface.co/bigscience/xP3) & [xP3mt](https://huggingface.co/bigscience/xP3mt). **Better than [bloomz-7b1](https://huggingface.co/bigscience/bloomz-7b1) when prompting in non-English**

----

- [bloomz-p3](https://huggingface.co/bigscience/bloomz-p3): 176B parameter multitask finetuned version of [bloom](https://huggingface.co/bigscience/bloom) on [P3](https://huggingface.co/bigscience/P3). **Released for research purposes; performance is inferior to [bloomz](https://huggingface.co/bigscience/bloomz)**
- [bloomz-7b1-p3](https://huggingface.co/bigscience/bloomz-7b1-p3): 7.1B parameter multitask finetuned version of [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) on [P3](https://huggingface.co/bigscience/P3). **Released for research purposes; performance is inferior to [bloomz-7b1](https://huggingface.co/bigscience/bloomz-7b1)**

----

# Intended uses

You can use the models to perform inference on tasks by specifying your query in natural language, and the models will generate a prediction. For instance, you can ask *"Translate this to Chinese: Je t'aime."*, and the model will hopefully generate *"我爱你"*.

# How to use

Here is how to use the model in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloomz-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloomz-560m")

inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt")
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```

To use another checkpoint, replace the path in `AutoTokenizer` and `AutoModelForCausalLM`.

**Note: The 176B models were trained with bfloat16, while the smaller models were trained with fp16. We recommend using the same precision type, or fp32, at inference.**
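
As a minimal sketch of that recommendation (the checkpoint-to-dtype mapping and the `training_dtype` helper below are our own, not part of `transformers`):

```python
# Sketch: pick a dtype name matching the precision each checkpoint was
# trained in. The 176B checkpoints were trained in bfloat16; the smaller
# ones in fp16. fp32 is always a safe fallback.
BF16_CHECKPOINTS = {
    "bigscience/bloomz",
    "bigscience/bloomz-mt",
    "bigscience/bloomz-p3",
}

def training_dtype(checkpoint: str) -> str:
    """Return the dtype name matching how the checkpoint was trained."""
    return "bfloat16" if checkpoint in BF16_CHECKPOINTS else "float16"

print(training_dtype("bigscience/bloomz-560m"))  # float16
```

The result can then be passed when loading, e.g. `AutoModelForCausalLM.from_pretrained(path, torch_dtype=getattr(torch, training_dtype(path)))`.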

# Limitations

- The large model sizes may require substantial computational resources
- Performance can vary significantly depending on the prompt

# BibTeX entry and citation info

```bibtex
TODO
```