nazneen committed on
Commit a33f56e
1 Parent(s): 126436e

model documentation

Files changed (1): README.md (+167 -0)

README.md CHANGED
---
license: other
tags:
- text-generation
---

# Model Card for bt-opt-350m

# Model Details

## Model Description

More information needed

- **Developed by:** Opentensor
- **Shared by [optional]:** Opentensor
- **Model type:** Text Generation
- **Language(s) (NLP):** More information needed
- **License:** Other
- **Parent Model:** OPT
- **Resources for more information:** More information needed

# Uses

## Direct Use

This model can be used for the task of text generation.
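As a quick illustration of that use (a minimal sketch added here, not part of the original card; the prompt and decoding settings are arbitrary examples), the checkpoint can be loaded into a `transformers` text-generation pipeline:

```python
from transformers import pipeline

# Load the checkpoint this card documents into a text-generation pipeline.
generator = pipeline("text-generation", model="opentensor/bt-opt-350m")

# Illustrative prompt and settings only; tune max_new_tokens/sampling for your use case.
outputs = generator("The city of Paris is", max_new_tokens=30, do_sample=True)
print(outputs[0]["generated_text"])
```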
## Downstream Use [optional]

More information needed.

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

# Training Details

## Training Data

More information needed

## Training Procedure

### Preprocessing

More information needed

### Speeds, Sizes, Times

More information needed

# Evaluation

## Testing Data, Factors & Metrics

### Testing Data

More information needed

### Factors

More information needed

### Metrics

More information needed

## Results

More information needed

# Model Examination

More information needed

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
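The calculator above is a web form; if a programmatic estimate is preferred, an emissions-tracking library such as [codecarbon](https://github.com/mlco2/codecarbon) can wrap an inference or training run instead. The snippet below is a hedged sketch: codecarbon is not mentioned in the original card, and the tracked run is an arbitrary example.

```python
from codecarbon import EmissionsTracker
from transformers import pipeline

# Track estimated emissions for an illustrative inference run.
# Results depend on your hardware, region, and runtime.
tracker = EmissionsTracker()
tracker.start()

generator = pipeline("text-generation", model="opentensor/bt-opt-350m")
generator("Hello, world", max_new_tokens=20)

emissions_kg = tracker.stop()  # estimated kg CO2-equivalent for the tracked block
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```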
# Technical Specifications [optional]

## Model Architecture and Objective

More information needed

## Compute Infrastructure

More information needed

### Hardware

More information needed

### Software

More information needed.

# Citation

**BibTeX:**

More information needed.

# Glossary [optional]

More information needed

# More Information [optional]

More information needed

# Model Card Authors [optional]

Opentensor, in collaboration with Ezi Ozoani and the Hugging Face team

# Model Card Contact

More information needed

# How to Get Started with the Model

Use the code below to get started with the model.

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and the causal language model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("opentensor/bt-opt-350m")
model = AutoModelForCausalLM.from_pretrained("opentensor/bt-opt-350m")
```
</details>
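Continuing from the `tokenizer` and `model` loaded above, a short generation call might look like the following (an illustrative sketch; the prompt and decoding settings are assumptions, not recommendations from the original card):

```python
# Illustrative generation step using the objects loaded in the snippet above.
inputs = tokenizer("The meaning of life is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```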