rbhar90 nazneen committed on
Commit b65d0a6
1 Parent(s): 53d08e2

model documentation (#1)


- model documentation (33058ed634ba41f35925ba7eaa13bae37f5b501a)


Co-authored-by: Nazneen Rajani <nazneen@users.noreply.huggingface.co>

Files changed (1): README.md (+176, -0)

README.md (new file):

---
tags:
- roberta
---

# Model Card for ChemBERTa-10M-MTR

# Model Details

## Model Description

ChemBERTa-10M-MTR is a RoBERTa-based transformer model from DeepChem for molecular property prediction, pretrained on SMILES string representations of molecules with a multi-task regression (MTR) objective.

- **Developed by:** DeepChem
- **Shared by [Optional]:** DeepChem
- **Model type:** Token Classification
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Parent Model:** [RoBERTa](https://huggingface.co/roberta-base)
- **Resources for more information:** More information needed

# Uses

## Direct Use

More information needed.

## Downstream Use [Optional]

More information needed.

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.

# Training Details

## Training Data

More information needed

## Training Procedure

### Preprocessing

More information needed

### Speeds, Sizes, Times

More information needed

# Evaluation

## Testing Data, Factors & Metrics

### Testing Data

More information needed

### Factors

More information needed

### Metrics

More information needed

## Results

More information needed

# Model Examination

More information needed

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Technical Specifications [optional]

## Model Architecture and Objective

More information needed

## Compute Infrastructure

More information needed

### Hardware

More information needed

### Software

More information needed.

# Citation

**BibTeX:**

```bibtex
@book{Ramsundar-et-al-2019,
  title={Deep Learning for the Life Sciences},
  author={Bharath Ramsundar and Peter Eastman and Patrick Walters and Vijay Pande and Karl Leswing and Zhenqin Wu},
  publisher={O'Reilly Media},
  note={\url{https://www.amazon.com/Deep-Learning-Life-Sciences-Microscopy/dp/1492039837}},
  year={2019}
}
```

**APA:**

Ramsundar, B., Eastman, P., Walters, P., Pande, V., Leswing, K., & Wu, Z. (2019). *Deep Learning for the Life Sciences*. O'Reilly Media.

# Glossary [optional]

More information needed

# More Information [optional]

More information needed

# Model Card Authors [optional]

DeepChem in collaboration with Ezi Ozoani and the Hugging Face team

# Model Card Contact

More information needed

# How to Get Started with the Model

Use the code below to get started with the model.

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoTokenizer, AutoModel

# RobertaForRegression is not available in the transformers library, so this
# example loads the tokenizer and the pretrained RoBERTa encoder directly;
# any task-specific regression head must be added separately.
tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-10M-MTR")
model = AutoModel.from_pretrained("DeepChem/ChemBERTa-10M-MTR")
```
</details>
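
As a minimal sketch of how the loaded encoder might be used (the example SMILES string and the mean-pooling step below are illustrative assumptions, not part of the original card), a molecule can be embedded as follows:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-10M-MTR")
model = AutoModel.from_pretrained("DeepChem/ChemBERTa-10M-MTR")
model.eval()

# Tokenize a single SMILES string (aspirin, chosen only for illustration).
smiles = "CC(=O)Oc1ccccc1C(=O)O"
inputs = tokenizer(smiles, return_tensors="pt")

# Run a forward pass without tracking gradients and mean-pool the token
# embeddings into one fixed-size vector for the molecule.
with torch.no_grad():
    outputs = model(**inputs)
embedding = outputs.last_hidden_state.mean(dim=1)

print(embedding.shape)  # (1, hidden_size)
```

Embeddings produced this way can serve as molecule-level features for downstream property-prediction models.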