nazneen committed
Commit
7b3045e
1 Parent(s): cbf18a1

model documentation

Files changed (1)
  1. README.md +165 -1
README.md CHANGED
@@ -1 +1,165 @@
- T5+Trie to predict entities of the last sentence in a dialogue.
---
tags:
- text2text-generation
- t5

---

# Model Card for EntityT5

# Model Details

## Model Description

T5+Trie to predict entities of the last sentence in a dialogue.

- **Developed by:** Jaren Yang
- **Shared by [Optional]:** Jaren Yang
- **Model type:** Text2text Generation
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Parent Model:** T5
- **Resources for more information:** More information needed

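
The card does not document how the trie interacts with T5 at inference time. The sketch below is only an illustration of one common way such a constraint can be applied with the `transformers` generation API: candidate entities are tokenized into a trie of token-id prefixes, and decoding is restricted to that trie through `prefix_allowed_tokens_fn`. The entity list, dialogue string, and helper function here are hypothetical and are not taken from this repository.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Jaren/EntityT5")
model = AutoModelForSeq2SeqLM.from_pretrained("Jaren/EntityT5")

# Hypothetical candidate entities; a real setup would use the task's entity vocabulary.
entities = ["Paris", "Eiffel Tower", "Louvre Museum"]

# Build a trie over the token-id sequences of the candidate entities,
# terminating each path with the end-of-sequence token.
trie = {}
for entity in entities:
    node = trie
    ids = tokenizer(entity, add_special_tokens=False).input_ids + [tokenizer.eos_token_id]
    for token_id in ids:
        node = node.setdefault(token_id, {})

def allowed_tokens(batch_id, generated_ids):
    # Walk the trie along the tokens decoded so far (skipping the decoder start token)
    # and allow only continuations that keep the output inside the trie.
    node = trie
    for token_id in generated_ids.tolist()[1:]:
        node = node.get(token_id, {})
    return list(node.keys()) or [tokenizer.eos_token_id]

dialogue = "A: Have you ever been to France?\nB: Yes, I loved the museums there."
inputs = tokenizer(dialogue, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    prefix_allowed_tokens_fn=allowed_tokens,
    max_new_tokens=16,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Constrained this way, the decoder can only emit strings from the candidate entity set, which is presumably the role the trie plays in EntityT5.
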

# Uses

## Direct Use

This model can be used for the task of text2text generation.

## Downstream Use [Optional]

More information needed.

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

# Training Details

## Training Data

More information needed

## Training Procedure

### Preprocessing

More information needed

### Speeds, Sizes, Times

More information needed

# Evaluation

## Testing Data, Factors & Metrics

### Testing Data

More information needed

### Factors

More information needed

### Metrics

More information needed

## Results

More information needed

# Model Examination

More information needed

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Technical Specifications [optional]

## Model Architecture and Objective

More information needed

## Compute Infrastructure

More information needed

### Hardware

More information needed

### Software

More information needed.

# Citation

**BibTeX:**

More information needed.

# Glossary [optional]

More information needed

# More Information [optional]

More information needed

# Model Card Authors [optional]

Jaren Yang in collaboration with Ezi Ozoani and the Hugging Face team

# Model Card Contact

More information needed

# How to Get Started with the Model

Use the code below to get started with the model.

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the EntityT5 tokenizer and seq2seq model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("Jaren/EntityT5")
model = AutoModelForSeq2SeqLM.from_pretrained("Jaren/EntityT5")
```
</details>
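
The snippet above only loads the tokenizer and model. The card does not specify the expected input format, so the following usage sketch is hypothetical: it assumes the dialogue turns are concatenated into a single input string and that the predicted entities are returned as plain text. The dialogue example and separator are illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Jaren/EntityT5")
model = AutoModelForSeq2SeqLM.from_pretrained("Jaren/EntityT5")

# Hypothetical input format: dialogue turns joined by newlines.
# The separator/template actually used during training is not documented in this card.
dialogue = [
    "A: I'm flying to Paris next week.",
    "B: Nice! Are you visiting the Louvre while you're there?",
]
inputs = tokenizer("\n".join(dialogue), return_tensors="pt")

# Predict the entities of the last sentence and decode them to text.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```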