Rahulholla committed on
Commit 30eeaf2
1 Parent(s): f61dfaf

Update README.md

Files changed (1)
  1. README.md +6 -63

README.md CHANGED
@@ -73,8 +73,7 @@ Users should exercise caution and validate the model's predictions with addition

 Ensure that you have the `transformers` library installed. If not, you can install it via pip:

- ```bash
- pip install transformers
+ ```pip install transformers```

 You can load the model using the provided pipeline or directly with the AutoTokenizer and AutoModelForCausalLM classes from the transformers library.
 Once the model is loaded, you can use it for text generation tasks. If you prefer a high-level interface, you can use the pipeline approach as well.
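
For reference, the usage described in the lines above could look roughly like the following sketch. The repository id `Rahulholla/stock-analysis-llm`, the prompt, and the generation settings are hypothetical placeholders; the diff itself does not specify them.

```python
# Minimal sketch of the two loading paths mentioned above.
# NOTE: the repo id, prompt, and max_new_tokens are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "Rahulholla/stock-analysis-llm"  # hypothetical placeholder repo id

# Direct loading with AutoTokenizer / AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Analyze this stock option data: ..."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Or the high-level pipeline interface
generator = pipeline("text-generation", model=model_id)
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```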
@@ -92,7 +91,7 @@ The model was trained on a dataset containing examples of stock option data pair

 <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

- #### Preprocessing [optional]
+ #### Preprocessing

 The input data was preprocessed to tokenize and encode the text input before training.

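A minimal sketch of what the tokenize-and-encode step mentioned in this hunk could look like with a `transformers` tokenizer; the stand-in tokenizer, example text, and `max_length` are assumptions, not details from the model card.

```python
# Illustrative preprocessing sketch; tokenizer choice and max_length are assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in tokenizer for illustration

example = "Stock option data: ... -> Insight: ..."  # illustrative training example
encoded = tokenizer(example, truncation=True, max_length=512, return_tensors="pt")
print(encoded["input_ids"].shape)  # encoded token ids ready for training
```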
@@ -138,27 +137,7 @@ The model demonstrated the ability to provide relevant and actionable trading in

 #### Summary

-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
+ ## Technical Specifications

 ### Model Architecture and Objective

@@ -166,42 +145,6 @@ Carbon emissions can be estimated using the [Machine Learning Impact calculator]

 ### Compute Infrastructure

- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
+ 1 x A100 GPU - 80GB VRAM
+ 117 GB RAM
+ 12 vCPU
 