geralt Ezi committed on
Commit dfff309
1 Parent(s): 4aad2e7

Model Card (#2)

- Model Card (d51058934e197f07d3f30ad0aa74cf3d81379fae)

Co-authored-by: Ezi Ozoani <Ezi@users.noreply.huggingface.co>

Files changed (1):
  1. README.md +77 -7
README.md CHANGED
@@ -1,4 +1,4 @@
- \n---
  tags:
  - Causal Language modeling
  - text-generation
@@ -10,22 +10,92 @@ model_index:
  name: Causal Language modeling
  type: Causal Language modeling
  ---
- ## MechDistilGPT2
  This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.

- Base model is DistilGPT2(https://huggingface.co/gpt2) (the smallest version of GPT2)

- ## Fine-Tuning

  * Default Training Args
  * Epochs = 3
  * Training set = 200k sentences
  * Validation set = 40k sentences

- ## Framework versions

  * Transformers 4.7.0.dev0
  * Pytorch 1.8.1+cu111
  * Datasets 1.6.2
  * Tokenizers 0.10.2

- ## References
- https://github.com/huggingface/notebooks/blob/master/examples/language_modeling.ipynb
+ ---
  tags:
  - Causal Language modeling
  - text-generation

  name: Causal Language modeling
  type: Causal Language modeling
  ---
+ # MechDistilGPT2
+ ## Table of Contents
+ - [Model Details](#model-details)
+ - [Uses](#uses)
+ - [Risks, Limitations and Biases](#risks-limitations-and-biases)
+ - [Training](#training)
+ - [Environmental Impact](#environmental-impact)
+ - [How to Get Started With the Model](#how-to-get-started-with-the-model)
+
+ ## Model Details
+ - **Model Description:**
+ This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.
+
+ - **Developed by:** [Ashwin](https://huggingface.co/geralt)
+ - **Model Type:** Causal Language modeling
+ - **Language(s):** English
+ - **License:** [More Information Needed]
+ - **Parent Model:** See the [DistilGPT2 model](https://huggingface.co/distilgpt2) for more information about the Distilled-GPT2 base model.
+ - **Resources for more information:**
+   - [Research Paper](https://arxiv.org/abs/2105.09680)
+   - [GitHub Repo](https://github.com/huggingface/notebooks/blob/master/examples/language_modeling.ipynb)
+
+ ## Uses
+
+ #### Direct Use
+
+ The model can be used for tasks including topic classification, Causal Language modeling, and text generation.
+
+ #### Misuse and Out-of-scope Use
+
+ The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be a factual or true representation of people or events, so using it to generate such content is out of scope for this model.
+
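For the text-generation use case above, a minimal sketch with the high-level `pipeline` API from `transformers`; the prompt and generation settings are illustrative choices, not recommendations from the model card:

```python
# Illustrative sketch: generating text with the fine-tuned checkpoint via the
# transformers pipeline API. Prompt and max_new_tokens are arbitrary examples.
from transformers import pipeline, set_seed

set_seed(42)  # make sampling repeatable
generator = pipeline("text-generation", model="geralt/MechDistilGPT2")
outputs = generator("The camshaft controls", max_new_tokens=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```

The `text-generation` pipeline returns the prompt followed by the model's continuation, so domain-specific prompts like the one above are the easiest way to probe the mechanical/automotive fine-tuning.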
+ ## Risks, Limitations and Biases
+
+ **CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
+
+ Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
+
+ ## Training
+
+ #### Training Data
+
  This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.

+ #### Training Procedure
+
+ ###### Fine-Tuning
+
  * Default Training Args
  * Epochs = 3
  * Training set = 200k sentences
  * Validation set = 40k sentences

+ ###### Framework versions
+
  * Transformers 4.7.0.dev0
  * Pytorch 1.8.1+cu111
  * Datasets 1.6.2
  * Tokenizers 0.10.2

+
+ ## Environmental Impact
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More information needed]
+ - **Hours used:** [More information needed]
+ - **Cloud Provider:** [More information needed]
+ - **Compute Region:** [More information needed]
+ - **Carbon Emitted:** [More information needed]
+
+ ## How to Get Started With the Model
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ tokenizer = AutoTokenizer.from_pretrained("geralt/MechDistilGPT2")
+ model = AutoModelForCausalLM.from_pretrained("geralt/MechDistilGPT2")
+ ```
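Building on the loading snippet above, generation itself might look like this; the prompt and decoding settings are illustrative assumptions, not part of the model card:

```python
# Illustrative continuation of the loading snippet: tokenize a prompt,
# generate a continuation, and decode it back to text.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("geralt/MechDistilGPT2")
model = AutoModelForCausalLM.from_pretrained("geralt/MechDistilGPT2")

inputs = tokenizer("A hydraulic brake system works by", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2-family models have no pad token
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Setting `pad_token_id` explicitly silences the warning `generate` otherwise emits for GPT-2-family checkpoints, which ship without a padding token.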