DunnBC22 committed
Commit 639a3c3
1 Parent(s): 77a6e74

Update README.md

Files changed (1)
  1. README.md +12 -8
README.md CHANGED
@@ -5,28 +5,31 @@ tags:
  model-index:
  - name: bert-base-uncased-Masked_Language_Model-US_Economic_News_Articles
    results: []
+ language:
+ - en
+ metrics:
+ - perplexity
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
  # bert-base-uncased-Masked_Language_Model-US_Economic_News_Articles

- This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
+ This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased).
  It achieves the following results on the evaluation set:
  - Loss: 1.8322

  ## Model description

- More information needed
+ This is a masked language modeling project.
+
+ For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Masked%20Language%20Model/US%20Economic%20News%20Articles/US_Economic_News_Articles_MLM.ipynb

  ## Intended uses & limitations

- More information needed
+ This model is intended to demonstrate my ability to solve a complex problem using technology.

  ## Training and evaluation data

- More information needed
+ Dataset Source: https://www.kaggle.com/datasets/trikialaaa/2k-clean-medical-articles-medicalnewstoday

  ## Training procedure

@@ -49,10 +52,11 @@ The following hyperparameters were used during training:
  | 2.004 | 2.0 | 4032 | 1.9002 |
  | 1.941 | 3.0 | 6048 | 1.8600 |

+ Perplexity: 6.25

  ### Framework versions

  - Transformers 4.27.1
  - Pytorch 1.13.1+cu116
  - Datasets 2.10.1
- - Tokenizers 0.13.2
+ - Tokenizers 0.13.2
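
The updated card describes a fill-mask model. A minimal usage sketch with the transformers pipeline, assuming the checkpoint is published under the repo id `DunnBC22/bert-base-uncased-Masked_Language_Model-US_Economic_News_Articles` (inferred from the model-index name and the committer's username; substitute the actual repo id or a local path if it differs):

```python
# Minimal fill-mask sketch; the repo id below is an assumption inferred from
# the model-index name, so swap in the real id or a local checkpoint path.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="DunnBC22/bert-base-uncased-Masked_Language_Model-US_Economic_News_Articles",
)

# bert-base-uncased checkpoints use [MASK] as the mask token.
for pred in fill_mask("The Federal Reserve raised interest [MASK] last quarter."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```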
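
The added "Perplexity: 6.25" line is consistent with the reported evaluation loss: for a masked language model, perplexity is conventionally the exponential of the mean cross-entropy loss, and exp(1.8322) ≈ 6.25. A quick check:

```python
import math

eval_loss = 1.8322            # evaluation loss reported in the card
perplexity = math.exp(eval_loss)
print(round(perplexity, 2))   # 6.25, matching the added metric
```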
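
The card points to a notebook for the training details. As rough orientation only, the standard transformers masked-LM fine-tuning recipe looks like the sketch below; the dataset file, text column, and hyperparameters are placeholders rather than the values used for this model (only the 3-epoch schedule is taken from the results table):

```python
# Generic BERT masked-LM fine-tuning sketch. Everything marked "placeholder"
# is an assumption for illustration, not taken from this commit.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Placeholder: any corpus with a "text" column of news articles.
dataset = load_dataset("csv", data_files={"train": "articles.csv"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# Randomly masks 15% of tokens on the fly for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-base-uncased-mlm-us-economic-news",  # placeholder
    num_train_epochs=3,                # matches the epochs shown in the results table
    per_device_train_batch_size=8,     # placeholder
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```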