Zakia committed on
Commit 59e65f2
1 Parent(s): b12cad8

Removed double asterisks in front of lists

Files changed (1): README.md (+10 -10)
README.md CHANGED
@@ -27,11 +27,11 @@ The base model for fine-tuning was the [distilbert-base-uncased](https://hugging
 
 ### Model Description
 
-- **Developed by:*Zakia*
-- **Model type:*Text Classification*
-- **Language(s) (NLP):*English*
-- **License:*Apache 2.0*
-- **Finetuned from model:*distilbert-base-uncased*
+- Developed by:*Zakia*
+- Model type:*Text Classification*
+- Language(s) (NLP):*English*
+- License:*Apache 2.0*
+- Finetuned from model:*distilbert-base-uncased*
 
 ## Uses
 
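The list reformatted in this hunk summarizes the model: a DistilBERT fine-tune for English text classification under Apache 2.0. As a minimal sketch of how such a model is called, assuming the standard transformers pipeline API (this commit page does not name the fine-tuned repo, so the base checkpoint stands in):

```python
# Minimal inference sketch for a text-classification model like the one
# described above. "distilbert-base-uncased" is a stand-in: the fine-tuned
# repo id is not given on this page, and the base checkpoint's
# classification head is untrained, so this shows the call shape only.
from transformers import pipeline

model_id = "distilbert-base-uncased"  # substitute the actual fine-tuned repo id
classifier = pipeline("text-classification", model=model_id)
print(classifier("Great product, works exactly as described."))
# -> [{'label': 'LABEL_0', 'score': ...}]
```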
@@ -104,9 +104,9 @@ Train dataset high_quality_review counts: Counter({0: 2120, 1: 2120})
 
 #### Training Hyperparameters
 
-- **Learning Rate: *3e-5*
-- **Batch Size:*16*
-- **Epochs:*1*
+- Learning Rate: *3e-5*
+- Batch Size:*16*
+- Epochs:*1*
 
 ## Evaluation
 
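The three hyperparameters in this hunk map directly onto transformers `TrainingArguments`. A hedged sketch under the assumption that the Trainer API was used (the diff does not show the actual training script):

```python
# Sketch of the listed hyperparameters expressed via the Trainer API.
# Only the base checkpoint and the three values come from the model card;
# num_labels=2 follows from the binary high_quality_review label, and
# output_dir is a hypothetical name.
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
args = TrainingArguments(
    output_dir="review-quality-classifier",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    num_train_epochs=1,
)
# A Trainer(model=model, args=args, train_dataset=...) call would follow,
# using the balanced dataset noted above: Counter({0: 2120, 1: 2120}).
```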
@@ -176,8 +176,8 @@ Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled ve
 
 ## Glossary
 
-- **Low Quality Review: *high_quality_review=0*
-- **High Quality Review:*high_quality_review=1*
+- Low Quality Review: *high_quality_review=0*
+- High Quality Review:*high_quality_review=1*
 
 ## More Information
 
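The glossary pins down the binary label encoding. One way to carry that mapping into the model itself is through the config's `id2label`/`label2id` fields, so pipeline outputs read as the glossary terms; the display strings below are illustrative choices, not from the card:

```python
# Make the glossary's label encoding explicit on the model config so that
# predictions surface as readable names rather than LABEL_0 / LABEL_1.
# The 0/1 meanings come from the glossary; the strings are ours.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("distilbert-base-uncased")
config.id2label = {0: "low_quality_review", 1: "high_quality_review"}
config.label2id = {"low_quality_review": 0, "high_quality_review": 1}
```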
 