abdallahashrafx committed
Commit 4b1eabe
Parent(s): 23ec75d
Update README.md

README.md CHANGED
@@ -22,7 +22,7 @@ The model is trained to classify sentences containing idiomatic expressions as e
 ## Model Details
 
 - **Developed by:** Abdallah Ashraf
-- **Language(s) (NLP):**
+- **Language(s) (NLP):** english
 - **Finetuned from model:** bert-base-uncased
 
 ## Uses
@@ -32,7 +32,7 @@ The model is trained to classify sentences containing idiomatic expressions as e
 The model can be used directly for classifying idiomatic expressions in text data.
 
 
-### Downstream Use
+### Downstream Use
 
 The model can also be fine-tuned for specific downstream tasks, such as sentiment analysis or natural language understanding, by incorporating it into larger NLP pipelines.
 
@@ -113,13 +113,13 @@ The model was trained using the AdamW optimizer with a learning rate of 2e-6 and
 
 #### Training Hyperparameters
 
-Training Hyperparameters
-Training regime: Full fine-tuning
-Optimizer: AdamW
-Learning rate: 2e-6
-momentum: 90-95
-Weight decay: 0.01
-loss function : Binary cross entropy loss
+* Training Hyperparameters
+* Training regime: Full fine-tuning
+* Optimizer: AdamW
+* Learning rate: 2e-6
+* momentum: 90-95
+* Weight decay: 0.01
+* loss function : Binary cross entropy loss
 
 
 ## Evaluation
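The hyperparameter hunk above lists "Binary cross entropy loss" as the training objective. As a quick illustration of what that loss measures for a single prediction (a plain-Python sketch, not the model's actual training code), confident correct predictions incur a small loss and confident wrong ones a large loss:

```python
import math

def binary_cross_entropy(p, y):
    """BCE for one predicted probability p and binary label y (0 or 1)."""
    eps = 1e-12  # clamp p away from 0 and 1 to avoid log(0)
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

print(binary_cross_entropy(0.9, 1))  # ≈ 0.105 (confident and correct)
print(binary_cross_entropy(0.1, 1))  # ≈ 2.303 (confident and wrong)
```

In practice the per-sentence losses are averaged over a batch; frameworks typically compute this from raw logits for numerical stability.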
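The same hunk records AdamW with a learning rate of 2e-6, weight decay 0.01, and "momentum: 90-95", which presumably denotes beta coefficients of 0.9 and 0.95 (an assumption; the card does not say explicitly). A minimal sketch of one decoupled-weight-decay AdamW update on a single scalar parameter, under those assumed betas:

```python
def adamw_step(theta, grad, m, v, t, lr=2e-6, beta1=0.9, beta2=0.95,
               eps=1e-8, weight_decay=0.01):
    """One AdamW step for a scalar parameter theta at timestep t (1-based).
    beta1/beta2 of 0.9/0.95 are assumed from the card's 'momentum: 90-95'."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment EMA
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Decoupled weight decay: applied to theta directly, not via the gradient
    theta -= lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * theta)
    return theta, m, v

theta, m, v = adamw_step(theta=1.0, grad=1.0, m=0.0, v=0.0, t=1)
print(theta)  # slightly below 1.0: adaptive step plus weight-decay shrinkage
```

In real fine-tuning this update runs per tensor inside the framework's optimizer (e.g. `torch.optim.AdamW`); the sketch only makes the decoupled-decay arithmetic visible.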