Giyaseddin committed
Commit 9a0f2b9
1 Parent(s): 5272121

Update README.md

Files changed (1):
  1. README.md +1 -2
README.md CHANGED
@@ -20,7 +20,7 @@ On average DistilRoBERTa is twice as fast as Roberta-base.
 
 We encourage to check [RoBERTa-base model](https://huggingface.co/roberta-base) to know more about usage, limitations and potential biases.
 
-\
+
 
 This is a classification model that solves Short Question Answer Assessment task, finetuned [pretrained DistilRoBERTa model](https://huggingface.co/distilroberta-base) on
 [Question Answer Assessment dataset](#)
@@ -129,7 +129,6 @@ Here is the scores during the training:
 | Epoch | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
 |:----------:|:-------------:|:-----------------:|:----------:|:---------:|:----------:|:--------:|
 | 1 | No log | 0.773334 | 0.713706 | 0.711398 | 0.746059 | 0.713706 |
-
 | 2 | 1.069200 | 0.404932 | 0.885279 | 0.884592 | 0.886699 | 0.885279 |
 | 3 | 0.473700 | 0.247099 | 0.931980 | 0.931675 | 0.933794 | 0.931980 |
 | 3 | 0.228000 | 0.205577 | 0.954315 | 0.954210 | 0.955258 | 0.954315 |
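The README text in this diff describes a sequence-classification model fine-tuned from distilroberta-base for short question-answer assessment. For context only, here is a minimal sketch of how such a classifier is set up with the Hugging Face transformers API; the four-label scheme and the question/answer input formatting are illustrative assumptions, not details taken from this commit, and the classification head below is freshly initialized rather than the fine-tuned one.

```python
# Minimal sketch, assuming a standard transformers sequence-classification
# head on top of the distilroberta-base checkpoint linked in the README.
# The label count and the input formatting are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilroberta-base",
    num_labels=4,  # assumption: adjust to the actual assessment label set
)
model.eval()

# A question/answer pair packed into a single sequence, as a text classifier sees it.
text = "Question: What does DNS stand for? Answer: Domain Name System."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted label index (untrained head, so not meaningful here)
```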