We find your model to be a great base model
We find your model to be the 1st best base model over the roberta-base architecture.
(This means that using your model as a starting point for fine-tuning works well.)
We suggest adding the following evaluation to your README.md page. For any questions, please contact eladv@il.ibm.com.
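For reference, reusing the checkpoint as a fine-tuning starting point is a one-line change in a standard transformers setup: load ibm/ColD-Fusion-itr13-seed2 instead of plain roberta-base. A minimal sketch, assuming a binary classification task (the num_labels=2 head is a placeholder, not part of the evaluation):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed example: start a downstream classifier from the recycled checkpoint
# instead of plain roberta-base; the rest of the fine-tuning recipe is unchanged.
model_name = "ibm/ColD-Fusion-itr13-seed2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)  # num_labels is task-specific

# Train as usual from here (e.g. with transformers.Trainer) on your labeled data.
```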
README.md
CHANGED
@@ -51,8 +51,14 @@ output = model(encoded_input)
 ```
 
 ## Evaluation results
-
-
+Evaluation on 36 datasets using ibm/ColD-Fusion-itr13-seed2 as a base model yields an average score of 78.72.
+According to the [website](https://ibm.github.io/model-recycling/), this is the 1st best model for roberta-base models (updated 11/12/2022).
+
+Results:
+
+| 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers |
+|---------------:|----------:|-----------------------:|--------:|--------:|--------:|-------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|-------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|--------:|--------:|----------------:|
+| 86.3648 | 89.3 | 66.72 | 53.0937 | 82.0183 | 89.2857 | 83.605 | 73 | 77.4667 | 91.0423 | 87.3 | 93.868 | 73.1421 | 87.3881 | 87.7451 | 63.6757 | 88.4615 | 92.678 | 91.0809 | 91.4634 | 83.3935 | 95.2982 | 58.1448 | 91.6334 | 97 | 91 | 44.95 | 83.0401 | 52.5589 | 77.0408 | 86.0465 | 69.7818 | 70.0627 | 49.2958 | 63.4615 | 72.5667 |
 
 
 
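As a quick sanity check, the reported 78.72 is simply the arithmetic mean of the 36 per-dataset scores in the table above; a minimal sketch:

```python
# Scores copied from the results table above (one per dataset, 36 in total).
scores = [
    86.3648, 89.3, 66.72, 53.0937, 82.0183, 89.2857, 83.605, 73, 77.4667,
    91.0423, 87.3, 93.868, 73.1421, 87.3881, 87.7451, 63.6757, 88.4615,
    92.678, 91.0809, 91.4634, 83.3935, 95.2982, 58.1448, 91.6334, 97, 91,
    44.95, 83.0401, 52.5589, 77.0408, 86.0465, 69.7818, 70.0627, 49.2958,
    63.4615, 72.5667,
]
print(len(scores), round(sum(scores) / len(scores), 2))  # -> 36 78.72
```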