We find your model to be a great base model

#1
by eladven - opened
Files changed (1)
  1. README.md +8 -2
README.md CHANGED
@@ -51,8 +51,14 @@ output = model(encoded_input)
  ```

  ## Evaluation results
- See full evaluation results of this model and many more [here](https://ibm.github.io/model-recycling/roberta-base_table.html)
- When fine-tuned on downstream tasks, this model achieves the following results:
+ Evaluation on 36 datasets using ibm/ColD-Fusion-itr14-seed0 as a base model yields an average score of 78.64.
+ According to the [model-recycling website](https://ibm.github.io/model-recycling/), this is the 2nd best model among roberta-base models (updated 11/12/2022).
+
+ Results:
+
+ | 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers |
+ |---------------:|----------:|-----------------------:|--------:|--------:|--------:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|--------:|--------:|----------------:|
+ | 85.7807 | 89.7 | 66.3 | 51.9688 | 81.4373 | 83.9286 | 83.2215 | 70 | 77.6333 | 90.7166 | 85.2 | 93.62 | 72.6858 | 86.8999 | 88.7255 | 63.8408 | 90.3846 | 92.3668 | 91.3579 | 91.0882 | 84.8375 | 95.8716 | 57.5113 | 91.4939 | 97.8 | 91 | 46.896 | 82.7586 | 54.8485 | 77.8061 | 85.4651 | 69.9935 | 69.7492 | 52.1127 | 63.4615 | 72.7 |


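For reference, the 78.64 figure in the proposed text is the plain arithmetic mean of the 36 per-dataset scores in the table; a quick check, with the scores copied from the row above:

```python
# Sanity check: the reported average (78.64) is the mean of the 36 scores above.
scores = [
    85.7807, 89.7, 66.3, 51.9688, 81.4373, 83.9286, 83.2215, 70, 77.6333,
    90.7166, 85.2, 93.62, 72.6858, 86.8999, 88.7255, 63.8408, 90.3846,
    92.3668, 91.3579, 91.0882, 84.8375, 95.8716, 57.5113, 91.4939, 97.8,
    91, 46.896, 82.7586, 54.8485, 77.8061, 85.4651, 69.9935, 69.7492,
    52.1127, 63.4615, 72.7,
]
assert len(scores) == 36
print(round(sum(scores) / len(scores), 2))  # 78.64
```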
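For anyone who wants to try the "fine-tune on a downstream task" setup the proposed text describes, below is a minimal sketch that fine-tunes ibm/ColD-Fusion-itr14-seed0 with the standard transformers/datasets training loop. The dataset choice (rotten_tomatoes, one of the 36 tasks listed above) and all hyperparameters are illustrative assumptions, not the exact model-recycling evaluation configuration.

```python
# Minimal sketch: fine-tune ibm/ColD-Fusion-itr14-seed0 on one downstream
# classification dataset. Dataset and hyperparameters are assumptions for
# illustration only, not the model-recycling evaluation settings.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "ibm/ColD-Fusion-itr14-seed0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# rotten_tomatoes: binary sentiment classification (text -> label).
dataset = load_dataset("rotten_tomatoes")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True), batched=True
)

def compute_metrics(eval_pred):
    # Report plain accuracy on the validation split.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}

args = TrainingArguments(
    output_dir="cold-fusion-rotten-tomatoes",  # assumed output path
    learning_rate=2e-5,                        # assumed hyperparameters
    per_device_train_batch_size=16,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())  # validation accuracy, in the spirit of the table above
```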