---
license: afl-3.0
---

## Model Recycling

[Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=1.62&mnli_lp=nan&20_newsgroup=-3.07&ag_news=-0.36&amazon_reviews_multi=-0.59&anli=1.28&boolq=3.32&cb=11.52&cola=-0.74&copa=2.85&dbpedia=0.83&esnli=-0.12&financial_phrasebank=13.84&imdb=-0.18&isear=1.24&mnli=-0.15&mrpc=0.16&multirc=0.51&poem_sentiment=9.23&qnli=1.11&qqp=-0.07&rotten_tomatoes=-0.40&rte=4.87&sst2=-1.58&sst_5bins=-1.05&stsb=0.73&trec_coarse=-0.43&trec_fine=6.22&tweet_ev_emoji=-0.18&tweet_ev_emotion=0.40&tweet_ev_hate=-1.57&tweet_ev_irony=6.23&tweet_ev_offensive=0.40&tweet_ev_sentiment=-0.08&wic=-0.83&wnli=4.01&wsc=1.54&yahoo_answers=-0.63&model_name=Moussab%2Fdeepset_bert-base-cased-squad2-orkg-unchanged-5e-05&base_name=bert-base-cased) using Moussab/deepset_bert-base-cased-squad2-orkg-unchanged-5e-05 as a base model yields an average score of 74.04, compared to 72.43 for bert-base-cased.

As of 21/12/2022, the model is ranked 3rd among all tested models for the bert-base-cased architecture.
Results:

|   20_newsgroup |   ag_news |   amazon_reviews_multi |    anli |   boolq |   cb |    cola |   copa |   dbpedia |   esnli |   financial_phrasebank |   imdb |   isear |    mnli |    mrpc |   multirc |   poem_sentiment |    qnli |     qqp |   rotten_tomatoes |    rte |    sst2 |   sst_5bins |   stsb |   trec_coarse |   trec_fine |   tweet_ev_emoji |   tweet_ev_emotion |   tweet_ev_hate |   tweet_ev_irony |   tweet_ev_offensive |   tweet_ev_sentiment |     wic |   wnli |     wsc |   yahoo_answers |
|---------------:|----------:|-----------------------:|--------:|--------:|-----:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|-------:|--------:|------------:|-------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|-------:|--------:|----------------:|
|        78.6644 |      88.7 |                  65.12 | 47.8438 | 71.5902 |   75 | 81.1122 |     55 |      79.6 | 89.5155 |                   82.2 | 90.968 | 69.6219 | 83.2384 | 83.0882 |   60.9736 |          76.9231 | 91.1038 | 89.8788 |           84.1463 | 67.509 | 89.9083 |      50.362 | 85.254 |          96.2 |        79.2 |           44.062 |              79.24 |         51.2121 |          71.4286 |              84.6512 |              68.1456 | 63.9498 | 56.338 | 63.4615 |            70.4 |


For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)
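
As a brief illustration of the base-model reuse described above, the sketch below loads this checkpoint for fine-tuning on a downstream classification task with the Hugging Face `transformers` library. The task, label count, and example text are illustrative placeholders and are not part of the evaluation setup; only the model name comes from this card.

```python
# Minimal sketch: recycle this checkpoint as a base model for a new task.
# Assumes the checkpoint is available on the Hugging Face Hub under this name.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "Moussab/deepset_bert-base-cased-squad2-orkg-unchanged-5e-05"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# The classification head is newly initialized; only the encoder weights are reused,
# so the model still needs to be fine-tuned on the target dataset.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

inputs = tokenizer("Model recycling reuses fine-tuned checkpoints.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])
```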