# Dylan1999/bert-finetuned-squad-accelerate model
This model is based on the bert-base-cased pretrained model.


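A minimal usage sketch is shown below. It assumes, based on the model name, that this checkpoint carries a SQuAD-style extractive question-answering head; the question and context strings are illustrative only.

```python
from transformers import pipeline

# Load the checkpoint as an extractive QA pipeline (assumed from the model name).
qa = pipeline(
    "question-answering",
    model="Dylan1999/bert-finetuned-squad-accelerate",
)

# Example inputs; any question/context pair works.
result = qa(
    question="What architecture is the model based on?",
    context="The model is a fine-tuned version of bert-base-cased.",
)
print(result["answer"], result["score"])
```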
## Model Recycling

[Evaluation on 36 datasets](https://ibm.github.io/model-recycling/model_gain_chart?avg=1.64&mnli_lp=nan&20_newsgroup=-0.03&ag_news=0.07&amazon_reviews_multi=0.33&anli=0.37&boolq=2.77&cb=11.52&cola=-1.79&copa=2.85&dbpedia=0.80&esnli=-0.01&financial_phrasebank=11.64&imdb=-0.10&isear=1.43&mnli=-0.12&mrpc=3.35&multirc=-1.18&poem_sentiment=5.38&qnli=1.02&qqp=-1.04&rotten_tomatoes=0.26&rte=5.23&sst2=0.48&sst_5bins=-1.36&stsb=1.51&trec_coarse=-0.23&trec_fine=9.62&tweet_ev_emoji=-0.03&tweet_ev_emotion=0.54&tweet_ev_hate=1.57&tweet_ev_irony=3.04&tweet_ev_offensive=-0.06&tweet_ev_sentiment=-1.45&wic=-1.30&wnli=2.61&wsc=1.54&yahoo_answers=-0.09&model_name=Dylan1999%2Fbert-finetuned-squad-accelerate&base_name=bert-base-cased) using Dylan1999/bert-finetuned-squad-accelerate as a base model yields an average score of 74.07, compared to 72.43 for bert-base-cased.

The model is ranked 2nd among all tested models for the bert-base-cased architecture as of 21/12/2022.
Results:

|   20_newsgroup |   ag_news |   amazon_reviews_multi |    anli |   boolq |   cb |    cola |   copa |   dbpedia |   esnli |   financial_phrasebank |   imdb |   isear |    mnli |    mrpc |   multirc |   poem_sentiment |    qnli |     qqp |   rotten_tomatoes |   rte |    sst2 |   sst_5bins |    stsb |   trec_coarse |   trec_fine |   tweet_ev_emoji |   tweet_ev_emotion |   tweet_ev_hate |   tweet_ev_irony |   tweet_ev_offensive |   tweet_ev_sentiment |     wic |    wnli |     wsc |   yahoo_answers |
|---------------:|----------:|-----------------------:|--------:|--------:|-----:|--------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|--------:|--------:|------------------:|------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|--------:|--------:|----------------:|
|        81.7047 |   89.1333 |                  66.04 | 46.9375 | 71.0398 |   75 | 80.0575 |     55 |   79.5667 | 89.6274 |                     80 | 91.044 | 69.8175 | 83.2689 | 86.2745 |   59.2822 |          73.0769 | 91.0123 | 88.9117 |            84.803 | 67.87 | 91.9725 |     50.0452 | 86.0266 |          96.4 |        82.6 |           44.214 |            79.3807 |         54.3434 |          68.2398 |               84.186 |              66.7779 | 63.4796 | 54.9296 | 63.4615 |         70.9333 |
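The sketch below illustrates what "using this checkpoint as a base model" means in practice: loading it for a downstream classification task, where the task-specific head is newly initialized. It is only an assumption-labeled illustration; the exact fine-tuning recipe behind the reported scores is described on the Model Recycling site, not reproduced here, and the `num_labels` value is task-dependent.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

base = "Dylan1999/bert-finetuned-squad-accelerate"

tokenizer = AutoTokenizer.from_pretrained(base)
# The classification head is freshly initialized on top of the recycled encoder;
# num_labels=3 is just an example (e.g. an NLI-style task).
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=3)
```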


For more information, see: [Model Recycling](https://ibm.github.io/model-recycling/)