expert_name (string, 3–67 chars) | task_eval_on (string, 1 class) | score (float64, 0.86–5.57) |
---|---|---|
default | ARB | 5.155896 |
sciq_Multiple_Choice | ARB | 4.496497 |
wiki_hop_original_choose_best_object_interrogative_1 | ARB | 3.591354 |
niv2_named_entity_recognition | ARB | 2.111275 |
ultrachat_17 | ARB | 3.420017 |
niv2_coherence_classification | ARB | 2.72363 |
squad_v2_0_3_0_0 | ARB | 2.520757 |
international_law | ARB | 3.347426 |
niv2_question_answering | ARB | 1.894204 |
wiki_qa_exercise | ARB | 3.662061 |
race_high_Taking_a_test | ARB | 1.80849 |
adversarial_qa_dbert_generate_question | ARB | 3.776077 |
quoref_Found_Context_Online | ARB | 2.398534 |
ultrachat_22 | ARB | 3.247254 |
high_school_chemistry | ARB | 3.455788 |
ultrachat_4 | ARB | 3.528342 |
web_questions_get_the_answer | ARB | 3.006208 |
duorc_SelfRC_generate_question_by_answer | ARB | 3.222494 |
quarel_testing_students | ARB | 2.768955 |
niv2_sentence_composition | ARB | 3.184933 |
qasc_qa_with_separated_facts_1 | ARB | 2.979972 |
cot_qasc_ii | ARB | 3.583755 |
wiki_qa_Is_This_True_ | ARB | 4.487175 |
race_high_Read_the_article_and_answer_the_question_no_option_ | ARB | 2.528832 |
niv2_sentence_perturbation | ARB | 3.204303 |
cot_gsm8k_ii | ARB | 3.141675 |
gem_wiki_lingua_english_en_1_1_0 | ARB | 3.136553 |
unified_qa_science_inst | ARB | 2.093446 |
quartz_use_info_from_paragraph_question | ARB | 2.406823 |
wiki_hop_original_generate_object | ARB | 3.748816 |
quoref_What_Is_The_Answer | ARB | 2.647273 |
adversarial_qa_droberta_generate_question | ARB | 3.45915 |
niv2_spam_classification | ARB | 2.853152 |
wiki_bio_comprehension | ARB | 3.994671 |
adversarial_qa_dbidaf_question_context_answer | ARB | 2.254336 |
wiki_bio_what_content | ARB | 3.869342 |
web_questions_whats_the_answer | ARB | 3.014567 |
wiqa_what_is_the_missing_first_step | ARB | 3.710971 |
adversarial_qa_droberta_question_context_answer | ARB | 2.502929 |
ropes_plain_bottom_hint | ARB | 3.224736 |
niv2_stance_detection | ARB | 2.698493 |
business_ethics | ARB | 3.511723 |
kilt_tasks_hotpotqa_combining_facts | ARB | 2.479209 |
cos_e_v1_11_aligned_with_common_sense | ARB | 4.44496 |
gem_web_nlg_en_1_1_0 | ARB | 3.458884 |
web_questions_potential_correct_answer | ARB | 2.905228 |
high_school_us_history | ARB | 3.484543 |
wiki_qa_found_on_google | ARB | 3.662312 |
niv2_gender_classification | ARB | 3.131335 |
niv2_paper_review | ARB | 3.68208 |
niv2_negotiation_strategy_detection | ARB | 2.684463 |
high_school_computer_science | ARB | 3.378751 |
duorc_ParaphraseRC_extract_answer | ARB | 2.893849 |
wmt16_translate_de_en_1_0_0 | ARB | 2.846244 |
quail_no_prompt_id | ARB | 1.618509 |
quoref_Guess_Title_For_Context | ARB | 2.673502 |
duorc_SelfRC_decide_worth_it | ARB | 2.975661 |
professional_psychology | ARB | 3.312037 |
college_physics | ARB | 3.307168 |
ropes_prompt_mix | ARB | 2.52129 |
niv2_word_analogy | ARB | 3.743575 |
adversarial_qa_droberta_tell_what_it_is | ARB | 2.833299 |
niv2_discourse_connective_identification | ARB | 3.119049 |
quail_context_question_answer_description_id | ARB | 1.358964 |
ultrachat_27 | ARB | 3.242133 |
gem_common_gen_1_1_0 | ARB | 3.61642 |
duorc_ParaphraseRC_answer_question | ARB | 2.805736 |
leetcode_ne | ARB | 4.596044 |
super_glue_cb_1_0_2 | ARB | 2.377653 |
niv2_question_understanding | ARB | 3.732026 |
cnn_dailymail_3_4_0 | ARB | 3.129313 |
race_high_Write_a_multi_choice_question_options_given_ | ARB | 4.069612 |
winogrande_1_1_0 | ARB | 2.637436 |
niv2_text_categorization | ARB | 1.829645 |
duorc_SelfRC_extract_answer | ARB | 2.974028 |
trec_1_0_0 | ARB | 3.658335 |
ultrachat_9 | ARB | 3.388127 |
human_aging | ARB | 3.689181 |
yelp_polarity_reviews_0_2_0 | ARB | 2.504508 |
race_high_Select_the_best_answer | ARB | 2.199191 |
ultrachat_15 | ARB | 3.300898 |
high_school_geography | ARB | 3.315491 |
para_crawl_enes | ARB | 2.915951 |
qasc_is_correct_1 | ARB | 3.328053 |
app_reviews_generate_review | ARB | 2.484683 |
ropes_read_background_situation | ARB | 2.763901 |
dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | ARB | 3.262912 |
stream_aqua | ARB | 3.691433 |
drop_2_0_0 | ARB | 2.408025 |
wiki_hop_original_choose_best_object_affirmative_1 | ARB | 3.413433 |
us_foreign_policy | ARB | 3.681998 |
niv2_discourse_relation_classification | ARB | 2.34878 |
niv2_irony_detection | ARB | 3.412535 |
adversarial_qa_dbidaf_answer_the_following_q | ARB | 2.65873 |
niv2_paraphrasing | ARB | 3.213209 |
niv2_code_to_text | ARB | 4.588747 |
social_i_qa_Generate_answer | ARB | 2.9753 |
stream_aqua_ii | ARB | 3.209856 |
glue_sst2_2_0_0 | ARB | 2.548827 |
niv2_sentence_expansion | ARB | 3.124803 |
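As a small illustration (not part of the dataset itself), rows like the ones above can be parsed and ranked by score with the Python standard library. The excerpt below copies a few rows verbatim from the table; the variable names are my own:

```python
import csv
import io

# A small excerpt of the table above, pipe-separated (header + four rows).
rows = """expert_name|task_eval_on|score
default|ARB|5.155896
leetcode_ne|ARB|4.596044
niv2_question_answering|ARB|1.894204
quail_context_question_answer_description_id|ARB|1.358964
"""

# Parse each row into (expert_name, score) and sort best-first.
reader = csv.DictReader(io.StringIO(rows), delimiter="|")
data = [(r["expert_name"], float(r["score"])) for r in reader]
data.sort(key=lambda t: t[1], reverse=True)

print(data[0])  # → ('default', 5.155896)
```

The same pattern scales to the full table; only the `rows` string changes.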