expert_name | task_eval_on | score |
---|---|---|
default | ARB | 5.155896 |
sciq_Multiple_Choice | ARB | 3.294215 |
wiki_hop_original_choose_best_object_interrogative_1 | ARB | 2.826293 |
niv2_named_entity_recognition | ARB | 2.286225 |
ultrachat_17 | ARB | 3.21875 |
niv2_coherence_classification | ARB | 2.365894 |
squad_v2_0_3_0_0 | ARB | 2.149638 |
international_law | ARB | 2.817848 |
niv2_question_answering | ARB | 1.612155 |
wiki_qa_exercise | ARB | 2.83365 |
race_high_Taking_a_test | ARB | 1.801716 |
adversarial_qa_dbert_generate_question | ARB | 3.338458 |
quoref_Found_Context_Online | ARB | 2.411392 |
ultrachat_22 | ARB | 3.246366 |
high_school_chemistry | ARB | 2.645894 |
ultrachat_4 | ARB | 3.710367 |
web_questions_get_the_answer | ARB | 2.301354 |
duorc_SelfRC_generate_question_by_answer | ARB | 3.163368 |
quarel_testing_students | ARB | 2.695475 |
niv2_sentence_composition | ARB | 2.747302 |
qasc_qa_with_separated_facts_1 | ARB | 2.477008 |
cot_qasc_ii | ARB | 4.58563 |
wiki_qa_Is_This_True_ | ARB | 3.079909 |
race_high_Read_the_article_and_answer_the_question_no_option_ | ARB | 2.696505 |
niv2_sentence_perturbation | ARB | 2.843546 |
cot_gsm8k_ii | ARB | 3.147391 |
gem_wiki_lingua_english_en_1_1_0 | ARB | 2.977728 |
unified_qa_science_inst | ARB | 2.665709 |
quartz_use_info_from_paragraph_question | ARB | 2.374183 |
wiki_hop_original_generate_object | ARB | 3.056774 |
quoref_What_Is_The_Answer | ARB | 2.431265 |
adversarial_qa_droberta_generate_question | ARB | 3.297987 |
niv2_spam_classification | ARB | 3.038603 |
wiki_bio_comprehension | ARB | 3.596785 |
adversarial_qa_dbidaf_question_context_answer | ARB | 2.689375 |
wiki_bio_what_content | ARB | 3.574749 |
web_questions_whats_the_answer | ARB | 2.197099 |
wiqa_what_is_the_missing_first_step | ARB | 3.14554 |
adversarial_qa_droberta_question_context_answer | ARB | 2.635071 |
ropes_plain_bottom_hint | ARB | 2.395279 |
niv2_stance_detection | ARB | 2.828982 |
business_ethics | ARB | 2.328914 |
kilt_tasks_hotpotqa_combining_facts | ARB | 2.488595 |
cos_e_v1_11_aligned_with_common_sense | ARB | 3.479366 |
gem_web_nlg_en_1_1_0 | ARB | 2.917503 |
web_questions_potential_correct_answer | ARB | 2.198083 |
high_school_us_history | ARB | 2.739327 |
wiki_qa_found_on_google | ARB | 2.77429 |
niv2_gender_classification | ARB | 2.388569 |
niv2_paper_review | ARB | 4.913506 |
niv2_negotiation_strategy_detection | ARB | 2.797743 |
high_school_computer_science | ARB | 2.811288 |
duorc_ParaphraseRC_extract_answer | ARB | 2.439124 |
wmt16_translate_de_en_1_0_0 | ARB | 2.780263 |
quail_no_prompt_id | ARB | 1.597093 |
quoref_Guess_Title_For_Context | ARB | 2.293789 |
duorc_SelfRC_decide_worth_it | ARB | 2.422929 |
professional_psychology | ARB | 2.522297 |
college_physics | ARB | 2.835381 |
ropes_prompt_mix | ARB | 2.594529 |
niv2_word_analogy | ARB | 2.817864 |
adversarial_qa_droberta_tell_what_it_is | ARB | 2.481599 |
niv2_discourse_connective_identification | ARB | 3.173257 |
quail_context_question_answer_description_id | ARB | 1.330607 |
ultrachat_27 | ARB | 3.487341 |
gem_common_gen_1_1_0 | ARB | 2.738932 |
duorc_ParaphraseRC_answer_question | ARB | 2.459137 |
leetcode_ne | ARB | 4.412585 |
super_glue_cb_1_0_2 | ARB | 4.101349 |
niv2_question_understanding | ARB | 2.697581 |
cnn_dailymail_3_4_0 | ARB | 2.919555 |
race_high_Write_a_multi_choice_question_options_given_ | ARB | 3.754349 |
winogrande_1_1_0 | ARB | 2.779045 |
niv2_text_categorization | ARB | 1.991364 |
duorc_SelfRC_extract_answer | ARB | 2.475027 |
trec_1_0_0 | ARB | 2.107651 |
ultrachat_9 | ARB | 3.343061 |
human_aging | ARB | 2.667306 |
yelp_polarity_reviews_0_2_0 | ARB | 2.31463 |
race_high_Select_the_best_answer | ARB | 1.591614 |
ultrachat_15 | ARB | 3.20894 |
high_school_geography | ARB | 2.787981 |
para_crawl_enes | ARB | 2.495143 |
qasc_is_correct_1 | ARB | 2.715902 |
app_reviews_generate_review | ARB | 2.467533 |
ropes_read_background_situation | ARB | 2.775204 |
dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | ARB | 3.152711 |
stream_aqua | ARB | 3.704189 |
drop_2_0_0 | ARB | 2.388396 |
wiki_hop_original_choose_best_object_affirmative_1 | ARB | 2.969821 |
us_foreign_policy | ARB | 2.884342 |
niv2_discourse_relation_classification | ARB | 2.733177 |
niv2_irony_detection | ARB | 2.950854 |
adversarial_qa_dbidaf_answer_the_following_q | ARB | 2.684628 |
niv2_paraphrasing | ARB | 2.814512 |
niv2_code_to_text | ARB | 4.45829 |
social_i_qa_Generate_answer | ARB | 2.898766 |
stream_aqua_ii | ARB | 3.69341 |
glue_sst2_2_0_0 | ARB | 2.136184 |
niv2_sentence_expansion | ARB | 2.94565 |
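
Each row pairs an expert (a task-specific fine-tune) with its score on the ARB evaluation. A minimal sketch of loading and querying the table with the `datasets` library is shown below; the repository ID is a hypothetical placeholder, substitute the actual Hub path for this dataset.

```python
from datasets import load_dataset

# "<org>/<dataset>" is a placeholder -- replace with this dataset's Hub ID.
ds = load_dataset("<org>/<dataset>", split="train")

# Columns match the table above: expert_name (str), task_eval_on (str), score (float).
print(ds.column_names)

# Example query: the highest-scoring expert on ARB.
best = max(ds, key=lambda row: row["score"])
print(best["expert_name"], best["score"])
```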