Dataset: mteb / (Modalities: Tabular, Text · Formats: json · Libraries: Datasets, Dask)
Muennighoff committed · Commit a234a88 (verified) · Parent: 0080f1d

Scheduled Commit
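Both files touched by this commit are JSON Lines shards, so once downloaded they can be opened with the Datasets library listed above. A minimal sketch, assuming a local copy of the battle shard (the full repository id is not shown on this page, hence the generic "json" loader):

```python
# Minimal sketch: load the appended battle shard with the Hugging Face
# Datasets library named in the header. The path is taken from the diff
# below; assumes the file has already been downloaded locally.
from datasets import load_dataset

ds = load_dataset(
    "json",
    data_files="data/retrieval_battle-81269da8-6dc7-417a-8e0a-245c4a753482.jsonl",
    split="train",
)
print(ds.column_names)  # "tstamp", "task_type", "type", "0_model_name", ...
```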
data/retrieval_battle-81269da8-6dc7-417a-8e0a-245c4a753482.jsonl CHANGED
@@ -37,3 +37,5 @@
  {"tstamp": 1722392012.5695, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "b88f57ef3e3346fca5d5beeb5aa06f23", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_prompt": "Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "0_output": [["Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "Title: ChatGPT Asks, BLIP-2 Answers: Automatic Questioning Towards Enriched Visual Descriptions\n\nAbstract: Asking insightful questions is crucial for acquiring knowledge and expanding our understanding of the world. However, the importance of questioning has been largely overlooked in AI research, where models have been primarily developed to answer questions. With the recent advancements of large language models (LLMs) like ChatGPT, we discover their capability to ask high-quality questions when provided with a suitable prompt. This discovery presents a new opportunity to develop an automatic questioning system. In this paper, we introduce ChatCaptioner, a novel automatic-questioning method deployed in image captioning. Here, ChatGPT is prompted to ask a series of informative questions about images to BLIP-2, a strong vision question-answering model. By keeping acquiring new visual information from BLIP-2's answers, ChatCaptioner is able to generate more enriched image descriptions. We conduct human-subject evaluations on common image caption datasets such as COCO, Conceptual Caption, and WikiArt, and compare ChatCaptioner with BLIP-2 as well as ground truth. Our results demonstrate that ChatCaptioner's captions are significantly more informative, receiving three times as many votes from human evaluators for providing the most image information. Besides, ChatCaptioner identifies 53% more objects within the image than BLIP-2 alone measured by WordNet synset matching. Code is available at https://github.com/Vision-CAIR/ChatCaptioner"]], "0_corpus": "arxiv", "1_conv_id": "d17e212f553446fa97b7b59266f555ad", "1_model_name": "BM25", "1_prompt": "Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "1_output": [["Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "Title: ChatGPT Asks, BLIP-2 Answers: Automatic Questioning Towards Enriched Visual Descriptions\n\nAbstract: Asking insightful questions is crucial for acquiring knowledge and expanding our understanding of the world. However, the importance of questioning has been largely overlooked in AI research, where models have been primarily developed to answer questions. With the recent advancements of large language models (LLMs) like ChatGPT, we discover their capability to ask high-quality questions when provided with a suitable prompt. This discovery presents a new opportunity to develop an automatic questioning system. In this paper, we introduce ChatCaptioner, a novel automatic-questioning method deployed in image captioning. Here, ChatGPT is prompted to ask a series of informative questions about images to BLIP-2, a strong vision question-answering model. By keeping acquiring new visual information from BLIP-2's answers, ChatCaptioner is able to generate more enriched image descriptions. 
We conduct human-subject evaluations on common image caption datasets such as COCO, Conceptual Caption, and WikiArt, and compare ChatCaptioner with BLIP-2 as well as ground truth. Our results demonstrate that ChatCaptioner's captions are significantly more informative, receiving three times as many votes from human evaluators for providing the most image information. Besides, ChatCaptioner identifies 53% more objects within the image than BLIP-2 alone measured by WordNet synset matching. Code is available at https://github.com/Vision-CAIR/ChatCaptioner"]], "1_corpus": "arxiv"}
  {"tstamp": 1722392051.0052, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "e76d8a4d528e4709bcc107981744ebd2", "0_model_name": "BM25", "0_prompt": "Which positional encoding scheme improves Transformer performance on long sequences?", "0_output": [["Which positional encoding scheme improves Transformer performance on long sequences?", "Title: PermuteFormer: Efficient Relative Position Encoding for Long Sequences\n\nAbstract: A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies position-dependent transformation on queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks."]], "0_corpus": "arxiv", "1_conv_id": "41c62996b59b4a37b9c1cb25dd70c0dd", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "Which positional encoding scheme improves Transformer performance on long sequences?", "1_output": [["Which positional encoding scheme improves Transformer performance on long sequences?", "Title: PermuteFormer: Efficient Relative Position Encoding for Long Sequences\n\nAbstract: A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies position-dependent transformation on queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks."]], "1_corpus": "arxiv"}
  {"tstamp": 1722392088.2313, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "85225ff2b983444c8cf0212454cc1197", "0_model_name": "voyage-multilingual-2", "0_prompt": "who came up with the idea of the transcontinental railroad", "0_output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nThe Big Four Four northern California businessmen formed the Central Pacific Railroad: Leland Stanford, (1824–1893), President; Collis Potter Huntington, (1821–1900), Vice President; Mark Hopkins, (1813–1878), Treasurer; Charles Crocker, (1822–1888), Construction Supervisor. All became substantially wealthy from their association with the railroad. Judah, Marsh, Strong, Stanford, Huntington, Hopkins and Crocker, along with James Bailey and Lucius Anson Booth, became the first board of directors of the Central Pacific Railroad.\nThomas Durant\nFormer ophthalmologist Dr. Thomas Clark \"Doc\" Durant was nominally only a vice president of Union Pacific, so he installed a series of respected men like John Adams Dix as president of the railroad. While serving as vice president of Union Pacific he would be a key figure in the Crédit Mobilier scandal which ultimately led to his removal from the company.\nGrenville M. Dodge\nMajor General Grenville M. Dodge served as the chief engineer of Union Pacific during the construction of the First Transcontinental Railroad. In 1865 while fighting against Native-American tribes he would discover a pass in the Laramie Mountains, which would serve as a vital passage for the First Transcontinental Railroad. Dodge would serve in the United States House of Representatives for Iowa's 5th District from 1867 until 1869. During this time he would push for legislation to help the construction of the railroad."]], "0_corpus": "wikipedia", "1_conv_id": "927e17b7211d4670ae1eb6c0f3fd0fd3", "1_model_name": "intfloat/e5-mistral-7b-instruct", "1_prompt": "who came up with the idea of the transcontinental railroad", "1_output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nAmerica's first transcontinental railroad (known originally as the \"Pacific Railroad\" and later as the \"Overland Route\") was a continuous railroad line built between 1863 and 1869 that connected the existing eastern U.S. rail network at Council Bluffs, Iowa, with the Pacific coast at the Oakland Long Wharf on San Francisco Bay. The rail line was built by three private companies over public lands provided by extensive U.S. land grants. Building was financed by both state and U.S. government subsidy bonds as well as by company-issued mortgage bonds. The Western Pacific Railroad Company built of track from the road's western terminus at Alameda/Oakland to Sacramento, California. The Central Pacific Railroad Company of California (CPRR) constructed east from Sacramento to Promontory Summit, Utah Territory. The Union Pacific Railroad (UPRR) built from the road's eastern terminus at the Missouri River settlements of Council Bluffs and Omaha, Nebraska, westward to Promontory Summit.\nThe railroad opened for through traffic between Sacramento and Omaha on May 10, 1869, when CPRR President Leland Stanford ceremonially tapped the gold \"Last Spike\" (later often referred to as the \"Golden Spike\") with a silver hammer at Promontory Summit. In the following six months, the last leg from Sacramento to San Francisco Bay was completed. 
The resulting coast-to-coast railroad connection revolutionized the settlement and economy of the American West. It brought the western states and territories into alignment with the northern Union states and made transporting passengers and goods coast-to-coast considerably quicker, safer and less expensive.\nThe first transcontinental rail passengers arrived at the Pacific Railroad's original western terminus at the Alameda Terminal on September 6, 1869, where they transferred to the steamer Alameda for transport across the Bay to San Francisco. The road's rail terminus was moved two months later to the Oakland Long Wharf, about a mile to the north, when its expansion was completed and opened for passengers on November 8, 1869. Service between San Francisco and Oakland Pier continued to be provided by ferry."]], "1_corpus": "wikipedia"}
 
 
 
37
  {"tstamp": 1722392012.5695, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "b88f57ef3e3346fca5d5beeb5aa06f23", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_prompt": "Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "0_output": [["Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "Title: ChatGPT Asks, BLIP-2 Answers: Automatic Questioning Towards Enriched Visual Descriptions\n\nAbstract: Asking insightful questions is crucial for acquiring knowledge and expanding our understanding of the world. However, the importance of questioning has been largely overlooked in AI research, where models have been primarily developed to answer questions. With the recent advancements of large language models (LLMs) like ChatGPT, we discover their capability to ask high-quality questions when provided with a suitable prompt. This discovery presents a new opportunity to develop an automatic questioning system. In this paper, we introduce ChatCaptioner, a novel automatic-questioning method deployed in image captioning. Here, ChatGPT is prompted to ask a series of informative questions about images to BLIP-2, a strong vision question-answering model. By keeping acquiring new visual information from BLIP-2's answers, ChatCaptioner is able to generate more enriched image descriptions. We conduct human-subject evaluations on common image caption datasets such as COCO, Conceptual Caption, and WikiArt, and compare ChatCaptioner with BLIP-2 as well as ground truth. Our results demonstrate that ChatCaptioner's captions are significantly more informative, receiving three times as many votes from human evaluators for providing the most image information. Besides, ChatCaptioner identifies 53% more objects within the image than BLIP-2 alone measured by WordNet synset matching. Code is available at https://github.com/Vision-CAIR/ChatCaptioner"]], "0_corpus": "arxiv", "1_conv_id": "d17e212f553446fa97b7b59266f555ad", "1_model_name": "BM25", "1_prompt": "Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "1_output": [["Research on improving automatic image captioning by employing ChatGPT to ask questions to BLIP-2 for generating richer descriptions.", "Title: ChatGPT Asks, BLIP-2 Answers: Automatic Questioning Towards Enriched Visual Descriptions\n\nAbstract: Asking insightful questions is crucial for acquiring knowledge and expanding our understanding of the world. However, the importance of questioning has been largely overlooked in AI research, where models have been primarily developed to answer questions. With the recent advancements of large language models (LLMs) like ChatGPT, we discover their capability to ask high-quality questions when provided with a suitable prompt. This discovery presents a new opportunity to develop an automatic questioning system. In this paper, we introduce ChatCaptioner, a novel automatic-questioning method deployed in image captioning. Here, ChatGPT is prompted to ask a series of informative questions about images to BLIP-2, a strong vision question-answering model. By keeping acquiring new visual information from BLIP-2's answers, ChatCaptioner is able to generate more enriched image descriptions. 
We conduct human-subject evaluations on common image caption datasets such as COCO, Conceptual Caption, and WikiArt, and compare ChatCaptioner with BLIP-2 as well as ground truth. Our results demonstrate that ChatCaptioner's captions are significantly more informative, receiving three times as many votes from human evaluators for providing the most image information. Besides, ChatCaptioner identifies 53% more objects within the image than BLIP-2 alone measured by WordNet synset matching. Code is available at https://github.com/Vision-CAIR/ChatCaptioner"]], "1_corpus": "arxiv"}
38
  {"tstamp": 1722392051.0052, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "e76d8a4d528e4709bcc107981744ebd2", "0_model_name": "BM25", "0_prompt": "Which positional encoding scheme improves Transformer performance on long sequences?", "0_output": [["Which positional encoding scheme improves Transformer performance on long sequences?", "Title: PermuteFormer: Efficient Relative Position Encoding for Long Sequences\n\nAbstract: A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies position-dependent transformation on queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks."]], "0_corpus": "arxiv", "1_conv_id": "41c62996b59b4a37b9c1cb25dd70c0dd", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "Which positional encoding scheme improves Transformer performance on long sequences?", "1_output": [["Which positional encoding scheme improves Transformer performance on long sequences?", "Title: PermuteFormer: Efficient Relative Position Encoding for Long Sequences\n\nAbstract: A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies position-dependent transformation on queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks."]], "1_corpus": "arxiv"}
39
  {"tstamp": 1722392088.2313, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "85225ff2b983444c8cf0212454cc1197", "0_model_name": "voyage-multilingual-2", "0_prompt": "who came up with the idea of the transcontinental railroad", "0_output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nThe Big Four Four northern California businessmen formed the Central Pacific Railroad: Leland Stanford, (1824–1893), President; Collis Potter Huntington, (1821–1900), Vice President; Mark Hopkins, (1813–1878), Treasurer; Charles Crocker, (1822–1888), Construction Supervisor. All became substantially wealthy from their association with the railroad. Judah, Marsh, Strong, Stanford, Huntington, Hopkins and Crocker, along with James Bailey and Lucius Anson Booth, became the first board of directors of the Central Pacific Railroad.\nThomas Durant\nFormer ophthalmologist Dr. Thomas Clark \"Doc\" Durant was nominally only a vice president of Union Pacific, so he installed a series of respected men like John Adams Dix as president of the railroad. While serving as vice president of Union Pacific he would be a key figure in the Crédit Mobilier scandal which ultimately led to his removal from the company.\nGrenville M. Dodge\nMajor General Grenville M. Dodge served as the chief engineer of Union Pacific during the construction of the First Transcontinental Railroad. In 1865 while fighting against Native-American tribes he would discover a pass in the Laramie Mountains, which would serve as a vital passage for the First Transcontinental Railroad. Dodge would serve in the United States House of Representatives for Iowa's 5th District from 1867 until 1869. During this time he would push for legislation to help the construction of the railroad."]], "0_corpus": "wikipedia", "1_conv_id": "927e17b7211d4670ae1eb6c0f3fd0fd3", "1_model_name": "intfloat/e5-mistral-7b-instruct", "1_prompt": "who came up with the idea of the transcontinental railroad", "1_output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nAmerica's first transcontinental railroad (known originally as the \"Pacific Railroad\" and later as the \"Overland Route\") was a continuous railroad line built between 1863 and 1869 that connected the existing eastern U.S. rail network at Council Bluffs, Iowa, with the Pacific coast at the Oakland Long Wharf on San Francisco Bay. The rail line was built by three private companies over public lands provided by extensive U.S. land grants. Building was financed by both state and U.S. government subsidy bonds as well as by company-issued mortgage bonds. The Western Pacific Railroad Company built of track from the road's western terminus at Alameda/Oakland to Sacramento, California. The Central Pacific Railroad Company of California (CPRR) constructed east from Sacramento to Promontory Summit, Utah Territory. The Union Pacific Railroad (UPRR) built from the road's eastern terminus at the Missouri River settlements of Council Bluffs and Omaha, Nebraska, westward to Promontory Summit.\nThe railroad opened for through traffic between Sacramento and Omaha on May 10, 1869, when CPRR President Leland Stanford ceremonially tapped the gold \"Last Spike\" (later often referred to as the \"Golden Spike\") with a silver hammer at Promontory Summit. In the following six months, the last leg from Sacramento to San Francisco Bay was completed. 
The resulting coast-to-coast railroad connection revolutionized the settlement and economy of the American West. It brought the western states and territories into alignment with the northern Union states and made transporting passengers and goods coast-to-coast considerably quicker, safer and less expensive.\nThe first transcontinental rail passengers arrived at the Pacific Railroad's original western terminus at the Alameda Terminal on September 6, 1869, where they transferred to the steamer Alameda for transport across the Bay to San Francisco. The road's rail terminus was moved two months later to the Oakland Long Wharf, about a mile to the north, when its expansion was completed and opened for passengers on November 8, 1869. Service between San Francisco and Oakland Pier continued to be provided by ferry."]], "1_corpus": "wikipedia"}
+ {"tstamp": 1722392115.6318, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "415fb556f184488b92d5725a2fa65296", "0_model_name": "voyage-multilingual-2", "0_prompt": "Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "0_output": [["Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "Title: Attacking Open-domain Question Answering by Injecting Misinformation\n\nAbstract: With a rise in false, inaccurate, and misleading information in propaganda, news, and social media, real-world Question Answering (QA) systems face the challenges of synthesizing and reasoning over misinformation-polluted contexts to derive correct answers. This urgency gives rise to the need to make QA systems robust to misinformation, a topic previously unexplored. We study the risk of misinformation to QA models by investigating the sensitivity of open-domain QA models to corpus pollution with misinformation documents. We curate both human-written and model-generated false documents that we inject into the evidence corpus of QA models and assess the impact on the performance of these systems. Experiments show that QA models are vulnerable to even small amounts of evidence contamination brought by misinformation, with large absolute performance drops on all models. Misinformation attack brings more threat when fake documents are produced at scale by neural models or the attacker targets hacking specific questions of interest. To defend against such a threat, we discuss the necessity of building a misinformation-aware QA system that integrates question-answering and misinformation detection in a joint fashion."]], "0_corpus": "arxiv", "1_conv_id": "bf554ef594fa42659fa1dc167287f96d", "1_model_name": "BM25", "1_prompt": "Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "1_output": [["Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "Title: Attacking Open-domain Question Answering by Injecting Misinformation\n\nAbstract: With a rise in false, inaccurate, and misleading information in propaganda, news, and social media, real-world Question Answering (QA) systems face the challenges of synthesizing and reasoning over misinformation-polluted contexts to derive correct answers. This urgency gives rise to the need to make QA systems robust to misinformation, a topic previously unexplored. We study the risk of misinformation to QA models by investigating the sensitivity of open-domain QA models to corpus pollution with misinformation documents. We curate both human-written and model-generated false documents that we inject into the evidence corpus of QA models and assess the impact on the performance of these systems. Experiments show that QA models are vulnerable to even small amounts of evidence contamination brought by misinformation, with large absolute performance drops on all models. Misinformation attack brings more threat when fake documents are produced at scale by neural models or the attacker targets hacking specific questions of interest. To defend against such a threat, we discuss the necessity of building a misinformation-aware QA system that integrates question-answering and misinformation detection in a joint fashion."]], "1_corpus": "arxiv"}
+ {"tstamp": 1722392139.5942, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "13b3a212dd6045c5980c9c53a8030a19", "0_model_name": "voyage-multilingual-2", "0_prompt": "suzanne's story orange is the new black", "0_output": [["suzanne's story orange is the new black", "List of Orange Is the New Black characters\n\nWhen Yvonne \"Vee\" Parker enters the prison and forms an African-American gang, Suzanne falls for Vee's charms and maternal influence, being exploited into becoming Vee's \"muscle.\" While zealously loyal to Vee, Suzanne violently beat or threatened any inmates who crossed her, almost acting on command. Later, Vee attempts to coldly trick her into taking the fall for Red's severe beating, as a distraught Suzanne believes she may have done it unconsciously due to her violent history. During the third season, she is encouraged by the new counselor Berdie Rogers to be more creative, causing her to start writing several science fiction erotic stories that become a hit among the women in the prison. Suzanne reveals that she has no sexual experience and is completely naïve in regard to sex, having never actually had a girlfriend before, and that the stories are based on other sources. Eventually, the stories make their way to the staff, causing Rogers to get suspended. Meanwhile, Suzanne becomes nervous upon discovering that one of her fans, Maureen Kukudio, is interested in her romantically. Suzanne backs out of a possible sexual encounter, but is later seen forming a close bond with Maureen towards the end of the season."]], "0_corpus": "wikipedia", "1_conv_id": "5c91938153e6442481f259a5f71fb16e", "1_model_name": "text-embedding-3-large", "1_prompt": "suzanne's story orange is the new black", "1_output": [["suzanne's story orange is the new black", "List of Orange Is the New Black characters\n\nWhen Yvonne \"Vee\" Parker enters the prison and forms an African-American gang, Suzanne falls for Vee's charms and maternal influence, being exploited into becoming Vee's \"muscle.\" While zealously loyal to Vee, Suzanne violently beat or threatened any inmates who crossed her, almost acting on command. Later, Vee attempts to coldly trick her into taking the fall for Red's severe beating, as a distraught Suzanne believes she may have done it unconsciously due to her violent history. During the third season, she is encouraged by the new counselor Berdie Rogers to be more creative, causing her to start writing several science fiction erotic stories that become a hit among the women in the prison. Suzanne reveals that she has no sexual experience and is completely naïve in regard to sex, having never actually had a girlfriend before, and that the stories are based on other sources. Eventually, the stories make their way to the staff, causing Rogers to get suspended. Meanwhile, Suzanne becomes nervous upon discovering that one of her fans, Maureen Kukudio, is interested in her romantically. Suzanne backs out of a possible sexual encounter, but is later seen forming a close bond with Maureen towards the end of the season."]], "1_corpus": "wikipedia"}
data/retrieval_individual-81269da8-6dc7-417a-8e0a-245c4a753482.jsonl CHANGED
@@ -88,3 +88,7 @@
  {"tstamp": 1722392033.633, "task_type": "retrieval", "type": "chat", "model": "GritLM/GritLM-7B", "gen_params": {}, "start": 1722392033.4141, "finish": 1722392033.633, "ip": "", "conv_id": "41c62996b59b4a37b9c1cb25dd70c0dd", "model_name": "GritLM/GritLM-7B", "prompt": "Which positional encoding scheme improves Transformer performance on long sequences?", "output": [["Which positional encoding scheme improves Transformer performance on long sequences?", "Title: PermuteFormer: Efficient Relative Position Encoding for Long Sequences\n\nAbstract: A recent variation of Transformer, Performer, scales Transformer to longer sequences with a linear attention mechanism. However, it is not compatible with relative position encoding, which has advantages over absolute position encoding. In this paper, we discuss possible ways to add relative position encoding to Performer. Based on the analysis, we propose PermuteFormer, a Performer-based model with relative position encoding that scales linearly on long sequences. PermuteFormer applies position-dependent transformation on queries and keys to encode positional information into the attention module. This transformation is carefully crafted so that the final output of self-attention is not affected by absolute positions of tokens. PermuteFormer introduces negligible computational overhead by design that it runs as fast as Performer. We evaluate PermuteFormer on Long-Range Arena, a dataset for long sequences, as well as WikiText-103, a language modeling dataset. The experiments show that PermuteFormer uniformly improves the performance of Performer with almost no computational overhead and outperforms vanilla Transformer on most of the tasks."]], "corpus": "arxiv"}
  {"tstamp": 1722392065.3117, "task_type": "retrieval", "type": "chat", "model": "voyage-multilingual-2", "gen_params": {}, "start": 1722392064.8813, "finish": 1722392065.3117, "ip": "", "conv_id": "85225ff2b983444c8cf0212454cc1197", "model_name": "voyage-multilingual-2", "prompt": "who came up with the idea of the transcontinental railroad", "output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nThe Big Four Four northern California businessmen formed the Central Pacific Railroad: Leland Stanford, (1824–1893), President; Collis Potter Huntington, (1821–1900), Vice President; Mark Hopkins, (1813–1878), Treasurer; Charles Crocker, (1822–1888), Construction Supervisor. All became substantially wealthy from their association with the railroad. Judah, Marsh, Strong, Stanford, Huntington, Hopkins and Crocker, along with James Bailey and Lucius Anson Booth, became the first board of directors of the Central Pacific Railroad.\nThomas Durant\nFormer ophthalmologist Dr. Thomas Clark \"Doc\" Durant was nominally only a vice president of Union Pacific, so he installed a series of respected men like John Adams Dix as president of the railroad. While serving as vice president of Union Pacific he would be a key figure in the Crédit Mobilier scandal which ultimately led to his removal from the company.\nGrenville M. Dodge\nMajor General Grenville M. Dodge served as the chief engineer of Union Pacific during the construction of the First Transcontinental Railroad. In 1865 while fighting against Native-American tribes he would discover a pass in the Laramie Mountains, which would serve as a vital passage for the First Transcontinental Railroad. Dodge would serve in the United States House of Representatives for Iowa's 5th District from 1867 until 1869. During this time he would push for legislation to help the construction of the railroad."]], "corpus": "wikipedia"}
  {"tstamp": 1722392065.3117, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1722392064.8813, "finish": 1722392065.3117, "ip": "", "conv_id": "927e17b7211d4670ae1eb6c0f3fd0fd3", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "who came up with the idea of the transcontinental railroad", "output": [["who came up with the idea of the transcontinental railroad", "First transcontinental railroad\n\nAmerica's first transcontinental railroad (known originally as the \"Pacific Railroad\" and later as the \"Overland Route\") was a continuous railroad line built between 1863 and 1869 that connected the existing eastern U.S. rail network at Council Bluffs, Iowa, with the Pacific coast at the Oakland Long Wharf on San Francisco Bay. The rail line was built by three private companies over public lands provided by extensive U.S. land grants. Building was financed by both state and U.S. government subsidy bonds as well as by company-issued mortgage bonds. The Western Pacific Railroad Company built of track from the road's western terminus at Alameda/Oakland to Sacramento, California. The Central Pacific Railroad Company of California (CPRR) constructed east from Sacramento to Promontory Summit, Utah Territory. The Union Pacific Railroad (UPRR) built from the road's eastern terminus at the Missouri River settlements of Council Bluffs and Omaha, Nebraska, westward to Promontory Summit.\nThe railroad opened for through traffic between Sacramento and Omaha on May 10, 1869, when CPRR President Leland Stanford ceremonially tapped the gold \"Last Spike\" (later often referred to as the \"Golden Spike\") with a silver hammer at Promontory Summit. In the following six months, the last leg from Sacramento to San Francisco Bay was completed. The resulting coast-to-coast railroad connection revolutionized the settlement and economy of the American West. It brought the western states and territories into alignment with the northern Union states and made transporting passengers and goods coast-to-coast considerably quicker, safer and less expensive.\nThe first transcontinental rail passengers arrived at the Pacific Railroad's original western terminus at the Alameda Terminal on September 6, 1869, where they transferred to the steamer Alameda for transport across the Bay to San Francisco. The road's rail terminus was moved two months later to the Oakland Long Wharf, about a mile to the north, when its expansion was completed and opened for passengers on November 8, 1869. Service between San Francisco and Oakland Pier continued to be provided by ferry."]], "corpus": "wikipedia"}
+ {"tstamp": 1722392110.8365, "task_type": "retrieval", "type": "chat", "model": "voyage-multilingual-2", "gen_params": {}, "start": 1722392110.4204, "finish": 1722392110.8365, "ip": "", "conv_id": "415fb556f184488b92d5725a2fa65296", "model_name": "voyage-multilingual-2", "prompt": "Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "output": [["Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "Title: Attacking Open-domain Question Answering by Injecting Misinformation\n\nAbstract: With a rise in false, inaccurate, and misleading information in propaganda, news, and social media, real-world Question Answering (QA) systems face the challenges of synthesizing and reasoning over misinformation-polluted contexts to derive correct answers. This urgency gives rise to the need to make QA systems robust to misinformation, a topic previously unexplored. We study the risk of misinformation to QA models by investigating the sensitivity of open-domain QA models to corpus pollution with misinformation documents. We curate both human-written and model-generated false documents that we inject into the evidence corpus of QA models and assess the impact on the performance of these systems. Experiments show that QA models are vulnerable to even small amounts of evidence contamination brought by misinformation, with large absolute performance drops on all models. Misinformation attack brings more threat when fake documents are produced at scale by neural models or the attacker targets hacking specific questions of interest. To defend against such a threat, we discuss the necessity of building a misinformation-aware QA system that integrates question-answering and misinformation detection in a joint fashion."]], "corpus": "arxiv"}
+ {"tstamp": 1722392110.8365, "task_type": "retrieval", "type": "chat", "model": "BM25", "gen_params": {}, "start": 1722392110.4204, "finish": 1722392110.8365, "ip": "", "conv_id": "bf554ef594fa42659fa1dc167287f96d", "model_name": "BM25", "prompt": "Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "output": [["Paper analyzing the impact of misinformation generated by AI on Open-Domain Question Answering systems", "Title: Attacking Open-domain Question Answering by Injecting Misinformation\n\nAbstract: With a rise in false, inaccurate, and misleading information in propaganda, news, and social media, real-world Question Answering (QA) systems face the challenges of synthesizing and reasoning over misinformation-polluted contexts to derive correct answers. This urgency gives rise to the need to make QA systems robust to misinformation, a topic previously unexplored. We study the risk of misinformation to QA models by investigating the sensitivity of open-domain QA models to corpus pollution with misinformation documents. We curate both human-written and model-generated false documents that we inject into the evidence corpus of QA models and assess the impact on the performance of these systems. Experiments show that QA models are vulnerable to even small amounts of evidence contamination brought by misinformation, with large absolute performance drops on all models. Misinformation attack brings more threat when fake documents are produced at scale by neural models or the attacker targets hacking specific questions of interest. To defend against such a threat, we discuss the necessity of building a misinformation-aware QA system that integrates question-answering and misinformation detection in a joint fashion."]], "corpus": "arxiv"}
+ {"tstamp": 1722392126.1896, "task_type": "retrieval", "type": "chat", "model": "voyage-multilingual-2", "gen_params": {}, "start": 1722392125.597, "finish": 1722392126.1896, "ip": "", "conv_id": "13b3a212dd6045c5980c9c53a8030a19", "model_name": "voyage-multilingual-2", "prompt": "suzanne's story orange is the new black", "output": [["suzanne's story orange is the new black", "List of Orange Is the New Black characters\n\nWhen Yvonne \"Vee\" Parker enters the prison and forms an African-American gang, Suzanne falls for Vee's charms and maternal influence, being exploited into becoming Vee's \"muscle.\" While zealously loyal to Vee, Suzanne violently beat or threatened any inmates who crossed her, almost acting on command. Later, Vee attempts to coldly trick her into taking the fall for Red's severe beating, as a distraught Suzanne believes she may have done it unconsciously due to her violent history. During the third season, she is encouraged by the new counselor Berdie Rogers to be more creative, causing her to start writing several science fiction erotic stories that become a hit among the women in the prison. Suzanne reveals that she has no sexual experience and is completely naïve in regard to sex, having never actually had a girlfriend before, and that the stories are based on other sources. Eventually, the stories make their way to the staff, causing Rogers to get suspended. Meanwhile, Suzanne becomes nervous upon discovering that one of her fans, Maureen Kukudio, is interested in her romantically. Suzanne backs out of a possible sexual encounter, but is later seen forming a close bond with Maureen towards the end of the season."]], "corpus": "wikipedia"}
+ {"tstamp": 1722392126.1896, "task_type": "retrieval", "type": "chat", "model": "text-embedding-3-large", "gen_params": {}, "start": 1722392125.597, "finish": 1722392126.1896, "ip": "", "conv_id": "5c91938153e6442481f259a5f71fb16e", "model_name": "text-embedding-3-large", "prompt": "suzanne's story orange is the new black", "output": [["suzanne's story orange is the new black", "List of Orange Is the New Black characters\n\nWhen Yvonne \"Vee\" Parker enters the prison and forms an African-American gang, Suzanne falls for Vee's charms and maternal influence, being exploited into becoming Vee's \"muscle.\" While zealously loyal to Vee, Suzanne violently beat or threatened any inmates who crossed her, almost acting on command. Later, Vee attempts to coldly trick her into taking the fall for Red's severe beating, as a distraught Suzanne believes she may have done it unconsciously due to her violent history. During the third season, she is encouraged by the new counselor Berdie Rogers to be more creative, causing her to start writing several science fiction erotic stories that become a hit among the women in the prison. Suzanne reveals that she has no sexual experience and is completely naïve in regard to sex, having never actually had a girlfriend before, and that the stories are based on other sources. Eventually, the stories make their way to the staff, causing Rogers to get suspended. Meanwhile, Suzanne becomes nervous upon discovering that one of her fans, Maureen Kukudio, is interested in her romantically. Suzanne backs out of a possible sexual encounter, but is later seen forming a close bond with Maureen towards the end of the season."]], "corpus": "wikipedia"}