Muennighoff
committed on
Commit • 5e9af6f
1 Parent(s): fe4e0e2
Scheduled Commit
data/retrieval_battle-1948e9c4-e613-4487-b12e-0e988013b994.jsonl CHANGED
@@ -149,3 +149,4 @@
{"tstamp": 1722871506.4508, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "6c6458558b63453987521f4a2846e911", "0_model_name": "Salesforce/SFR-Embedding-2_R", "0_prompt": "What was the result of the French election in 2024? Why did that happen?", "0_output": [["What was the result of the French election in 2024? Why did that happen?", "2022 French presidential election\n\nPresidential elections were held in France on 10 and 24 April 2022. As no candidate won a majority in the first round, a runoff was held, in which Emmanuel Macron defeated Marine Le Pen and was re-elected as President of France. Macron, from La République En Marche! (LREM), had defeated Le Pen, leader of the National Rally, once already in the 2017 French presidential election, for the term which expired on 13 May 2022. Macron became the first president of France to win a re-election bid since Jacques Chirac won in 2002.\nIn the first round, Macron took the lead with 27.9% of votes, followed by Le Pen with 23.2%, Jean-Luc Mélenchon of La France Insoumise with 22%, and Éric Zemmour of Reconquête with 7.1%. Valérie Pécresse of The Republicans took 4.8% of the vote, and Anne Hidalgo, mayor of Paris and Socialist Party candidate, 1.8%. Both the Republicans and Socialist parties, considered to be the dominant parties until 2017, received their worst results in a presidential election."]], "0_corpus": "wikipedia", "1_conv_id": "f0cc7bedb5974dedb5b83eeb15975963", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "What was the result of the French election in 2024? Why did that happen?", "1_output": [["What was the result of the French election in 2024? Why did that happen?", "2024 French legislative election\n\nOn 9 June 2024, shortly after 21:00 CEST, Macron dissolved the National Assembly and called snap elections in a national address following projections which indicated that the L'Europe Ensemble electoral list would be significantly eclipsed by the RN in the European Parliament elections in France. In his address, he called the rise of nationalism by agitators a threat to France, Europe, and France's place in the world. He also warned that the far-right would bring about the \"impoverishment of the French people and the downfall of our country.\" The dates of the first and second rounds of elections were set for 30 June and 7 July, respectively.\nReactions to the announcement\nPoliticians\nRN leader Jordan Bardella called the large gap between the RN and L'Europe Ensemble electoral lists in the European Parliament elections a \"stinging disavowal\" of President Macron, saying that the results marked \"day 1 of the post-Macron era.\" Marine Le Pen, president of the RN group in the National Assembly, and Jean-Luc Mélenchon, leader of La France Insoumise, celebrated the election results and welcomed the announcement of snap elections.\nFormer president Nicolas Sarkozy condemned Macron, seeing his decision to dissolve parliament as a \"serious risk for the country.\" Mayor of Paris Anne Hidalgo reacted extremely negatively to Macron's decision, saying that the elections posed a serious threat to the upcoming 2024 Summer Olympics because they would not only \"spoil the mood of the whole country\" but also carry the risk of street riots and demonstrations."]], "1_corpus": "wikipedia"}
{"tstamp": 1722871990.1213, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "78284d34a4ef47e29eb0067f2420da39", "0_model_name": "text-embedding-3-large", "0_prompt": "What is KiwiTech ", "0_output": [["What is KiwiTech ", "Kiwi.com\n\nKiwi.com (previously known as skypicker.com) is a Czech online travel agency founded by Oliver Dlouhý and Jozef Képesi in 2012. Kiwi.com provides a fare aggregator, metasearch engine and booking for airline tickets and ground transportation. Its ticket search features Kiwi.com’s \"virtual interlining\" concept – itineraries combined from over 750 carriers, including many that do not usually cooperate in online bookings."]], "0_corpus": "wikipedia", "1_conv_id": "375faf9b60ee45219b9143ad1f7e78c4", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "What is KiwiTech ", "1_output": [["What is KiwiTech ", "Kiwi.com\n\nHistory The online portal skypicker.com was created in 2011. The company was founded in Brno by Oliver Dlouhý and Jozef Képesi. In 2016 the company acquired the domain name Kiwi.com for $800,000 and rebranded itself as Kiwi.com. Jiří Hlavenka was one of its first investors. In 2017, Kiwi.com acquired a stake in Jaroslav Kokolus. In 2018, Kiwi.com was included by Financial Times into the list of European firms with the top compound annual revenue growth in 2015-2018 and was titled ‘7th fastest growing company’ in EMEA according to Deloitte.\nAs of June 2019, Kiwi.com’s main shareholder is General Atlantic. Dlouhý and Képesi remain as major shareholders and continue to run the company. Other shareholders are Touzimsky Airlines and Luboš Charčenko.\nIn December 2022, Airasia partnered with Kiwi.com to transform the carrier into the ASEAN app for travel and lifestyle.\nIn 2023, Kiwi.com partnered with Bratislava Airport to assist travelers in finding destinations starting from the Slovakian hub.\nServices\nIn 2018, Kiwi.com launched NOMAD, a special multi-city travel search tool, and Tequila, a dedicated B2B platform.\nAlong with its headquarters in Brno, Kiwi.com has offices in Prague (Czech Republic), Barcelona (Spain), Bratislava (Slovakia), Košice (Slovakia), London (United Kingdom) and Miami (USA).\nThe company is presently one of the five biggest online air ticket sellers in Europe, with an annual turnover of approximately 1.1 billion euros in 2018. The company booked around 12,000 traveler itineraries daily and employed 2000 workers during 2019.\nIn 2022, Kiwi.com sold 50 billion CZK worth of tickets."]], "1_corpus": "wikipedia"}
{"tstamp": 1722872047.7203, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "fc8e1acde1f8497aae43dae2f85e1587", "0_model_name": "text-embedding-3-large", "0_prompt": "I am looking for a paper which defined transformer architecture ", "0_output": [["I am looking for a paper which defined transformer architecture ", "Title: An Introduction to Transformers\n\nAbstract: The transformer is a neural network component that can be used to learn useful representations of sequences or sets of data-points. The transformer has driven recent advances in natural language processing, computer vision, and spatio-temporal modelling. There are many introductions to transformers, but most do not contain precise mathematical descriptions of the architecture and the intuitions behind the design choices are often also missing. Moreover, as research takes a winding path, the explanations for the components of the transformer can be idiosyncratic. In this note we aim for a mathematically precise, intuitive, and clean description of the transformer architecture. We will not discuss training as this is rather standard. We assume that the reader is familiar with fundamental topics in machine learning including multi-layer perceptrons, linear transformations, softmax functions and basic probability."]], "0_corpus": "arxiv", "1_conv_id": "b59f4dc6299b49cb9815c6bfe87599cb", "1_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "1_prompt": "I am looking for a paper which defined transformer architecture ", "1_output": [["I am looking for a paper which defined transformer architecture ", "Title: An Introduction to Transformers\n\nAbstract: The transformer is a neural network component that can be used to learn useful representations of sequences or sets of data-points. The transformer has driven recent advances in natural language processing, computer vision, and spatio-temporal modelling. There are many introductions to transformers, but most do not contain precise mathematical descriptions of the architecture and the intuitions behind the design choices are often also missing. Moreover, as research takes a winding path, the explanations for the components of the transformer can be idiosyncratic. In this note we aim for a mathematically precise, intuitive, and clean description of the transformer architecture. We will not discuss training as this is rather standard. We assume that the reader is familiar with fundamental topics in machine learning including multi-layer perceptrons, linear transformations, softmax functions and basic probability."]], "1_corpus": "arxiv"}
+{"tstamp": 1722872217.9369, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "875162e88675454b899e5007c3af71b0", "0_model_name": "voyage-multilingual-2", "0_prompt": "legal medical bill", "0_output": [["legal medical bill", "Title: Natural language processing of MIMIC-III clinical notes for identifying diagnosis and procedures with neural networks\n\nAbstract: Coding diagnosis and procedures in medical records is a crucial process in the healthcare industry, which includes the creation of accurate billings, receiving reimbursements from payers, and creating standardized patient care records. In the United States, Billing and Insurance related activities cost around $471 billion in 2012 which constitutes about 25% of all the U.S hospital spending. In this paper, we report the performance of a natural language processing model that can map clinical notes to medical codes, and predict final diagnosis from unstructured entries of history of present illness, symptoms at the time of admission, etc. Previous studies have demonstrated that deep learning models perform better at such mapping when compared to conventional machine learning models. Therefore, we employed state-of-the-art deep learning method, ULMFiT on the largest emergency department clinical notes dataset MIMIC III which has 1.2M clinical notes to select for the top-10 and top-50 diagnosis and procedure codes. Our models were able to predict the top-10 diagnoses and procedures with 80.3% and 80.5% accuracy, whereas the top-50 ICD-9 codes of diagnosis and procedures are predicted with 70.7% and 63.9% accuracy. Prediction of diagnosis and procedures from unstructured clinical notes benefit human coders to save time, eliminate errors and minimize costs. With promising scores from our present model, the next step would be to deploy this on a small-scale real-world scenario and compare it with human coders as the gold standard. We believe that further research of this approach can create highly accurate predictions that can ease the workflow in a clinical setting."]], "0_corpus": "arxiv", "1_conv_id": "b07a1531d4e2469a9b60f75a7627217c", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "legal medical bill", "1_output": [["legal medical bill", "Title: Learning Bill Similarity with Annotated and Augmented Corpora of Bills\n\nAbstract: Bill writing is a critical element of representative democracy. However, it is often overlooked that most legislative bills are derived, or even directly copied, from other bills. Despite the significance of bill-to-bill linkages for understanding the legislative process, existing approaches fail to address semantic similarities across bills, let alone reordering or paraphrasing which are prevalent in legal document writing. In this paper, we overcome these limitations by proposing a 5-class classification task that closely reflects the nature of the bill generation process. In doing so, we construct a human-labeled dataset of 4,721 bill-to-bill relationships at the subsection-level and release this annotated dataset to the research community. To augment the dataset, we generate synthetic data with varying degrees of similarity, mimicking the complex bill writing process. We use BERT variants and apply multi-stage training, sequentially fine-tuning our models with synthetic and human-labeled datasets. We find that the predictive performance significantly improves when training with both human-labeled and synthetic data. Finally, we apply our trained model to infer section- and bill-level similarities. Our analysis shows that the proposed methodology successfully captures the similarities across legal documents at various levels of aggregation."]], "1_corpus": "arxiv"}
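Each line of this file is a self-contained JSON record describing one retrieval battle: a timestamp, the vote type ("leftvote", "rightvote", "tievote", or "bothbadvote"), the two model names, the shared prompt, and each side's retrieved passage with its corpus. Below is a minimal sketch of how one might load the file and tally the vote types, assuming a local copy at the path shown in the diff; the path and field names are taken directly from the records above, and nothing else is implied about the dataset's tooling.

import json
from collections import Counter

# Path as it appears in this commit; adjust to wherever the file lives locally.
path = "data/retrieval_battle-1948e9c4-e613-4487-b12e-0e988013b994.jsonl"

vote_counts = Counter()
with open(path, encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        if not line:
            continue  # skip blank lines, if any
        record = json.loads(line)
        # "type" holds the vote outcome; "0_model_name" and "1_model_name"
        # identify the two anonymised sides of the battle.
        vote_counts[record["type"]] += 1

print(vote_counts.most_common())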