Muennighoff committed on
Commit 8164e15 · verified · 1 Parent(s): 9d7439b

Scheduled Commit
data/retrieval_battle-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl CHANGED
@@ -7,3 +7,4 @@
  {"tstamp": 1722442304.08, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "c11b9d02ecdb4ac29d202f5ce92f0f64", "0_model_name": "Salesforce/SFR-Embedding-2_R", "0_prompt": "What is Venstre?", "0_output": [["What is Venstre?", "Venstre (Denmark)\n\n(V; , ), full name (), is a conservative-liberal, agrarian political party in Denmark. Founded as part of a peasants' movement against the landed aristocracy, today it espouses an economically liberal, pro-free-market ideology."]], "0_corpus": "wikipedia", "1_conv_id": "3a3a219015e944a08e96fcd2a795e1b8", "1_model_name": "BAAI/bge-large-en-v1.5", "1_prompt": "What is Venstre?", "1_output": [["What is Venstre?", "Venstre (Denmark)\n\n(V; , ), full name (), is a conservative-liberal, agrarian political party in Denmark. Founded as part of a peasants' movement against the landed aristocracy, today it espouses an economically liberal, pro-free-market ideology."]], "1_corpus": "wikipedia"}
  {"tstamp": 1722443951.9871, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "f294c32c8edf4cdb82f34b8988340f89", "0_model_name": "GritLM/GritLM-7B", "0_prompt": "Which country won the most gold medals in the olympic history", "0_output": [["Which country won the most gold medals in the olympic history", "List of Olympic teams by medals won\n\nThe United States has placed first overall in the Summer Olympic Games eighteen times, followed by the Soviet Union with six overall victories (not including the 1992 first-place finish by the Unified Team). In the Winter Olympic Games, Norway has placed first overall nine times and the Soviet Union (prior to its dissolution in 1991) had placed first seven times.\nSummer Olympics\nList by Games\nList by team\nWinter Olympics\nList by Games"]], "0_corpus": "wikipedia", "1_conv_id": "beb89db290a247ab8d2637d64c723268", "1_model_name": "Salesforce/SFR-Embedding-2_R", "1_prompt": "Which country won the most gold medals in the olympic history", "1_output": [["Which country won the most gold medals in the olympic history", "2004 Summer Olympics medal table\n\nThe 2004 Summer Olympics, officially known as the Games of the XXVIII Olympiad, were a summer multi-sport event held in Athens, the capital city of Greece, from 13 to 29 August 2004. A total of 10,625 athletes from 201 countries represented by National Olympic Committees participated in these games, competing in 301 events in 28 sports. Kiribati and Timor Leste competed for the first time in these Olympic Games.\nAthletes from 74 countries won at least one medal. The United States won the most gold medals (36), the most silver medals (40) and the most medals overall (101). China finished second on the International Olympic Committee medal table (though third in terms of total medals), the country's best performance until the 2008 Beijing Olympics, where they were hosts. Russia finished third (second in total medals), and also won the most bronze medals (38). Host nation Greece finished fifteenth, with six gold, six silver, and four bronze medals, in its best total medal haul since 1896."]], "1_corpus": "wikipedia"}
  {"tstamp": 1722444028.9194, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "fa4418617a5a491b8a7ede90483569ba", "0_model_name": "BM25", "0_prompt": "Fine-tuning BERT for optimal text classification performance on multiple datasets", "0_output": [["Fine-tuning BERT for optimal text classification performance on multiple datasets", "Title: Imbalanced Multi-label Classification for Business-related Text with Moderately Large Label Spaces\n\nAbstract: In this study, we compared the performance of four different methods for multi label text classification using a specific imbalanced business dataset. The four methods we evaluated were fine tuned BERT, Binary Relevance, Classifier Chains, and Label Powerset. The results show that fine tuned BERT outperforms the other three methods by a significant margin, achieving high values of accuracy, F1 Score, Precision, and Recall. Binary Relevance also performs well on this dataset, while Classifier Chains and Label Powerset demonstrate relatively poor performance. These findings highlight the effectiveness of fine tuned BERT for multi label text classification tasks, and suggest that it may be a useful tool for businesses seeking to analyze complex and multifaceted texts."]], "0_corpus": "arxiv", "1_conv_id": "e87880dd7cd94e34b26d6ec08910f019", "1_model_name": "text-embedding-004", "1_prompt": "Fine-tuning BERT for optimal text classification performance on multiple datasets", "1_output": [["Fine-tuning BERT for optimal text classification performance on multiple datasets", "Title: How to Fine-Tune BERT for Text Classification?\n\nAbstract: Language model pre-training has proven to be useful in learning universal language representations. As a state-of-the-art language model pre-training model, BERT (Bidirectional Encoder Representations from Transformers) has achieved amazing results in many language understanding tasks. In this paper, we conduct exhaustive experiments to investigate different fine-tuning methods of BERT on text classification task and provide a general solution for BERT fine-tuning. Finally, the proposed solution obtains new state-of-the-art results on eight widely-studied text classification datasets."]], "1_corpus": "arxiv"}
+ {"tstamp": 1722444509.5213, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "17817cacbabd4dcf9fa57b72945df25a", "0_model_name": "sentence-transformers/all-MiniLM-L6-v2", "0_prompt": "What is the antipode of heaven?", "0_output": [["What is the antipode of heaven?", "Heaven in Christianity\n\nIn Christianity, heaven is traditionally the location of the throne of God and the angels of God, and in most forms of Christianity it is the abode of the righteous dead in the afterlife. In some Christian denominations it is understood as a temporary stage before the resurrection of the dead and the saints' return to the New Earth.\nIn the Book of Acts, the resurrected Jesus ascends to heaven where, as the Nicene Creed states, he now sits at the right hand of God and will return to earth in the Second Coming. According to Catholic, Eastern Orthodox, and Oriental Orthodox teaching, Mary, mother of Jesus, is said to have been assumed into heaven without the corruption of her earthly body; she is venerated as Queen of Heaven.\nIn the Christian Bible, concepts about Christian eschatology, the future \"kingdom of heaven\", and the resurrection of the dead are found, particularly in the book of Revelation and in 1 Corinthians 15.\nDescription\nThe Bible does not provide much information on what Heaven is supposed to be like. As a result, Christian theologians are usually not very specific when describing heaven.\nThe Book of Revelation states that the New Jerusalem will be transported from Heaven to Earth, rather than people from Earth going to Heaven. The description of the gates of New Jerusalem in inspired the idea of the Pearly gates, which is the informal name for the gateway to heaven according to some Christian denominations."]], "0_corpus": "wikipedia", "1_conv_id": "2d381decf1d1476fbbd0c1c804ed3088", "1_model_name": "mixedbread-ai/mxbai-embed-large-v1", "1_prompt": "What is the antipode of heaven?", "1_output": [["What is the antipode of heaven?", "Heaven\n\nHeaven, or the heavens, is a common religious cosmological or transcendent supernatural place where beings such as deities, angels, souls, saints, or venerated ancestors are said to originate, be enthroned, or reside. According to the beliefs of some religions, heavenly beings can descend to Earth or incarnate and earthly beings can ascend to Heaven in the afterlife or, in exceptional cases, enter Heaven without dying.\nHeaven is often described as a \"highest place\", the holiest place, a Paradise, in contrast to hell or the Underworld or the \"low places\" and universally or conditionally accessible by earthly beings according to various standards of divinity, goodness, piety, faith, or other virtues or right beliefs or simply divine will. Some believe in the possibility of a heaven on Earth in a world to come."]], "1_corpus": "wikipedia"}
data/retrieval_individual-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl CHANGED
@@ -16,3 +16,5 @@
  {"tstamp": 1722443940.8847, "task_type": "retrieval", "type": "chat", "model": "Salesforce/SFR-Embedding-2_R", "gen_params": {}, "start": 1722443938.5083, "finish": 1722443940.8847, "ip": "", "conv_id": "beb89db290a247ab8d2637d64c723268", "model_name": "Salesforce/SFR-Embedding-2_R", "prompt": "Which country won the most gold medals in the olympic history", "output": [["Which country won the most gold medals in the olympic history", "2004 Summer Olympics medal table\n\nThe 2004 Summer Olympics, officially known as the Games of the XXVIII Olympiad, were a summer multi-sport event held in Athens, the capital city of Greece, from 13 to 29 August 2004. A total of 10,625 athletes from 201 countries represented by National Olympic Committees participated in these games, competing in 301 events in 28 sports. Kiribati and Timor Leste competed for the first time in these Olympic Games.\nAthletes from 74 countries won at least one medal. The United States won the most gold medals (36), the most silver medals (40) and the most medals overall (101). China finished second on the International Olympic Committee medal table (though third in terms of total medals), the country's best performance until the 2008 Beijing Olympics, where they were hosts. Russia finished third (second in total medals), and also won the most bronze medals (38). Host nation Greece finished fifteenth, with six gold, six silver, and four bronze medals, in its best total medal haul since 1896."]], "corpus": "wikipedia"}
  {"tstamp": 1722443999.2944, "task_type": "retrieval", "type": "chat", "model": "BM25", "gen_params": {}, "start": 1722443998.2791, "finish": 1722443999.2944, "ip": "", "conv_id": "fa4418617a5a491b8a7ede90483569ba", "model_name": "BM25", "prompt": "Fine-tuning BERT for optimal text classification performance on multiple datasets", "output": [["Fine-tuning BERT for optimal text classification performance on multiple datasets", "Title: Imbalanced Multi-label Classification for Business-related Text with Moderately Large Label Spaces\n\nAbstract: In this study, we compared the performance of four different methods for multi label text classification using a specific imbalanced business dataset. The four methods we evaluated were fine tuned BERT, Binary Relevance, Classifier Chains, and Label Powerset. The results show that fine tuned BERT outperforms the other three methods by a significant margin, achieving high values of accuracy, F1 Score, Precision, and Recall. Binary Relevance also performs well on this dataset, while Classifier Chains and Label Powerset demonstrate relatively poor performance. These findings highlight the effectiveness of fine tuned BERT for multi label text classification tasks, and suggest that it may be a useful tool for businesses seeking to analyze complex and multifaceted texts."]], "corpus": "arxiv"}
  {"tstamp": 1722443999.2944, "task_type": "retrieval", "type": "chat", "model": "text-embedding-004", "gen_params": {}, "start": 1722443998.2791, "finish": 1722443999.2944, "ip": "", "conv_id": "e87880dd7cd94e34b26d6ec08910f019", "model_name": "text-embedding-004", "prompt": "Fine-tuning BERT for optimal text classification performance on multiple datasets", "output": [["Fine-tuning BERT for optimal text classification performance on multiple datasets", "Title: How to Fine-Tune BERT for Text Classification?\n\nAbstract: Language model pre-training has proven to be useful in learning universal language representations. As a state-of-the-art language model pre-training model, BERT (Bidirectional Encoder Representations from Transformers) has achieved amazing results in many language understanding tasks. In this paper, we conduct exhaustive experiments to investigate different fine-tuning methods of BERT on text classification task and provide a general solution for BERT fine-tuning. Finally, the proposed solution obtains new state-of-the-art results on eight widely-studied text classification datasets."]], "corpus": "arxiv"}
+ {"tstamp": 1722444489.0388, "task_type": "retrieval", "type": "chat", "model": "sentence-transformers/all-MiniLM-L6-v2", "gen_params": {}, "start": 1722444488.784, "finish": 1722444489.0388, "ip": "", "conv_id": "17817cacbabd4dcf9fa57b72945df25a", "model_name": "sentence-transformers/all-MiniLM-L6-v2", "prompt": "What is the antipode of heaven?", "output": [["What is the antipode of heaven?", "Heaven in Christianity\n\nIn Christianity, heaven is traditionally the location of the throne of God and the angels of God, and in most forms of Christianity it is the abode of the righteous dead in the afterlife. In some Christian denominations it is understood as a temporary stage before the resurrection of the dead and the saints' return to the New Earth.\nIn the Book of Acts, the resurrected Jesus ascends to heaven where, as the Nicene Creed states, he now sits at the right hand of God and will return to earth in the Second Coming. According to Catholic, Eastern Orthodox, and Oriental Orthodox teaching, Mary, mother of Jesus, is said to have been assumed into heaven without the corruption of her earthly body; she is venerated as Queen of Heaven.\nIn the Christian Bible, concepts about Christian eschatology, the future \"kingdom of heaven\", and the resurrection of the dead are found, particularly in the book of Revelation and in 1 Corinthians 15.\nDescription\nThe Bible does not provide much information on what Heaven is supposed to be like. As a result, Christian theologians are usually not very specific when describing heaven.\nThe Book of Revelation states that the New Jerusalem will be transported from Heaven to Earth, rather than people from Earth going to Heaven. The description of the gates of New Jerusalem in inspired the idea of the Pearly gates, which is the informal name for the gateway to heaven according to some Christian denominations."]], "corpus": "wikipedia"}
+ {"tstamp": 1722444489.0388, "task_type": "retrieval", "type": "chat", "model": "mixedbread-ai/mxbai-embed-large-v1", "gen_params": {}, "start": 1722444488.784, "finish": 1722444489.0388, "ip": "", "conv_id": "2d381decf1d1476fbbd0c1c804ed3088", "model_name": "mixedbread-ai/mxbai-embed-large-v1", "prompt": "What is the antipode of heaven?", "output": [["What is the antipode of heaven?", "Heaven\n\nHeaven, or the heavens, is a common religious cosmological or transcendent supernatural place where beings such as deities, angels, souls, saints, or venerated ancestors are said to originate, be enthroned, or reside. According to the beliefs of some religions, heavenly beings can descend to Earth or incarnate and earthly beings can ascend to Heaven in the afterlife or, in exceptional cases, enter Heaven without dying.\nHeaven is often described as a \"highest place\", the holiest place, a Paradise, in contrast to hell or the Underworld or the \"low places\" and universally or conditionally accessible by earthly beings according to various standards of divinity, goodness, piety, faith, or other virtues or right beliefs or simply divine will. Some believe in the possibility of a heaven on Earth in a world to come."]], "corpus": "wikipedia"}
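Each line in these JSONL files is a standalone JSON object, so records can be read with the standard `json` module alone. A minimal sketch of parsing one battle record; the field subset below is copied from the records in this commit, and any real use would read full lines from the `.jsonl` files instead of a hard-coded string:

```python
import json

# One battle record from data/retrieval_battle-*.jsonl (fields abridged for illustration).
line = '{"tstamp": 1722444509.5213, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "0_model_name": "sentence-transformers/all-MiniLM-L6-v2", "1_model_name": "mixedbread-ai/mxbai-embed-large-v1", "0_prompt": "What is the antipode of heaven?"}'

record = json.loads(line)

# Battle records pair two models under "0_" / "1_" key prefixes and store the
# vote outcome in "type" (e.g. "tievote", "rightvote", "bothbadvote").
print(record["type"])
print(record["0_model_name"], "vs", record["1_model_name"])
```

The per-model individual records in `data/retrieval_individual-*.jsonl` use unprefixed keys (`model_name`, `prompt`, `output`, `corpus`) instead, since each line describes a single model's retrieval.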