Datasets: mteb /
Modalities: Tabular, Text · Formats: json · Libraries: Datasets, Dask
Muennighoff committed · verified
Commit 3e4c2d2 · 1 Parent(s): 7f5bdb0

Scheduled Commit

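The changed files are JSON Lines logs, one vote or chat record per line. As a minimal sketch, assuming the repository has been cloned (or the file downloaded) so that the relative path below resolves, a battle log can be read with the `datasets` library listed in the card metadata; for larger logs, Dask's `dask.dataframe.read_json(..., lines=True)` would work similarly.

```python
# Minimal sketch: load one JSONL battle log with the Hugging Face
# `datasets` library. The relative file path is an assumption taken
# from this commit's file list; adjust it to your local checkout.
from datasets import load_dataset

battles = load_dataset(
    "json",
    data_files="data/retrieval_battle-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl",
    split="train",
)

# Each record is one side-by-side vote; the fields match those visible
# in this diff (tstamp, task_type, type, models, 0_model_name, ...).
print(battles.column_names)
print(battles[0]["type"], battles[0]["0_model_name"], battles[0]["1_model_name"])
```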
data/retrieval_battle-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl CHANGED
@@ -16,3 +16,4 @@
  {"tstamp": 1722445702.9498, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "2822b7b032984b2896bc9c1b5d9477a8", "0_model_name": "GritLM/GritLM-7B", "0_prompt": "A system integrating BERT and IR techniques to enhance open-domain QA performance on Wikipedia.", "0_output": [["A system integrating BERT and IR techniques to enhance open-domain QA performance on Wikipedia.", "Title: End-to-End Open-Domain Question Answering with BERTserini\n\nAbstract: We demonstrate an end-to-end question answering system that integrates BERT with the open-source Anserini information retrieval toolkit. In contrast to most question answering and reading comprehension models today, which operate over small amounts of input text, our system integrates best practices from IR with a BERT-based reader to identify answers from a large corpus of Wikipedia articles in an end-to-end fashion. We report large improvements over previous results on a standard benchmark test collection, showing that fine-tuning pretrained BERT with SQuAD is sufficient to achieve high accuracy in identifying answer spans."]], "0_corpus": "arxiv", "1_conv_id": "e4223ef5fa88473386fbe4e0ee1c3016", "1_model_name": "sentence-transformers/all-MiniLM-L6-v2", "1_prompt": "A system integrating BERT and IR techniques to enhance open-domain QA performance on Wikipedia.", "1_output": [["A system integrating BERT and IR techniques to enhance open-domain QA performance on Wikipedia.", "Title: Detection of the Geminga pulsar with MAGIC hints at a power-law tail emission beyond 15 GeV\n\nAbstract: We report the detection of pulsed gamma-ray emission from the Geminga pulsar (PSR J0633+1746) between $15\\,$GeV and $75\\,$GeV. This is the first time a middle-aged pulsar has been detected up to these energies. Observations were carried out with the MAGIC telescopes between 2017 and 2019 using the low-energy threshold Sum-Trigger-II system. After quality selection cuts, $\\sim 80\\,$hours of observational data were used for this analysis. To compare with the emission at lower energies below the sensitivity range of MAGIC, $11$ years of Fermi-LAT data above $100\\,$MeV were also analysed. From the two pulses per rotation seen by Fermi-LAT, only the second one, P2, is detected in the MAGIC energy range, with a significance of $6.3\\,\\sigma$. The spectrum measured by MAGIC is well-represented by a simple power law of spectral index $\\Gamma= 5.62\\pm0.54$, which smoothly extends the Fermi-LAT spectrum. A joint fit to MAGIC and Fermi-LAT data rules out the existence of a sub-exponential cut-off in the combined energy range at the $3.6\\,\\sigma$ significance level. The power-law tail emission detected by MAGIC is interpreted as the transition from curvature radiation to Inverse Compton Scattering of particles accelerated in the northern outer gap."]], "1_corpus": "arxiv"}
  {"tstamp": 1722445718.3648, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "55af47c6ca9c4ac7a77ee1e484d2f648", "0_model_name": "sentence-transformers/all-MiniLM-L6-v2", "0_prompt": "Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "0_output": [["Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "Title: Is Nothing Sacred? Vacuum Energy, Supersymmetry and Lorentz Breaking from Recoiling D branes\n\nAbstract: Classical superstring vacua have zero vacuum energy and are supersymmetric and Lorentz-invariant. We argue that all these properties may be destroyed when quantum aspects of the interactions between particles and non-perturbative vacuum fluctuations are considered. A toy calculation of string/D-brane interactions using a world-sheet approach indicates that quantum recoil effects - reflecting the gravitational back-reaction on space-time foam due to the propagation of energetic particles - induce non-zero vacuum energy that is linked to supersymmetry breaking and breaks Lorentz invariance. This model of space-time foam also suggests the appearance of microscopic event horizons."]], "0_corpus": "arxiv", "1_conv_id": "8b2315872e5949a7a35825b601d43977", "1_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "1_prompt": "Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "1_output": [["Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "Title: Character-level Chinese-English Translation through ASCII Encoding\n\nAbstract: Character-level Neural Machine Translation (NMT) models have recently achieved impressive results on many language pairs. They mainly do well for Indo-European language pairs, where the languages share the same writing system. However, for translating between Chinese and English, the gap between the two different writing systems poses a major challenge because of a lack of systematic correspondence between the individual linguistic units. In this paper, we enable character-level NMT for Chinese, by breaking down Chinese characters into linguistic units similar to that of Indo-European languages. We use the Wubi encoding scheme, which preserves the original shape and semantic information of the characters, while also being reversible. We show promising results from training Wubi-based models on the character- and subword-level with recurrent as well as convolutional models."]], "1_corpus": "arxiv"}
  {"tstamp": 1722445796.0975, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "483dafda9ed54c9c80bc45216ad796db", "0_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "0_prompt": "what type of data is provided in a knowledge base", "0_output": [["what type of data is provided in a knowledge base", "W. Edwards Deming\n\nDeming continued to advise businesses large and small. From 1985 through 1989, Deming served as a consultant to Vernay Laboratories, a rubber manufacturing firm in Yellow Springs, Ohio, with fewer than 1,000 employees. He held several week-long seminars for employees and suppliers of the small company where his famous example \"Workers on the Red Beads\" spurred several major changes in Vernay's manufacturing processes.\nDeming joined the Graduate School of Business at Columbia University in 1988. In 1990, during his last year, he founded the W. Edwards Deming Center for Quality, Productivity, and Competitiveness at Columbia Business School to promote operational excellence in business through the development of research, best practices and strategic planning.\nIn 1990, Marshall Industries (NYSE:MI, 1984–1999) CEO Robert Rodin trained with the then 90-year-old Deming and his colleague Nida Backaitis. Marshall Industries' dramatic transformation and growth from $400 million to $1.8 billion in sales was chronicled in Deming's last book The New Economics, a Harvard Case Study, and Rodin's book, Free, Perfect and Now.\nIn 1993, Deming published his final book, The New Economics for Industry, Government, Education, which included the System of Profound Knowledge and the 14 Points for Management. It also contained educational concepts involving group-based teaching without grades, as well as management without individual merit or performance reviews.\nAcademic contributions\nThe philosophy of W. Edwards Deming has been summarized as follows:\nThe Deming System of Profound Knowledge"]], "0_corpus": "wikipedia", "1_conv_id": "3139be50f5064e0894343f6a22890c54", "1_model_name": "intfloat/e5-mistral-7b-instruct", "1_prompt": "what type of data is provided in a knowledge base", "1_output": [["what type of data is provided in a knowledge base", "Knowledge base\n\nIn computer science, a knowledge base (KB) is a set of sentences, each sentence given in a knowledge representation language, with interfaces to tell new sentences and to ask questions about what is known, where either of these interfaces might use inference. It is a technology used to store complex structured data used by a computer system. The initial use of the term was in connection with expert systems, which were the first knowledge-based systems.\nOriginal usage of the term"]], "1_corpus": "wikipedia"}
+ {"tstamp": 1722445834.473, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "ed8435cf63824a24a3b2372e6608f29d", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_prompt": "how to change share name in windows server 2008?", "0_output": [["how to change share name in windows server 2008?", "Superuser Stackexchange\n\nQ: How to change the administrator account name from command prompt in Windows Server 2008 Server Core? I'm trying to change the administrator account name on my virtual machine for lab work purposes. I'm running Windows Server 2008 Server Core, therefore I'm only using the standard command prompts.\nHow can I do this?\n\nA: If it domain user install Active Directory Administration module for PowerShell. \ncommand line:\npowershell Import-module ActiveDirectory\n\nrename domain user -\nPowershell:\nGet-ADUser -Identity 'Administrator' | Rename-ADObject -NewName 'Partisan'\n\ncommand line:\npowershell Get-ADUser -Identity 'Administrator' ^| Rename-ADObject -NewName 'Partisan'\n\nvar 2:\ndsquery user -name Administrator | dsmove -newname \"Partisan\"\n\nlocal administrator -\nPowershell:\nRename-LocalUser -UserName 'Administrator' -NewUserName 'Partisan'\n\ncommand line:\npowershell Rename-LocalUser -UserName 'Administrator' -NewUserName 'Partisan'\n\nvar2:\nwmic UserAccount where Name=\"Administrator\" call Rename Name=\"Partisan\"\n\n"]], "0_corpus": "stackexchange", "1_conv_id": "939f7aa837ce48d9b9700ee6346622d4", "1_model_name": "mixedbread-ai/mxbai-embed-large-v1", "1_prompt": "how to change share name in windows server 2008?", "1_output": [["how to change share name in windows server 2008?", "Serverfault Stackexchange\n\nQ: How do you configure a \\\\domain\\sharedFolder in windows 2008? Background: \n\n\n*\n\n*I've got several Windows 2008 servers setup by someone else.\n\n*They are all on one domain, lets call this mydomain\n\n*Several boxes are accessing a shared folder by going to \\\\mydomain\\share1\n\n*I found the actual files backing the share on box1\n\n*I need to move the location of this share to a larger drive for space reasons\n\n\nI'm familiar with setting up and managing shares like \\\\box1\\share1, but where do you manage/configure shares that are domain-wide; accessed like \\\\mydomain\\share1 ?\n\nA: It sounds like you have a domain based DFS configured, which is managed by the DFS Management utility.\n"]], "1_corpus": "stackexchange"}
data/retrieval_individual-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl CHANGED
@@ -38,3 +38,9 @@
  {"tstamp": 1722445710.4902, "task_type": "retrieval", "type": "chat", "model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "gen_params": {}, "start": 1722445710.3004, "finish": 1722445710.4902, "ip": "", "conv_id": "8b2315872e5949a7a35825b601d43977", "model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "prompt": "Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "output": [["Chinese text encoder with n-gram representations achieving state-of-the-art performance.", "Title: Character-level Chinese-English Translation through ASCII Encoding\n\nAbstract: Character-level Neural Machine Translation (NMT) models have recently achieved impressive results on many language pairs. They mainly do well for Indo-European language pairs, where the languages share the same writing system. However, for translating between Chinese and English, the gap between the two different writing systems poses a major challenge because of a lack of systematic correspondence between the individual linguistic units. In this paper, we enable character-level NMT for Chinese, by breaking down Chinese characters into linguistic units similar to that of Indo-European languages. We use the Wubi encoding scheme, which preserves the original shape and semantic information of the characters, while also being reversible. We show promising results from training Wubi-based models on the character- and subword-level with recurrent as well as convolutional models."]], "corpus": "arxiv"}
  {"tstamp": 1722445791.9025, "task_type": "retrieval", "type": "chat", "model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "gen_params": {}, "start": 1722445791.5804, "finish": 1722445791.9025, "ip": "", "conv_id": "483dafda9ed54c9c80bc45216ad796db", "model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "prompt": "what type of data is provided in a knowledge base", "output": [["what type of data is provided in a knowledge base", "W. Edwards Deming\n\nDeming continued to advise businesses large and small. From 1985 through 1989, Deming served as a consultant to Vernay Laboratories, a rubber manufacturing firm in Yellow Springs, Ohio, with fewer than 1,000 employees. He held several week-long seminars for employees and suppliers of the small company where his famous example \"Workers on the Red Beads\" spurred several major changes in Vernay's manufacturing processes.\nDeming joined the Graduate School of Business at Columbia University in 1988. In 1990, during his last year, he founded the W. Edwards Deming Center for Quality, Productivity, and Competitiveness at Columbia Business School to promote operational excellence in business through the development of research, best practices and strategic planning.\nIn 1990, Marshall Industries (NYSE:MI, 1984–1999) CEO Robert Rodin trained with the then 90-year-old Deming and his colleague Nida Backaitis. Marshall Industries' dramatic transformation and growth from $400 million to $1.8 billion in sales was chronicled in Deming's last book The New Economics, a Harvard Case Study, and Rodin's book, Free, Perfect and Now.\nIn 1993, Deming published his final book, The New Economics for Industry, Government, Education, which included the System of Profound Knowledge and the 14 Points for Management. It also contained educational concepts involving group-based teaching without grades, as well as management without individual merit or performance reviews.\nAcademic contributions\nThe philosophy of W. Edwards Deming has been summarized as follows:\nThe Deming System of Profound Knowledge"]], "corpus": "wikipedia"}
  {"tstamp": 1722445791.9025, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1722445791.5804, "finish": 1722445791.9025, "ip": "", "conv_id": "3139be50f5064e0894343f6a22890c54", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "what type of data is provided in a knowledge base", "output": [["what type of data is provided in a knowledge base", "Knowledge base\n\nIn computer science, a knowledge base (KB) is a set of sentences, each sentence given in a knowledge representation language, with interfaces to tell new sentences and to ask questions about what is known, where either of these interfaces might use inference. It is a technology used to store complex structured data used by a computer system. The initial use of the term was in connection with expert systems, which were the first knowledge-based systems.\nOriginal usage of the term"]], "corpus": "wikipedia"}
+ {"tstamp": 1722445805.4526, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1722445805.2162, "finish": 1722445805.4526, "ip": "", "conv_id": "ed8435cf63824a24a3b2372e6608f29d", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "how to change share name in windows server 2008?", "output": [["how to change share name in windows server 2008?", "Superuser Stackexchange\n\nQ: How to change the administrator account name from command prompt in Windows Server 2008 Server Core? I'm trying to change the administrator account name on my virtual machine for lab work purposes. I'm running Windows Server 2008 Server Core, therefore I'm only using the standard command prompts.\nHow can I do this?\n\nA: If it domain user install Active Directory Administration module for PowerShell. \ncommand line:\npowershell Import-module ActiveDirectory\n\nrename domain user -\nPowershell:\nGet-ADUser -Identity 'Administrator' | Rename-ADObject -NewName 'Partisan'\n\ncommand line:\npowershell Get-ADUser -Identity 'Administrator' ^| Rename-ADObject -NewName 'Partisan'\n\nvar 2:\ndsquery user -name Administrator | dsmove -newname \"Partisan\"\n\nlocal administrator -\nPowershell:\nRename-LocalUser -UserName 'Administrator' -NewUserName 'Partisan'\n\ncommand line:\npowershell Rename-LocalUser -UserName 'Administrator' -NewUserName 'Partisan'\n\nvar2:\nwmic UserAccount where Name=\"Administrator\" call Rename Name=\"Partisan\"\n\n"]], "corpus": "stackexchange"}
+ {"tstamp": 1722445805.4526, "task_type": "retrieval", "type": "chat", "model": "mixedbread-ai/mxbai-embed-large-v1", "gen_params": {}, "start": 1722445805.2162, "finish": 1722445805.4526, "ip": "", "conv_id": "939f7aa837ce48d9b9700ee6346622d4", "model_name": "mixedbread-ai/mxbai-embed-large-v1", "prompt": "how to change share name in windows server 2008?", "output": [["how to change share name in windows server 2008?", "Serverfault Stackexchange\n\nQ: How do you configure a \\\\domain\\sharedFolder in windows 2008? Background: \n\n\n*\n\n*I've got several Windows 2008 servers setup by someone else.\n\n*They are all on one domain, lets call this mydomain\n\n*Several boxes are accessing a shared folder by going to \\\\mydomain\\share1\n\n*I found the actual files backing the share on box1\n\n*I need to move the location of this share to a larger drive for space reasons\n\n\nI'm familiar with setting up and managing shares like \\\\box1\\share1, but where do you manage/configure shares that are domain-wide; accessed like \\\\mydomain\\share1 ?\n\nA: It sounds like you have a domain based DFS configured, which is managed by the DFS Management utility.\n"]], "corpus": "stackexchange"}
+ {"tstamp": 1722445859.671, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1722445859.5226, "finish": 1722445859.671, "ip": "", "conv_id": "44be4054380e46ddb723a2cb471d95ca", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "output": [["Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "Title: Multilingual Universal Sentence Encoder for Semantic Retrieval\n\nAbstract: We introduce two pre-trained retrieval focused multilingual sentence encoding models, respectively based on the Transformer and CNN model architectures. The models embed text from 16 languages into a single semantic space using a multi-task trained dual-encoder that learns tied representations using translation based bridge tasks (Chidambaram al., 2018). The models provide performance that is competitive with the state-of-the-art on: semantic retrieval (SR), translation pair bitext retrieval (BR) and retrieval question answering (ReQA). On English transfer learning tasks, our sentence-level embeddings approach, and in some cases exceed, the performance of monolingual, English only, sentence embedding models. Our models are made available for download on TensorFlow Hub."]], "corpus": "arxiv"}
+ {"tstamp": 1722445859.671, "task_type": "retrieval", "type": "chat", "model": "jinaai/jina-embeddings-v2-base-en", "gen_params": {}, "start": 1722445859.5226, "finish": 1722445859.671, "ip": "", "conv_id": "2ee560deb1bb478ba7cc802e6642343d", "model_name": "jinaai/jina-embeddings-v2-base-en", "prompt": "Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "output": [["Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "Title: Point-charge electrostatics in disordered alloys\n\nAbstract: A simple analytic model of point-ion electrostatics has been previously proposed in which the magnitude of the net charge q_i on each atom in an ordered or random alloy depends linearly on the number N_i^(1) of unlike neighbors in its first coordination shell. Point charges extracted from recent large supercell (256-432 atom) local density approximation (LDA) calculations of Cu-Zn random alloys now enable an assessment of the physical validity and accuracy of the simple model. We find that this model accurately describes (i) the trends in q_i vs. N_i^(1), particularly for fcc alloys, (ii) the magnitudes of total electrostatic energies in random alloys, (iii) the relationships between constant-occupation-averaged charges <q_i> and Coulomb shifts <V_i> (i.e., the average over all sites occupied by either $A$ or $B$ atoms) in the random alloy, and (iv) the linear relation between the site charge q_i and the constant- charge-averaged Coulomb shift (i.e., the average over all sites with the same charge) for fcc alloys. However, for bcc alloys the fluctuations predicted by the model in the q_i vs. V_i relation exceed those found in the LDA supercell calculations. We find that (a) the fluctuations present in the model have a vanishing contribution to the electrostatic energy. (b) Generalizing the model to include a dependence of the charge on the atoms in the first three (two) shells in bcc (fcc) - rather than the first shell only - removes the fluctuations, in complete agreement with the LDA data. We also demonstrate an efficient way to extract charge transfer parameters of the generalized model from LDA calculations on small unit cells."]], "corpus": "arxiv"}
+ {"tstamp": 1722445894.206, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1722445891.9819, "finish": 1722445894.206, "ip": "", "conv_id": "b96ff4fc7f0443eeac98061a3dfffce4", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "one flew over the cuckoo's nest why did chief pretend to be deaf?", "output": [["one flew over the cuckoo's nest why did chief pretend to be deaf?", "Scifi Stackexchange\n\nQ: What story was Jack telling when Elizabeth fell? What story was Jack telling when Elizabeth fell? Can someone provide the whole story?\n\n\nA: The scene was ad-libbed by Johnny Depp. It's a reference to The Fast Show (which Depp is a fan of), where Patrick Nice often says \"...and [something happened]...which was nice\". In one episode, Nice explicitly says this line:\n\n...and then they made me their chief... which was nice.\n\nThat said, the second movie: Dead Man's Chest involves Sparrow as the chief for the Pelegostos Tribe. According to the DVD audio commentary with Ted Elliott and Terry Rossio, this is the second time that he has visited them, with the first being the one that was referenced by this line.\nThe full story of his initial adventure with the tribe was never explained. TV Tropes would call this a noodle incident.\n"]], "corpus": "stackexchange"}
+ {"tstamp": 1722445894.206, "task_type": "retrieval", "type": "chat", "model": "GritLM/GritLM-7B", "gen_params": {}, "start": 1722445891.9819, "finish": 1722445894.206, "ip": "", "conv_id": "a78fca47acfe43b9a74361b3c222464e", "model_name": "GritLM/GritLM-7B", "prompt": "one flew over the cuckoo's nest why did chief pretend to be deaf?", "output": [["one flew over the cuckoo's nest why did chief pretend to be deaf?", "Scifi Stackexchange\n\nQ: Why did V need the police chief? In the film V for Vendetta, the following conversation takes place:\n\nFinch: Rookwood. Why didn't you come forward before? What were you waiting for?\nV: For you, Inspector. I needed you.\n\nI have tried to come up with a reason why V needed him, and have concluded that V didn't need him. Rather the author needed him (the police chief) to provide another perspective on the plot: a plot device.\nIs that reasonable?\n\nA: V talked of liberating people by giving them the truth. What better way or what more symbolic way to do that than to help the police inspector find the truth? And also, I think V hoped that the police inspector would be inspired to make the world a more just place if he knew the whole truth. V knew the world would go on once he died and someone needed to take his place in ensuring justice is served. That would be Evey with the help of the police inspector. \n"]], "corpus": "stackexchange"}
data/retrieval_side_by_side-407c4836-37e2-4f9f-8e9b-06706cc4440c.jsonl ADDED
@@ -0,0 +1 @@
+ {"tstamp": 1722445866.7856, "task_type": "retrieval", "type": "leftvote", "models": ["nomic-ai/nomic-embed-text-v1.5", "jinaai/jina-embeddings-v2-base-en"], "ip": "", "0_conv_id": "44be4054380e46ddb723a2cb471d95ca", "0_model_name": "nomic-ai/nomic-embed-text-v1.5", "0_prompt": "Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "0_output": [["Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "Title: Multilingual Universal Sentence Encoder for Semantic Retrieval\n\nAbstract: We introduce two pre-trained retrieval focused multilingual sentence encoding models, respectively based on the Transformer and CNN model architectures. The models embed text from 16 languages into a single semantic space using a multi-task trained dual-encoder that learns tied representations using translation based bridge tasks (Chidambaram al., 2018). The models provide performance that is competitive with the state-of-the-art on: semantic retrieval (SR), translation pair bitext retrieval (BR) and retrieval question answering (ReQA). On English transfer learning tasks, our sentence-level embeddings approach, and in some cases exceed, the performance of monolingual, English only, sentence embedding models. Our models are made available for download on TensorFlow Hub."]], "0_corpus": "arxiv", "1_conv_id": "2ee560deb1bb478ba7cc802e6642343d", "1_model_name": "jinaai/jina-embeddings-v2-base-en", "1_prompt": "Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "1_output": [["Recent work on embedding text from multiple languages into a common semantic space using dual-encoder models", "Title: Point-charge electrostatics in disordered alloys\n\nAbstract: A simple analytic model of point-ion electrostatics has been previously proposed in which the magnitude of the net charge q_i on each atom in an ordered or random alloy depends linearly on the number N_i^(1) of unlike neighbors in its first coordination shell. Point charges extracted from recent large supercell (256-432 atom) local density approximation (LDA) calculations of Cu-Zn random alloys now enable an assessment of the physical validity and accuracy of the simple model. We find that this model accurately describes (i) the trends in q_i vs. N_i^(1), particularly for fcc alloys, (ii) the magnitudes of total electrostatic energies in random alloys, (iii) the relationships between constant-occupation-averaged charges <q_i> and Coulomb shifts <V_i> (i.e., the average over all sites occupied by either $A$ or $B$ atoms) in the random alloy, and (iv) the linear relation between the site charge q_i and the constant- charge-averaged Coulomb shift (i.e., the average over all sites with the same charge) for fcc alloys. However, for bcc alloys the fluctuations predicted by the model in the q_i vs. V_i relation exceed those found in the LDA supercell calculations. We find that (a) the fluctuations present in the model have a vanishing contribution to the electrostatic energy. (b) Generalizing the model to include a dependence of the charge on the atoms in the first three (two) shells in bcc (fcc) - rather than the first shell only - removes the fluctuations, in complete agreement with the LDA data. We also demonstrate an efficient way to extract charge transfer parameters of the generalized model from LDA calculations on small unit cells."]], "1_corpus": "arxiv"}