amodaresi committed
Commit dc64ff6 · 1 Parent(s): 1410cb5

Edited readme, added books, licenses, and needle sets

README.md CHANGED
@@ -7,4 +7,75 @@ language:
  pretty_name: NoLiMa
  ---

- # NoLiMa: Long-Context Evaluation Beyond Literal Matching
+ # NoLiMa: Long-Context Evaluation Beyond Literal Matching
+
+ ## Abstract
+ > Recent large language models (LLMs) support long contexts ranging from 128K to 1M tokens. A popular method for evaluating these capabilities is the needle-in-a-haystack (NIAH) test, which involves retrieving a "needle" (relevant information) from a "haystack" (long irrelevant context). Extensions of this approach include increasing distractors, fact chaining, and in-context reasoning. However, in these benchmarks, models can exploit existing literal matches between the needle and haystack to simplify the task. To address this, we introduce **NoLiMa**, a benchmark extending NIAH with a carefully designed needle set, where questions and needles have **minimal lexical overlap, requiring models to infer latent associations to locate the needle within the haystack**. We evaluate 12 popular LLMs that claim to support contexts of at least 128K tokens. While they perform well in short contexts (<1K), performance degrades significantly as context length increases. At 32K, for instance, 10 models drop below 50\% of their strong short-length baselines. Even GPT-4o, one of the top-performing exceptions, experiences a reduction from an almost-perfect baseline of 99.3\% to 69.7\%. Our analysis suggests these declines stem from the increased difficulty the attention mechanism faces in longer contexts when literal matches are absent, making it harder to retrieve relevant information.
+
+ ## Results
+ | Models | Claimed Length | Effective Length | Base Score<br>(×0.85: Thr.) | 1K | 2K | 4K | 8K | 16K | 32K |
+ |----------------------|:-------------:|:---------------:|:-----------------------:|:---:|:---:|:---:|:---:|:---:|:---:|
+ | GPT-4o | 128K | 8K | 99.3 (84.4) | <ins>98.1</ins> | <ins>98.0</ins> | <ins>95.7</ins> | <ins>89.2</ins> | 81.6 | 69.7 |
+ | Llama 3.3 70B | 128K | 2K | 97.3 (82.7) | <ins>94.2</ins> | <ins>87.4</ins> | 81.5 | 72.1 | 59.5 | *42.7* |
+ | Llama 3.1 405B | 128K | 2K | 94.7 (80.5) | <ins>89.0</ins> | <ins>85.0</ins> | 74.5 | 60.1 | 48.4 | *38.0* |
+ | Llama 3.1 70B | 128K | 2K | 94.5 (80.3) | <ins>91.0</ins> | <ins>81.8</ins> | 71.2 | 62.7 | 51.8 | *43.2* |
+ | Gemini 1.5 Pro | 2M | 2K | 92.6 (78.7) | <ins>86.4</ins> | <ins>82.7</ins> | 75.4 | 63.9 | 55.5 | 48.2 |
+ | Jamba 1.5 Mini | 256K | <1K | 92.4 (78.6) | 76.3 | 74.1 | 70.8 | 62.2 | 52.7 | *43.6* |
+ | Command R+ | 128K | <1K | 90.9 (77.3) | 77.0 | 73.5 | 66.3 | *39.5* | *21.3* | *7.4* |
+ | Mistral Large 2 | 128K | 2K | 87.9 (74.7) | <ins>86.1</ins> | <ins>85.5</ins> | 73.3 | 51.5 | *32.6* | *18.7* |
+ | Claude 3.5 Sonnet | 200K | 4K | 87.6 (74.4) | <ins>85.4</ins> | <ins>84.0</ins> | <ins>77.6</ins> | 61.7 | 45.7 | *29.8* |
+ | Gemini 1.5 Flash | 1M | <1K | 84.7 (72.0) | 68.6 | 61.6 | 51.0 | 44.4 | *35.5* | *28.6* |
+ | GPT-4o mini | 128K | <1K | 84.9 (72.2) | 67.7 | 58.2 | 44.1 | *32.6* | *20.6* | *13.7* |
+ | Llama 3.1 8B | 128K | 1K | 76.7 (65.2) | <ins>65.7</ins> | 54.4 | 44.1 | *31.9* | *22.6* | *14.2* |
+
+ This table presents the performance results of selected models on NoLiMa tests. The **base score** represents a model's accuracy on the task at short contexts (250, 500, and 1K) and serves as a controlled reference to measure performance degradation at longer contexts.
+ The **effective length** is defined as the longest context where a model maintains at least 85% of its base score. Scores above this threshold are <ins>underlined</ins>, while scores dropping below 50% of the base score are *italicized*.
+
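+ To make the table concrete, the following is a minimal sketch (not part of the released code) of how the base score, the ×0.85 threshold, and the effective length can be derived from per-context-length accuracies. The 250- and 500-token values below are illustrative placeholders; the 1K-32K values follow the GPT-4o row above.
+
+ ```python
+ # Hypothetical accuracies for one model; 250/500 are illustrative, the rest follow the GPT-4o row.
+ scores = {250: 99.5, 500: 99.2, 1000: 98.1, 2000: 98.0, 4000: 95.7,
+           8000: 89.2, 16000: 81.6, 32000: 69.7}
+
+ base_score = sum(scores[k] for k in (250, 500, 1000)) / 3   # accuracy at short contexts
+ threshold = 0.85 * base_score                               # the "(×0.85: Thr.)" column
+
+ # Effective length: the longest context at which the model stays at or above the threshold
+ # (assumes accuracy decreases monotonically with context length, as in the table).
+ effective_length = max(k for k, v in scores.items() if v >= threshold)
+ print(base_score, threshold, effective_length)              # ≈ 98.9, ≈ 84.1, 8000
+ ```
+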
+ ### NoLiMa-Hard Results
+ | Models | Base Score | 4K | 8K | 16K | 32K |
+ |-----------------------|:---------:|:---:|:---:|:---:|:---:|
+ | **Llama 3.3 70B** | | | | | |
+ | - w/o CoT | 98.3 | 55.5 | *37.2* | *16.7* | *8.9* |
+ | - w/ CoT | 97.1 | 73.0 | 51.2 | *31.8* | *10.1* |
+ | **Reasoning Models** | | | | | |
+ | GPT-o1 | 99.9 | 92.0 | 78.0 | 60.1 | *31.1* |
+ | GPT-o3 Mini | 98.8 | 52.8 | *36.9* | *25.5* | *18.9* |
+ | DeepSeek R1-Distill-Llama-70B | 99.9 | 91.4 | 75.5 | *49.4* | *20.7* |
+
+ This table presents the performance results of selected reasoning models on **NoLiMa-Hard**, a subset of the original NoLiMa needle set containing the 10 most challenging question-needle pairs from previous evaluations.
+ Scores dropping below 50% of the base score are *italicized*.
+
+ ## Evaluation
+
+ This HuggingFace repository contains all the necessary data, including the NoLiMa needle set and the haystacks, to evaluate models on the NoLiMa benchmark.
+
+ To access the evaluation script and more information, please refer to the [NoLiMa GitHub repository]().
+
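+ If you prefer to fetch the data programmatically, a snapshot of this repository can be downloaded with `huggingface_hub`; the repository id below is a placeholder and should be replaced with this dataset's actual id.
+
+ ```python
+ from huggingface_hub import snapshot_download
+
+ # Download the full dataset snapshot (repo_id is a placeholder for this dataset's id).
+ local_dir = snapshot_download(repo_id="<namespace>/NoLiMa", repo_type="dataset")
+ print(local_dir)  # local path containing haystack/ and needlesets/
+ ```
+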
+ ## Dataset Structure
+ The dataset is structured as follows (a short loading sketch is given after the list):
+ - `haystack/` : Contains the haystack data
+   - `books.tar.gz` : Contains the books used to generate the haystacks. It can also be used to create new shuffled haystacks.
+   - `rand_shuffle/` : Contains the shuffled haystacks that were used in the evaluation.
+ - `needlesets/` : Contains the NoLiMa needle sets:
+   - `needle_set.json` : The main NoLiMa needle set.
+   - `needle_set_hard.json` : The NoLiMa-Hard needle set; a subset of the main needle set containing the 10 most challenging question-needle pairs.
+   - `needle_set_ONLYDirect.json` : The main needle set with only direct questions.
+   - `needle_set_MC.json` : The main needle set formatted as multiple-choice questions.
+   - `needle_set_w_CoT.json` : The main needle set with CoT task templates.
+   - `needle_set_w_distractor.json` : The main needle set with distractors.
+
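+ As a rough illustration of how these files fit together (the official evaluation script lives in the GitHub repository), the sketch below loads one record from `needlesets/needle_set.json` and instantiates its needle, question, and prompt. The field names follow the records in that file; placing the needle inside an actual haystack is omitted.
+
+ ```python
+ import json
+ import random
+
+ # Load the main NoLiMa needle set (assumes the repository files are available locally).
+ with open("needlesets/needle_set.json", encoding="utf-8") as f:
+     needle_set = json.load(f)
+
+ record = needle_set[0]                              # e.g. the record with id "0401"
+ character = random.choice(record["character_set"])  # value substituted for {CHAR}
+ args = record["tests"]["T15_C02"]["input_args"]     # values referenced as {1}, {2}, ...
+
+ # {1}..{n} are positional indices in the templates, so pass a dummy argument at index 0.
+ needle = record["needle"].format("", *args, CHAR=character)
+ question = record["questions"]["onehop"].format("", *args, CHAR=character)
+
+ # The needle is placed inside a book snippet (the haystack), and the prompt is built
+ # from the record's task template.
+ haystack = "... book snippet with the needle inserted ..."
+ prompt = record["task_template"].format(haystack=haystack, question=question)
+ print(needle)    # e.g. "Actually, Yuki lives next to the European Central Bank."
+ print(question)  # e.g. "Which character has been to Frankfurt?"
+ ```
+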
+ ## GitHub Repo & Paper
+ For more information about **NoLiMa**, refer to:
+
+ - 📄 Paper: "[NoLiMa: Long-Context Evaluation Beyond Literal Matching]()"
+ - 🔗 GitHub Repo: [NoLiMa GitHub repository]()
+
+ ## License
+
+ The evaluation code and needle set data are licensed under the [Adobe Research License](LICENSE). The license prohibits commercial use and allows non-commercial research use. For details about the haystack data, please refer to the [haystack/LICENSES.md](haystack/LICENSES.md) file.
+
+ ## Cite
+ If you use the **NoLiMa** dataset, filtering pipeline, or code, please cite the paper:
+ ```bibtex
+ @{}
+ ```
haystack/LICENSES.md ADDED
@@ -0,0 +1,41 @@
+ This file contains the attributions for the books used in the haystack dataset:
+
+ ##### book_1.txt:
+ "Little Brother" by Cory Doctorow, derived from the Project Gutenberg edition published on 2009-09-30, produced by GITenberg and the Free Ebook Foundation.
+ Used under the [Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License](https://creativecommons.org/licenses/by-nc-sa/3.0/). The work was converted from the epub version, with some parts removed and edited from the original GITenberg edition [(GITenberg link)](https://github.com/GITenberg/Little-Brother_30142/releases/download/0.1.0/book.epub).
+
+
+ ##### book_2.txt:
+ "ZERO SUM GAME" by SL Huang, used under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/). The work was converted from the epub version, with some parts removed and edited from the original [(Link)](https://www.unglue.it/download_ebook/2241/).
+
+
+ ##### book_3.txt:
+ "Life Blood" by Thomas Hoover, derived from the Project Gutenberg edition. Used under the [Creative Commons Attribution 3.0 Unported License](https://creativecommons.org/licenses/by/3.0/). Originally released on November 14, 2010. The work was converted from the epub version, with some parts removed and edited from the original [Project Gutenberg edition (hosted at Internet Archive)](https://archive.org/download/LifeBlood_Hoover/LifeBlood.epub).
+
+
+ ##### book_4.txt:
+ "OVER CLOCKED" by Cory Doctorow, used under the [Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License](https://creativecommons.org/licenses/by-nc-sa/2.5/). The work was converted from the epub version, with some parts removed and edited from the original [(Link)](https://ia802909.us.archive.org/28/items/overclocked/overclocked.epub).
+
+
+ ##### book_6.txt:
+ "ROOT OF UNITY" by SL Huang, used under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/). The work was converted from the epub version, with some parts removed and edited from the original [(Link)](https://unglueit-files.s3.amazonaws.com/ebf/7fcc11a806754be7957949dc3ae32391.epub).
+
+
+ ##### book_10.txt:
+ "With a Little Help" by Cory Doctorow, used under the [Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License](https://creativecommons.org/licenses/by-nc-sa/3.0/). The work was converted from the epub version, with some parts removed and edited from the original [(Link)](https://ia804603.us.archive.org/31/items/WithALittleHelp_201410/WALH.epub).
+
+
+ ##### book_11.txt:
+ "Rebecca of Sunnybrook Farm" by Kate Douglas Wiggin, derived from the Project Gutenberg edition published on 1996-04-01, produced by GITenberg and the Free Ebook Foundation. Public domain in the US. The work was converted from the epub version, with some parts removed and edited from the original GITenberg edition [(GITenberg link)](https://github.com/GITenberg/Rebecca-of-Sunnybrook-Farm_498/releases/download/0.2.1/book.epub).
+
+
+ ##### book_12.txt:
+ "The Thing Beyond Reason" by Elisabeth Sanxay Holding, derived from the Project Gutenberg edition published on February 17, 2022, produced by GITenberg and the Free Ebook Foundation. Public domain in the US. The work was converted from the epub version, with some parts removed and edited from the original GITenberg edition [(GITenberg link)](https://github.com/GITenberg/The-Thing-Beyond-Reason_67429/releases/download/0.1.0/book.epub).
+
+
+ ##### book_14.txt:
+ "IF THEN ELSE" by Barbara Fister, used under the [Creative Commons Attribution-NonCommercial 4.0 License](https://creativecommons.org/licenses/by-nc/4.0/). The work was taken from the txt version, with some parts removed and edited from the original [(Link)](https://archive.org/stream/ifthenelse/ifthenelse_djvu.txt).
+
+
+ ##### book_15.txt:
+ "A Vessel for Offering" by Darren R. Hawkins, used under the [Creative Commons Attribution 3.0 United States License](https://creativecommons.org/licenses/by/3.0/us/). The work was converted from the epub version, with some parts removed and edited from the original [(Link)](https://ia803208.us.archive.org/15/items/AVesselForOffering/AVesselForOffering.epub).
haystack/books.tar.gz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:43daa2c4c11c608743129cbd688cb5ab06245872a0551fb98d33f7a061220266
+ size 10023582
needlesets/needle_set.json ADDED
@@ -0,0 +1,238 @@
1
+ [
2
+ {
3
+ "id": "0401",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
7
+ "needle": "Actually, {CHAR} lives next to {1}.",
8
+ "questions": {
9
+ "onehop": "Which character has been to {2}?",
10
+ "twohop": "Which character has been to {3}?"
11
+ },
12
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
13
+ "tests": {
14
+ "T17_C02": {
15
+ "input_args": ["the Kiasma museum", "Helsinki", "Uusimaa"]
16
+ },
17
+ "T15_C02": {
18
+ "input_args": ["the European Central Bank", "Frankfurt", "Germany"]
19
+ },
20
+ "T16_C02": {
21
+ "input_args": ["the Semper Opera House", "Dresden", "the state of Saxony"]
22
+ }
23
+ }
24
+ },
25
+ {
26
+ "id": "0401Inv",
27
+ "reasoning_type": "world_knowledge",
28
+ "system_prompt": "",
29
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
30
+ "needle": "{1} is next to where {CHAR} lives.",
31
+ "questions": {
32
+ "onehop": "Which character has been to {2}?",
33
+ "twohop": "Which character has been to {3}?"
34
+ },
35
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
36
+ "tests": {
37
+ "T17_C02": {
38
+ "input_args": ["The Kiasma museum", "Helsinki", "Uusimaa"]
39
+ },
40
+ "T15_C02": {
41
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany"]
42
+ },
43
+ "T16_C02": {
44
+ "input_args": ["The Semper Opera House", "Dresden", "the state of Saxony"]
45
+ }
46
+ }
47
+ },
48
+ {
49
+ "id": "0402",
50
+ "reasoning_type": "commonsense_knowledge",
51
+ "system_prompt": "",
52
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
53
+ "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
54
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
55
+ "questions": {
56
+ "onehop": "Which character cannot drink {1}?",
57
+ "twohop": "Which character cannot drink {2}?"
58
+ },
59
+ "tests": {
60
+ "T01_C02": {
61
+ "input_args": ["milk", "a cappuccino"]
62
+ },
63
+ "T04_C02": {
64
+ "input_args": ["milk", "a caffè mocha"]
65
+ }
66
+ }
67
+ },
68
+ {
69
+ "id": "0402Inv",
70
+ "reasoning_type": "commonsense_knowledge",
71
+ "system_prompt": "",
72
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
73
+ "needle": "A message came in saying, \"I'm lactose intolerant,\" from {CHAR}.",
74
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
75
+ "questions": {
76
+ "onehop": "Which character cannot drink {1}?",
77
+ "twohop": "Which character cannot drink {2}?"
78
+ },
79
+ "tests": {
80
+ "T01_C02": {
81
+ "input_args": ["milk", "a cappuccino"]
82
+ },
83
+ "T04_C02": {
84
+ "input_args": ["milk", "a caffè mocha"]
85
+ }
86
+ }
87
+ },
88
+ {
89
+ "id": "0405",
90
+ "reasoning_type": "commonsense_knowledge",
91
+ "system_prompt": "",
92
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
93
+ "needle": "Then {CHAR} mentioned that he has been vegan for years.",
94
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
95
+ "questions": {
96
+ "onehop": "Which character cannot eat {1}?",
97
+ "twohop": "Which character cannot eat {2}?"
98
+ },
99
+ "tests": {
100
+ "T01_C02": {
101
+ "input_args": ["fish-based meals", "Brandade"]
102
+ },
103
+ "T04_C02": {
104
+ "input_args": ["egg-based meals", "an omelette"]
105
+ }
106
+ }
107
+ },
108
+ {
109
+ "id": "0405Inv",
110
+ "reasoning_type": "commonsense_knowledge",
111
+ "system_prompt": "",
112
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
113
+ "needle": "There was a vegan guest, named {CHAR}.",
114
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
115
+ "questions": {
116
+ "onehop": "Which character cannot eat {1}?",
117
+ "twohop": "Which character cannot eat {2}?"
118
+ },
119
+ "tests": {
120
+ "T01_C02": {
121
+ "input_args": ["fish-based meals", "Brandade"]
122
+ },
123
+ "T04_C02": {
124
+ "input_args": ["egg-based meals", "an omelette"]
125
+ }
126
+ }
127
+ },
128
+ {
129
+ "id": "0408",
130
+ "reasoning_type": "world_knowledge",
131
+ "system_prompt": "",
132
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
133
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
134
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
135
+ "questions": {
136
+ "onehop": "Which character has been to {2}?",
137
+ "twohop": "Which character has been to {3}?",
138
+ "twohop2": "Which character has been to {4}?"
139
+ },
140
+ "tests": {
141
+ "T01_C02": {
142
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
143
+ },
144
+ "T04_C02": {
145
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
146
+ },
147
+ "T05_C02": {
148
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
149
+ }
150
+ }
151
+ },
152
+ {
153
+ "id": "0408Inv",
154
+ "reasoning_type": "world_knowledge",
155
+ "system_prompt": "",
156
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
157
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
158
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
159
+ "questions": {
160
+ "onehop": "Which character has been to {2}?",
161
+ "twohop": "Which character has been to {3}?",
162
+ "twohop2": "Which character has been to {4}?"
163
+ },
164
+ "tests": {
165
+ "T01_C02": {
166
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
167
+ },
168
+ "T04_C02": {
169
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
170
+ },
171
+ "T05_C02": {
172
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
173
+ }
174
+ }
175
+ },
176
+ {
177
+ "id": "0409Inv",
178
+ "reasoning_type": "world_knowledge",
179
+ "system_prompt": "",
180
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
181
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
182
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
183
+ "questions": {
184
+ "onehop": "Which character has been to {2}?"
185
+ },
186
+ "tests": {
187
+ "T09_C02": {
188
+ "input_args": ["Witbank", "South Africa"]
189
+ },
190
+ "T10_C02": {
191
+ "input_args": ["Calvinia", "South Africa"]
192
+ },
193
+ "T04_C02": {
194
+ "input_args": ["Firminy", "France"]
195
+ },
196
+ "T05_C02": {
197
+ "input_args": ["Vierzon", "France"]
198
+ },
199
+ "T07_C02": {
200
+ "input_args": ["Borujerd", "Iran"]
201
+ },
202
+ "T08_C02": {
203
+ "input_args": ["Lahijan", "Iran"]
204
+ }
205
+ }
206
+ },
207
+ {
208
+ "id": "0409",
209
+ "reasoning_type": "world_knowledge",
210
+ "system_prompt": "",
211
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
212
+ "needle": "There was {CHAR} who was an engineer living in {1}.",
213
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
214
+ "questions": {
215
+ "onehop": "Which character has been to {2}?"
216
+ },
217
+ "tests": {
218
+ "T09_C02": {
219
+ "input_args": ["Witbank", "South Africa"]
220
+ },
221
+ "T10_C02": {
222
+ "input_args": ["Calvinia", "South Africa"]
223
+ },
224
+ "T04_C02": {
225
+ "input_args": ["Firminy", "France"]
226
+ },
227
+ "T05_C02": {
228
+ "input_args": ["Vierzon", "France"]
229
+ },
230
+ "T07_C02": {
231
+ "input_args": ["Borujerd", "Iran"]
232
+ },
233
+ "T08_C02": {
234
+ "input_args": ["Lahijan", "Iran"]
235
+ }
236
+ }
237
+ }
238
+ ]
needlesets/needle_set_MC.json ADDED
@@ -0,0 +1,238 @@
1
+ [
2
+ {
3
+ "id": "0401",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
7
+ "needle": "Actually, {CHAR} lives next to {1}.",
8
+ "questions": {
9
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?",
10
+ "twohop": "Which character has been to {3}? Alex, Sarah, {CHAR}, or Rebecca?"
11
+ },
12
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
13
+ "tests": {
14
+ "T17_C02": {
15
+ "input_args": ["the Kiasma museum", "Helsinki", "Uusimaa"]
16
+ },
17
+ "T15_C02": {
18
+ "input_args": ["the European Central Bank", "Frankfurt", "Germany"]
19
+ },
20
+ "T16_C02": {
21
+ "input_args": ["the Semper Opera House", "Dresden", "the state of Saxony"]
22
+ }
23
+ }
24
+ },
25
+ {
26
+ "id": "0401Inv",
27
+ "reasoning_type": "world_knowledge",
28
+ "system_prompt": "",
29
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
30
+ "needle": "{1} is next to where {CHAR} lives.",
31
+ "questions": {
32
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?",
33
+ "twohop": "Which character has been to {3}? Alex, Sarah, {CHAR}, or Rebecca?"
34
+ },
35
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
36
+ "tests": {
37
+ "T17_C02": {
38
+ "input_args": ["The Kiasma museum", "Helsinki", "Uusimaa"]
39
+ },
40
+ "T15_C02": {
41
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany"]
42
+ },
43
+ "T16_C02": {
44
+ "input_args": ["The Semper Opera House", "Dresden", "the state of Saxony"]
45
+ }
46
+ }
47
+ },
48
+ {
49
+ "id": "0402",
50
+ "reasoning_type": "commonsense_knowledge",
51
+ "system_prompt": "",
52
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
53
+ "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
54
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
55
+ "questions": {
56
+ "onehop": "Which character cannot drink {1}? Alex, Sarah, {CHAR}, or Rebecca?",
57
+ "twohop": "Which character cannot drink {2}? Alex, Sarah, {CHAR}, or Rebecca?"
58
+ },
59
+ "tests": {
60
+ "T01_C02": {
61
+ "input_args": ["milk", "a cappuccino"]
62
+ },
63
+ "T04_C02": {
64
+ "input_args": ["milk", "a caffè mocha"]
65
+ }
66
+ }
67
+ },
68
+ {
69
+ "id": "0402Inv",
70
+ "reasoning_type": "commonsense_knowledge",
71
+ "system_prompt": "",
72
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
73
+ "needle": "A message came in saying, \"I'm lactose intolerant,\" from {CHAR}.",
74
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
75
+ "questions": {
76
+ "onehop": "Which character cannot drink {1}? Alex, Sarah, {CHAR}, or Rebecca?",
77
+ "twohop": "Which character cannot drink {2}? Alex, Sarah, {CHAR}, or Rebecca?"
78
+ },
79
+ "tests": {
80
+ "T01_C02": {
81
+ "input_args": ["milk", "a cappuccino"]
82
+ },
83
+ "T04_C02": {
84
+ "input_args": ["milk", "a caffè mocha"]
85
+ }
86
+ }
87
+ },
88
+ {
89
+ "id": "0405",
90
+ "reasoning_type": "commonsense_knowledge",
91
+ "system_prompt": "",
92
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
93
+ "needle": "Then {CHAR} mentioned that he has been vegan for years.",
94
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
95
+ "questions": {
96
+ "onehop": "Which character cannot eat {1}? Alex, Sarah, {CHAR}, or Rebecca?",
97
+ "twohop": "Which character cannot eat {2}? Alex, Sarah, {CHAR}, or Rebecca?"
98
+ },
99
+ "tests": {
100
+ "T01_C02": {
101
+ "input_args": ["fish-based meals", "Brandade"]
102
+ },
103
+ "T04_C02": {
104
+ "input_args": ["egg-based meals", "an omelette"]
105
+ }
106
+ }
107
+ },
108
+ {
109
+ "id": "0405Inv",
110
+ "reasoning_type": "commonsense_knowledge",
111
+ "system_prompt": "",
112
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
113
+ "needle": "There was a vegan guest, named {CHAR}.",
114
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
115
+ "questions": {
116
+ "onehop": "Which character cannot eat {1}? Alex, Sarah, {CHAR}, or Rebecca?",
117
+ "twohop": "Which character cannot eat {2}? Alex, Sarah, {CHAR}, or Rebecca?"
118
+ },
119
+ "tests": {
120
+ "T01_C02": {
121
+ "input_args": ["fish-based meals", "Brandade"]
122
+ },
123
+ "T04_C02": {
124
+ "input_args": ["egg-based meals", "an omelette"]
125
+ }
126
+ }
127
+ },
128
+ {
129
+ "id": "0408",
130
+ "reasoning_type": "world_knowledge",
131
+ "system_prompt": "",
132
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
133
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
134
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
135
+ "questions": {
136
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?",
137
+ "twohop": "Which character has been to {3}? Alex, Sarah, {CHAR}, or Rebecca?",
138
+ "twohop2": "Which character has been to {4}? Alex, Sarah, {CHAR}, or Rebecca?"
139
+ },
140
+ "tests": {
141
+ "T01_C02": {
142
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
143
+ },
144
+ "T04_C02": {
145
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
146
+ },
147
+ "T05_C02": {
148
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
149
+ }
150
+ }
151
+ },
152
+ {
153
+ "id": "0408Inv",
154
+ "reasoning_type": "world_knowledge",
155
+ "system_prompt": "",
156
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
157
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
158
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
159
+ "questions": {
160
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?",
161
+ "twohop": "Which character has been to {3}? Alex, Sarah, {CHAR}, or Rebecca?",
162
+ "twohop2": "Which character has been to {4}? Alex, Sarah, {CHAR}, or Rebecca?"
163
+ },
164
+ "tests": {
165
+ "T01_C02": {
166
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
167
+ },
168
+ "T04_C02": {
169
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
170
+ },
171
+ "T05_C02": {
172
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
173
+ }
174
+ }
175
+ },
176
+ {
177
+ "id": "0409Inv",
178
+ "reasoning_type": "world_knowledge",
179
+ "system_prompt": "",
180
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
181
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
182
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
183
+ "questions": {
184
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?"
185
+ },
186
+ "tests": {
187
+ "T09_C02": {
188
+ "input_args": ["Witbank", "South Africa"]
189
+ },
190
+ "T10_C02": {
191
+ "input_args": ["Calvinia", "South Africa"]
192
+ },
193
+ "T04_C02": {
194
+ "input_args": ["Firminy", "France"]
195
+ },
196
+ "T05_C02": {
197
+ "input_args": ["Vierzon", "France"]
198
+ },
199
+ "T07_C02": {
200
+ "input_args": ["Borujerd", "Iran"]
201
+ },
202
+ "T08_C02": {
203
+ "input_args": ["Lahijan", "Iran"]
204
+ }
205
+ }
206
+ },
207
+ {
208
+ "id": "0409",
209
+ "reasoning_type": "world_knowledge",
210
+ "system_prompt": "",
211
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
212
+ "needle": "There was {CHAR} who was an engineer living in {1}.",
213
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
214
+ "questions": {
215
+ "onehop": "Which character has been to {2}? Alex, Sarah, {CHAR}, or Rebecca?"
216
+ },
217
+ "tests": {
218
+ "T09_C02": {
219
+ "input_args": ["Witbank", "South Africa"]
220
+ },
221
+ "T10_C02": {
222
+ "input_args": ["Calvinia", "South Africa"]
223
+ },
224
+ "T04_C02": {
225
+ "input_args": ["Firminy", "France"]
226
+ },
227
+ "T05_C02": {
228
+ "input_args": ["Vierzon", "France"]
229
+ },
230
+ "T07_C02": {
231
+ "input_args": ["Borujerd", "Iran"]
232
+ },
233
+ "T08_C02": {
234
+ "input_args": ["Lahijan", "Iran"]
235
+ }
236
+ }
237
+ }
238
+ ]
needlesets/needle_set_ONLYDirect.json ADDED
@@ -0,0 +1,216 @@
1
+ [
2
+ {
3
+ "id": "0401",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
7
+ "needle": "Actually, {CHAR} lives next to {1}.",
8
+ "questions": {
9
+ "direct": "Which character lives next to {1}?"
10
+ },
11
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
12
+ "tests": {
13
+ "T17_C02": {
14
+ "input_args": ["the Kiasma museum", "Helsinki", "Uusimaa"]
15
+ },
16
+ "T15_C02": {
17
+ "input_args": ["the European Central Bank", "Frankfurt", "Germany"]
18
+ },
19
+ "T16_C02": {
20
+ "input_args": ["the Semper Opera House", "Dresden", "the state of Saxony"]
21
+ }
22
+ }
23
+ },
24
+ {
25
+ "id": "0401Inv",
26
+ "reasoning_type": "world_knowledge",
27
+ "system_prompt": "",
28
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
29
+ "needle": "{1} is next to where {CHAR} lives.",
30
+ "questions": {
31
+ "direct": "Which character lives next to {4}?"
32
+ },
33
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
34
+ "tests": {
35
+ "T17_C02": {
36
+ "input_args": ["The Kiasma museum", "Helsinki", "Uusimaa", "the Kiasma museum"]
37
+ },
38
+ "T15_C02": {
39
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany", "the European Central Bank"]
40
+ },
41
+ "T16_C02": {
42
+ "input_args": ["The Semper Opera House", "Dresden", "the state of Saxony", "the Semper Opera House"]
43
+ }
44
+ }
45
+ },
46
+ {
47
+ "id": "0402",
48
+ "reasoning_type": "commonsense_knowledge",
49
+ "system_prompt": "",
50
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
51
+ "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
52
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
53
+ "questions": {
54
+ "direct": "Which character is lactose intolerant?"
55
+ },
56
+ "tests": {
57
+ "T01_C02": {
58
+ "input_args": ["milk", "a cappuccino"]
59
+ }
60
+ }
61
+ },
62
+ {
63
+ "id": "0402Inv",
64
+ "reasoning_type": "commonsense_knowledge",
65
+ "system_prompt": "",
66
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
67
+ "needle": "A message came in saying, \"I'm lactose intolerant,\" from {CHAR}.",
68
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
69
+ "questions": {
70
+ "direct": "Which character is lactose intolerant?"
71
+ },
72
+ "tests": {
73
+ "T01_C02": {
74
+ "input_args": ["milk", "a cappuccino"]
75
+ }
76
+ }
77
+ },
78
+ {
79
+ "id": "0405",
80
+ "reasoning_type": "commonsense_knowledge",
81
+ "system_prompt": "",
82
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
83
+ "needle": "Then {CHAR} mentioned that he has been vegan for years.",
84
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
85
+ "questions": {
86
+ "direct": "Which character is vegan?"
87
+ },
88
+ "tests": {
89
+ "T01_C02": {
90
+ "input_args": ["fish-based meals", "Brandade"]
91
+ }
92
+ }
93
+ },
94
+ {
95
+ "id": "0405Inv",
96
+ "reasoning_type": "commonsense_knowledge",
97
+ "system_prompt": "",
98
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
99
+ "needle": "There was a vegan guest, named {CHAR}.",
100
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
101
+ "questions": {
102
+ "direct": "Which character is vegan?"
103
+ },
104
+ "tests": {
105
+ "T01_C02": {
106
+ "input_args": ["fish-based meals", "Brandade"]
107
+ }
108
+ }
109
+ },
110
+ {
111
+ "id": "0408",
112
+ "reasoning_type": "world_knowledge",
113
+ "system_prompt": "",
114
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
115
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
116
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
117
+ "questions": {
118
+ "direct": "Which character has seen the original {1} painting?"
119
+ },
120
+ "tests": {
121
+ "T01_C02": {
122
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
123
+ },
124
+ "T04_C02": {
125
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
126
+ },
127
+ "T05_C02": {
128
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
129
+ }
130
+ }
131
+ },
132
+ {
133
+ "id": "0408Inv",
134
+ "reasoning_type": "world_knowledge",
135
+ "system_prompt": "",
136
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
137
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
138
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
139
+ "questions": {
140
+ "direct": "Which character has seen the original {1} painting?"
141
+ },
142
+ "tests": {
143
+ "T01_C02": {
144
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
145
+ },
146
+ "T04_C02": {
147
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
148
+ },
149
+ "T05_C02": {
150
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
151
+ }
152
+ }
153
+ },
154
+ {
155
+ "id": "0409Inv",
156
+ "reasoning_type": "world_knowledge",
157
+ "system_prompt": "",
158
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
159
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
160
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
161
+ "questions": {
162
+ "direct": "Which character has been to {1}?"
163
+ },
164
+ "tests": {
165
+ "T09_C02": {
166
+ "input_args": ["Witbank", "South Africa"]
167
+ },
168
+ "T10_C02": {
169
+ "input_args": ["Calvinia", "South Africa"]
170
+ },
171
+ "T04_C02": {
172
+ "input_args": ["Firminy", "France"]
173
+ },
174
+ "T05_C02": {
175
+ "input_args": ["Vierzon", "France"]
176
+ },
177
+ "T07_C02": {
178
+ "input_args": ["Borujerd", "Iran"]
179
+ },
180
+ "T08_C02": {
181
+ "input_args": ["Lahijan", "Iran"]
182
+ }
183
+ }
184
+ },
185
+ {
186
+ "id": "0409",
187
+ "reasoning_type": "world_knowledge",
188
+ "system_prompt": "",
189
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
190
+ "needle": "There was {CHAR} who was an engineer living in {1}.",
191
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
192
+ "questions": {
193
+ "direct": "Which character has been to {1}?"
194
+ },
195
+ "tests": {
196
+ "T09_C02": {
197
+ "input_args": ["Witbank", "South Africa"]
198
+ },
199
+ "T10_C02": {
200
+ "input_args": ["Calvinia", "South Africa"]
201
+ },
202
+ "T04_C02": {
203
+ "input_args": ["Firminy", "France"]
204
+ },
205
+ "T05_C02": {
206
+ "input_args": ["Vierzon", "France"]
207
+ },
208
+ "T07_C02": {
209
+ "input_args": ["Borujerd", "Iran"]
210
+ },
211
+ "T08_C02": {
212
+ "input_args": ["Lahijan", "Iran"]
213
+ }
214
+ }
215
+ }
216
+ ]
needlesets/needle_set_hard.json ADDED
@@ -0,0 +1,79 @@
1
+ [
2
+ {
3
+ "id": "0401Inv",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
7
+ "needle": "{1} is next to where {CHAR} lives.",
8
+ "questions": {
9
+ "twohop": "Which character has been to {3}?"
10
+ },
11
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
12
+ "tests": {
13
+ "T15_C02": {
14
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany"]
15
+ }
16
+ }
17
+ },
18
+ {
19
+ "id": "0408",
20
+ "reasoning_type": "world_knowledge",
21
+ "system_prompt": "",
22
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
23
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
24
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
25
+ "questions": {
26
+ "twohop": "Which character has been to {3}?"
27
+ },
28
+ "tests": {
29
+ "T01_C02": {
30
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
31
+ },
32
+ "T04_C02": {
33
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
34
+ },
35
+ "T05_C02": {
36
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
37
+ }
38
+ }
39
+ },
40
+ {
41
+ "id": "0408Inv",
42
+ "reasoning_type": "world_knowledge",
43
+ "system_prompt": "",
44
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
45
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
46
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
47
+ "questions": {
48
+ "twohop": "Which character has been to {3}?",
49
+ "twohop2": "Which character has been to {4}?"
50
+ },
51
+ "tests": {
52
+ "T04_C02": {
53
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
54
+ },
55
+ "T05_C02": {
56
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
57
+ }
58
+ }
59
+ },
60
+ {
61
+ "id": "0409Inv",
62
+ "reasoning_type": "world_knowledge",
63
+ "system_prompt": "",
64
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
65
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
66
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
67
+ "questions": {
68
+ "onehop": "Which character has been to {2}?"
69
+ },
70
+ "tests": {
71
+ "T10_C02": {
72
+ "input_args": ["Calvinia", "South Africa"]
73
+ },
74
+ "T04_C02": {
75
+ "input_args": ["Firminy", "France"]
76
+ }
77
+ }
78
+ }
79
+ ]
needlesets/needle_set_w_CoT.json ADDED
@@ -0,0 +1,238 @@
1
+ [
2
+ {
3
+ "id": "0401",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
7
+ "needle": "Actually, {CHAR} lives next to {1}.",
8
+ "questions": {
9
+ "onehop": "Which character has been to {2}?",
10
+ "twohop": "Which character has been to {3}?"
11
+ },
12
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
13
+ "tests": {
14
+ "T17_C02": {
15
+ "input_args": ["the Kiasma museum", "Helsinki", "Uusimaa"]
16
+ },
17
+ "T15_C02": {
18
+ "input_args": ["the European Central Bank", "Frankfurt", "Germany"]
19
+ },
20
+ "T16_C02": {
21
+ "input_args": ["the Semper Opera House", "Dresden", "the state of Saxony"]
22
+ }
23
+ }
24
+ },
25
+ {
26
+ "id": "0401Inv",
27
+ "reasoning_type": "world_knowledge",
28
+ "system_prompt": "",
29
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
30
+ "needle": "{1} is next to where {CHAR} lives.",
31
+ "questions": {
32
+ "onehop": "Which character has been to {2}?",
33
+ "twohop": "Which character has been to {3}?"
34
+ },
35
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
36
+ "tests": {
37
+ "T17_C02": {
38
+ "input_args": ["The Kiasma museum", "Helsinki", "Uusimaa"]
39
+ },
40
+ "T15_C02": {
41
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany"]
42
+ },
43
+ "T16_C02": {
44
+ "input_args": ["The Semper Opera House", "Dresden", "the state of Saxony"]
45
+ }
46
+ }
47
+ },
48
+ {
49
+ "id": "0402",
50
+ "reasoning_type": "commonsense_knowledge",
51
+ "system_prompt": "",
52
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
53
+ "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
54
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
55
+ "questions": {
56
+ "onehop": "Which character cannot drink {1}?",
57
+ "twohop": "Which character cannot drink {2}?"
58
+ },
59
+ "tests": {
60
+ "T01_C02": {
61
+ "input_args": ["milk", "a cappuccino"]
62
+ },
63
+ "T04_C02": {
64
+ "input_args": ["milk", "a caffè mocha"]
65
+ }
66
+ }
67
+ },
68
+ {
69
+ "id": "0402Inv",
70
+ "reasoning_type": "commonsense_knowledge",
71
+ "system_prompt": "",
72
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
73
+ "needle": "A message came in saying, \"I'm lactose intolerant,\" from {CHAR}.",
74
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
75
+ "questions": {
76
+ "onehop": "Which character cannot drink {1}?",
77
+ "twohop": "Which character cannot drink {2}?"
78
+ },
79
+ "tests": {
80
+ "T01_C02": {
81
+ "input_args": ["milk", "a cappuccino"]
82
+ },
83
+ "T04_C02": {
84
+ "input_args": ["milk", "a caffè mocha"]
85
+ }
86
+ }
87
+ },
88
+ {
89
+ "id": "0405",
90
+ "reasoning_type": "commonsense_knowledge",
91
+ "system_prompt": "",
92
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
93
+ "needle": "Then {CHAR} mentioned that he has been vegan for years.",
94
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
95
+ "questions": {
96
+ "onehop": "Which character cannot eat {1}?",
97
+ "twohop": "Which character cannot eat {2}?"
98
+ },
99
+ "tests": {
100
+ "T01_C02": {
101
+ "input_args": ["fish-based meals", "Brandade"]
102
+ },
103
+ "T04_C02": {
104
+ "input_args": ["egg-based meals", "an omelette"]
105
+ }
106
+ }
107
+ },
108
+ {
109
+ "id": "0405Inv",
110
+ "reasoning_type": "commonsense_knowledge",
111
+ "system_prompt": "",
112
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
113
+ "needle": "There was a vegan guest, named {CHAR}.",
114
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
115
+ "questions": {
116
+ "onehop": "Which character cannot eat {1}?",
117
+ "twohop": "Which character cannot eat {2}?"
118
+ },
119
+ "tests": {
120
+ "T01_C02": {
121
+ "input_args": ["fish-based meals", "Brandade"]
122
+ },
123
+ "T04_C02": {
124
+ "input_args": ["egg-based meals", "an omelette"]
125
+ }
126
+ }
127
+ },
128
+ {
129
+ "id": "0408",
130
+ "reasoning_type": "world_knowledge",
131
+ "system_prompt": "",
132
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
133
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
134
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
135
+ "questions": {
136
+ "onehop": "Which character has been to {2}?",
137
+ "twohop": "Which character has been to {3}?",
138
+ "twohop2": "Which character has been to {4}?"
139
+ },
140
+ "tests": {
141
+ "T01_C02": {
142
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
143
+ },
144
+ "T04_C02": {
145
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
146
+ },
147
+ "T05_C02": {
148
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
149
+ }
150
+ }
151
+ },
152
+ {
153
+ "id": "0408Inv",
154
+ "reasoning_type": "world_knowledge",
155
+ "system_prompt": "",
156
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
157
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
158
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
159
+ "questions": {
160
+ "onehop": "Which character has been to {2}?",
161
+ "twohop": "Which character has been to {3}?",
162
+ "twohop2": "Which character has been to {4}?"
163
+ },
164
+ "tests": {
165
+ "T01_C02": {
166
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
167
+ },
168
+ "T04_C02": {
169
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
170
+ },
171
+ "T05_C02": {
172
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
173
+ }
174
+ }
175
+ },
176
+ {
177
+ "id": "0409Inv",
178
+ "reasoning_type": "world_knowledge",
179
+ "system_prompt": "",
180
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
181
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
182
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
183
+ "questions": {
184
+ "onehop": "Which character has been to {2}?"
185
+ },
186
+ "tests": {
187
+ "T09_C02": {
188
+ "input_args": ["Witbank", "South Africa"]
189
+ },
190
+ "T10_C02": {
191
+ "input_args": ["Calvinia", "South Africa"]
192
+ },
193
+ "T04_C02": {
194
+ "input_args": ["Firminy", "France"]
195
+ },
196
+ "T05_C02": {
197
+ "input_args": ["Vierzon", "France"]
198
+ },
199
+ "T07_C02": {
200
+ "input_args": ["Borujerd", "Iran"]
201
+ },
202
+ "T08_C02": {
203
+ "input_args": ["Lahijan", "Iran"]
204
+ }
205
+ }
206
+ },
207
+ {
208
+ "id": "0409",
209
+ "reasoning_type": "world_knowledge",
210
+ "system_prompt": "",
211
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Be aware that some details may not be stated directly, and you may need to INFER the answer based on the given information. Begin with a brief explanation of your reasoning in NO MORE THAN THREE (3) sentences. Then, return the final answer on a new line.\n\nQuestion: {question}",
212
+ "needle": "There was {CHAR} who was an engineer living in {1}.",
213
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
214
+ "questions": {
215
+ "onehop": "Which character has been to {2}?"
216
+ },
217
+ "tests": {
218
+ "T09_C02": {
219
+ "input_args": ["Witbank", "South Africa"]
220
+ },
221
+ "T10_C02": {
222
+ "input_args": ["Calvinia", "South Africa"]
223
+ },
224
+ "T04_C02": {
225
+ "input_args": ["Firminy", "France"]
226
+ },
227
+ "T05_C02": {
228
+ "input_args": ["Vierzon", "France"]
229
+ },
230
+ "T07_C02": {
231
+ "input_args": ["Borujerd", "Iran"]
232
+ },
233
+ "T08_C02": {
234
+ "input_args": ["Lahijan", "Iran"]
235
+ }
236
+ }
237
+ }
238
+ ]
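Each entry above is a template: `{CHAR}` is filled with a name from `character_set`, and the numbered placeholders (`{1}`, `{2}`, …) appear to map, 1-indexed, onto a test's `input_args`. The snippet below is a minimal sketch of that instantiation step, using the "0402" entry shown above; it is illustrative only (not the official evaluation code), and the helper name `fill_placeholders` is hypothetical.

```python
# Minimal sketch (not the official NoLiMa evaluation code): how a needle entry
# such as "0402" could be instantiated. Placeholder semantics are inferred from
# the data: {CHAR} takes a name from character_set, {N} takes input_args[N-1].
import random

def fill_placeholders(template: str, character: str, input_args: list[str]) -> str:
    """Replace {CHAR} with a character name and {N} with the N-th input argument."""
    text = template.replace("{CHAR}", character)
    for i, arg in enumerate(input_args, start=1):
        text = text.replace("{" + str(i) + "}", arg)
    return text

# Entry "0402" (commonsense_knowledge), copied from the needle set above.
entry = {
    "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
    "character_set": ["Yuki", "Stuart", "Katie", "Veronica", "Gary",
                      "Megan", "Calvin", "Mandy", "Diana", "Caleb"],
    "questions": {
        "onehop": "Which character cannot drink {1}?",
        "twohop": "Which character cannot drink {2}?",
    },
    "tests": {"T01_C02": {"input_args": ["milk", "a cappuccino"]}},
}

character = random.choice(entry["character_set"])
args = entry["tests"]["T01_C02"]["input_args"]

needle = fill_placeholders(entry["needle"], character, args)
onehop = fill_placeholders(entry["questions"]["onehop"], character, args)
twohop = fill_placeholders(entry["questions"]["twohop"], character, args)

# e.g. needle -> 'A message came in from Katie saying, "I\'m lactose intolerant," and nothing more.'
#      onehop -> 'Which character cannot drink milk?'
#      twohop -> 'Which character cannot drink a cappuccino?'
```

Under this reading, even the one-hop question shares no content words with the needle (the model must link "lactose intolerant" to "milk"), and the two-hop variant adds one further association (a cappuccino contains milk), which is what keeps lexical overlap between question and needle minimal.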
needlesets/needle_set_w_Distractor.json ADDED
@@ -0,0 +1,278 @@
1
+ [
2
+ {
3
+ "id": "0401",
4
+ "reasoning_type": "world_knowledge",
5
+ "system_prompt": "",
6
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
7
+ "needle": "Actually, {CHAR} lives next to {1}.",
8
+ "questions": {
9
+ "onehop": "Which character has been to {2}?",
10
+ "twohop": "Which character has been to {3}?"
11
+ },
12
+ "distractors": {
13
+ "onehop": "There was an article about {2} in the daily newspaper.",
14
+ "twohop": "There was an article about {3} in the daily newspaper."
15
+ },
16
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
17
+ "tests": {
18
+ "T17_C02": {
19
+ "input_args": ["the Kiasma museum", "Helsinki", "Uusimaa"]
20
+ },
21
+ "T15_C02": {
22
+ "input_args": ["the European Central Bank", "Frankfurt", "Germany"]
23
+ },
24
+ "T16_C02": {
25
+ "input_args": ["the Semper Opera House", "Dresden", "the state of Saxony"]
26
+ }
27
+ }
28
+ },
29
+ {
30
+ "id": "0401Inv",
31
+ "reasoning_type": "world_knowledge",
32
+ "system_prompt": "",
33
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
34
+ "needle": "{1} is next to where {CHAR} lives.",
35
+ "questions": {
36
+ "onehop": "Which character has been to {2}?",
37
+ "twohop": "Which character has been to {3}?"
38
+ },
39
+ "distractors": {
40
+ "onehop": "There was an article about {2} in the daily newspaper.",
41
+ "twohop": "There was an article about {3} in the daily newspaper."
42
+ },
43
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
44
+ "tests": {
45
+ "T17_C02": {
46
+ "input_args": ["The Kiasma museum", "Helsinki", "Uusimaa"]
47
+ },
48
+ "T15_C02": {
49
+ "input_args": ["The European Central Bank", "Frankfurt", "Germany"]
50
+ },
51
+ "T16_C02": {
52
+ "input_args": ["The Semper Opera House", "Dresden", "the state of Saxony"]
53
+ }
54
+ }
55
+ },
56
+ {
57
+ "id": "0402",
58
+ "reasoning_type": "commonsense_knowledge",
59
+ "system_prompt": "",
60
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
61
+ "needle": "A message came in from {CHAR} saying, \"I'm lactose intolerant,\" and nothing more.",
62
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
63
+ "questions": {
64
+ "onehop": "Which character cannot drink {1}?",
65
+ "twohop": "Which character cannot drink {2}?"
66
+ },
67
+ "distractors": {
68
+ "onehop": "There was a photo of {1} in the daily newspaper.",
69
+ "twohop": "There was a photo of {2} in the daily newspaper."
70
+ },
71
+ "tests": {
72
+ "T01_C02": {
73
+ "input_args": ["milk", "a cappuccino"]
74
+ },
75
+ "T04_C02": {
76
+ "input_args": ["milk", "a caffè mocha"]
77
+ }
78
+ }
79
+ },
80
+ {
81
+ "id": "0402Inv",
82
+ "reasoning_type": "commonsense_knowledge",
83
+ "system_prompt": "",
84
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
85
+ "needle": "A message came in saying, \"I'm lactose intolerant,\" from {CHAR}.",
86
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
87
+ "questions": {
88
+ "onehop": "Which character cannot drink {1}?",
89
+ "twohop": "Which character cannot drink {2}?"
90
+ },
91
+ "distractors": {
92
+ "onehop": "There was a photo of {1} in the daily newspaper.",
93
+ "twohop": "There was a photo of {2} in the daily newspaper."
94
+ },
95
+ "tests": {
96
+ "T01_C02": {
97
+ "input_args": ["milk", "a cappuccino"]
98
+ },
99
+ "T04_C02": {
100
+ "input_args": ["milk", "a caffè mocha"]
101
+ }
102
+ }
103
+ },
104
+ {
105
+ "id": "0405",
106
+ "reasoning_type": "commonsense_knowledge",
107
+ "system_prompt": "",
108
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
109
+ "needle": "Then {CHAR} mentioned that he has been vegan for years.",
110
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
111
+ "questions": {
112
+ "onehop": "Which character cannot eat {1}?",
113
+ "twohop": "Which character cannot eat {2}?"
114
+ },
115
+ "distractors": {
116
+ "onehop": "There was an article about {1} in the daily newspaper.",
117
+ "twohop": "There was an article about {2} in the daily newspaper."
118
+ },
119
+ "tests": {
120
+ "T01_C02": {
121
+ "input_args": ["fish-based meals", "Brandade"]
122
+ },
123
+ "T04_C02": {
124
+ "input_args": ["egg-based meals", "an omelette"]
125
+ }
126
+ }
127
+ },
128
+ {
129
+ "id": "0405Inv",
130
+ "reasoning_type": "commonsense_knowledge",
131
+ "system_prompt": "",
132
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
133
+ "needle": "There was a vegan guest, named {CHAR}.",
134
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
135
+ "questions": {
136
+ "onehop": "Which character cannot eat {1}?",
137
+ "twohop": "Which character cannot eat {2}?"
138
+ },
139
+ "distractors": {
140
+ "onehop": "There was an article about {1} in the daily newspaper.",
141
+ "twohop": "There was an article about {2} in the daily newspaper."
142
+ },
143
+ "tests": {
144
+ "T01_C02": {
145
+ "input_args": ["fish-based meals", "Brandade"]
146
+ },
147
+ "T04_C02": {
148
+ "input_args": ["egg-based meals", "an omelette"]
149
+ }
150
+ }
151
+ },
152
+ {
153
+ "id": "0408",
154
+ "reasoning_type": "world_knowledge",
155
+ "system_prompt": "",
156
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
157
+ "needle": "In 2013, after waiting in line for hours, {CHAR} finally saw the original {1} painting up close.",
158
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
159
+ "questions": {
160
+ "onehop": "Which character has been to {2}?",
161
+ "twohop": "Which character has been to {3}?",
162
+ "twohop2": "Which character has been to {4}?"
163
+ },
164
+ "distractors": {
165
+ "onehop": "There was an article about {2} in the daily newspaper.",
166
+ "twohop": "There was an article about {3} in the daily newspaper.",
167
+ "twohop2": "There was an article about {4} in the daily newspaper."
168
+ },
169
+ "tests": {
170
+ "T01_C02": {
171
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
172
+ },
173
+ "T04_C02": {
174
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
175
+ },
176
+ "T05_C02": {
177
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
178
+ }
179
+ }
180
+ },
181
+ {
182
+ "id": "0408Inv",
183
+ "reasoning_type": "world_knowledge",
184
+ "system_prompt": "",
185
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
186
+ "needle": "In 2013, the original {1} painting was seen up close by {CHAR}, finally, after waiting in line for hours.",
187
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
188
+ "questions": {
189
+ "onehop": "Which character has been to {2}?",
190
+ "twohop": "Which character has been to {3}?",
191
+ "twohop2": "Which character has been to {4}?"
192
+ },
193
+ "distractors": {
194
+ "onehop": "There was an article about {2} in the daily newspaper.",
195
+ "twohop": "There was an article about {3} in the daily newspaper.",
196
+ "twohop2": "There was an article about {4} in the daily newspaper."
197
+ },
198
+ "tests": {
199
+ "T01_C02": {
200
+ "input_args": ["'Girl with a Pearl Earring'", "the Mauritshuis", "The Hague", "the Netherlands"]
201
+ },
202
+ "T04_C02": {
203
+ "input_args": ["'Garden of Earthly Delights'", "the Museo del Prado", "Madrid", "Spain"]
204
+ },
205
+ "T05_C02": {
206
+ "input_args": ["'Impression, Sunrise'", "the Musée Marmottan Monet", "Paris", "France"]
207
+ }
208
+ }
209
+ },
210
+ {
211
+ "id": "0409Inv",
212
+ "reasoning_type": "world_knowledge",
213
+ "system_prompt": "",
214
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
215
+ "needle": "There was an engineer living in {1}, named {CHAR}.",
216
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
217
+ "questions": {
218
+ "onehop": "Which character has been to {2}?"
219
+ },
220
+ "distractors": {
221
+ "onehop": "There was an article about {2} in the daily newspaper."
222
+ },
223
+ "tests": {
224
+ "T09_C02": {
225
+ "input_args": ["Witbank", "South Africa"]
226
+ },
227
+ "T10_C02": {
228
+ "input_args": ["Calvinia", "South Africa"]
229
+ },
230
+ "T04_C02": {
231
+ "input_args": ["Firminy", "France"]
232
+ },
233
+ "T05_C02": {
234
+ "input_args": ["Vierzon", "France"]
235
+ },
236
+ "T07_C02": {
237
+ "input_args": ["Borujerd", "Iran"]
238
+ },
239
+ "T08_C02": {
240
+ "input_args": ["Lahijan", "Iran"]
241
+ }
242
+ }
243
+ },
244
+ {
245
+ "id": "0409",
246
+ "reasoning_type": "world_knowledge",
247
+ "system_prompt": "",
248
+ "task_template": "You will answer a question based on the following book snippet:\n\n{haystack}\n\nUse the information provided in the book snippet to answer the question. Your answer should be short and based on either explicitly stated facts or strong, logical inferences.\n\nQuestion: {question}\n\n Return only the final answer with no additional explanation or reasoning.",
249
+ "needle": "There was {CHAR} who was an engineer living in {1}.",
250
+ "character_set": ["Yuki","Stuart", "Katie","Veronica","Gary", "Megan","Calvin","Mandy","Diana","Caleb"],
251
+ "questions": {
252
+ "onehop": "Which character has been to {2}?"
253
+ },
254
+ "distractors": {
255
+ "onehop": "There was an article about {2} in the daily newspaper."
256
+ },
257
+ "tests": {
258
+ "T09_C02": {
259
+ "input_args": ["Witbank", "South Africa"]
260
+ },
261
+ "T10_C02": {
262
+ "input_args": ["Calvinia", "South Africa"]
263
+ },
264
+ "T04_C02": {
265
+ "input_args": ["Firminy", "France"]
266
+ },
267
+ "T05_C02": {
268
+ "input_args": ["Vierzon", "France"]
269
+ },
270
+ "T07_C02": {
271
+ "input_args": ["Borujerd", "Iran"]
272
+ },
273
+ "T08_C02": {
274
+ "input_args": ["Lahijan", "Iran"]
275
+ }
276
+ }
277
+ }
278
+ ]
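The `_w_Distractor` variant adds a `distractors` field per question. Its templates use the same numbered placeholders as the question but never `{CHAR}`, so a filled distractor shares surface wording with the question without pointing to any character. Below is a minimal continuation of the earlier sketch, again with hypothetical helper names; it only shows the filling step and does not specify how distractors are positioned in the haystack.

```python
# Minimal sketch (not the official evaluation code): filling a distractor template
# from needle_set_w_Distractor.json. The distractor reuses the question's numbered
# placeholder, so it echoes the question's wording without naming a character.
def fill_placeholders(template: str, character: str, input_args: list[str]) -> str:
    """Replace {CHAR} with a character name and {N} with the N-th input argument."""
    text = template.replace("{CHAR}", character)
    for i, arg in enumerate(input_args, start=1):
        text = text.replace("{" + str(i) + "}", arg)
    return text

# Entry "0401" (world_knowledge), copied from the distractor needle set above.
entry_0401 = {
    "needle": "Actually, {CHAR} lives next to {1}.",
    "questions": {"onehop": "Which character has been to {2}?"},
    "distractors": {"onehop": "There was an article about {2} in the daily newspaper."},
    "tests": {"T15_C02": {"input_args": ["the European Central Bank", "Frankfurt", "Germany"]}},
}

args = entry_0401["tests"]["T15_C02"]["input_args"]
needle = fill_placeholders(entry_0401["needle"], "Gary", args)
question = fill_placeholders(entry_0401["questions"]["onehop"], "Gary", args)
distractor = fill_placeholders(entry_0401["distractors"]["onehop"], "Gary", args)

# needle     -> 'Actually, Gary lives next to the European Central Bank.'
# question   -> 'Which character has been to Frankfurt?'
# distractor -> 'There was an article about Frankfurt in the daily newspaper.'
```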