| Input | Output | input_ids | attention_mask | labels |
| --- | --- | --- | --- | --- |
| string (lengths 251 to 41.6k) | string (lengths 137 to 9.7k) | sequence (lengths 157 to 2.05k) | sequence (lengths 157 to 2.05k) | sequence (lengths 157 to 2.05k) |
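This column layout looks like a standard causal-LM fine-tuning setup: the Input prompt and Output summary are stored as strings, and the sequence columns hold their tokenized form (in the example row below, `labels` repeats `input_ids` and `attention_mask` is all ones, and the 2.05k cap suggests truncation around 2048 tokens). The sketch below shows how such a row could be produced; the tokenizer checkpoint and the 2048-token cap are assumptions for illustration, not details stated on this page.

```python
# Hypothetical sketch of how a row with these five columns could be built.
# The tokenizer checkpoint and max_len are assumptions, not taken from this page.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed tokenizer

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    # Concatenate prompt and target, tokenize once, and copy the IDs into labels,
    # which is consistent with the example row below (the mask is all ones
    # because rows are stored unpadded).
    enc = tok(input_text + output_text, truncation=True, max_length=max_len)
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        "labels": list(enc["input_ids"]),
    }
```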
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in this paper the author investigates how to utilize largescale human video to train dexterous robot manipulation skills to leverage the information from the internet videos the author proposes a handful of techniques to preprocess the video data to extract the action information then the network is trained on the extracted hand data and deployed to the real robot with some human demonstration collected by teleoperation for finetuning experiments show that the proposed pipeline can solve multiple manipulation tasks strength the direction explored in this paper is important utilizing the internet video data for robot learning is well motivated especially considering the similarity between human and multifinger hands this direction looks very promising the authors perform experiments with multiple realworld tasks with pick and place pushing and rotating objects weakness although the objective of this paper is very impressive the experiments can not support the introduction and there are multiple overclaims section 4 is titled videodex learning dexterity from youtube however i can not find any evidence that the author utilizes youtube data for learning dexterous manipulation as mentioned in the section on retargeting wrist pose orb slam and the cameras acceleration data are used to compute the camera pose trajectory this information is not readily available in the youtube data the experiments and methods are misaligned with this claim in the introduction line 42 the author mentioned that our key insight is to combine these visual and action priors from passive data with the physical constraints of how robots should move in the world however the method does not consider the surroundings of the human hand and the detection results itself is not accurate how to incorporate physical information into the training data missing literature discussion on previous learning from video works 1 dexmv imitation learning for dexterous manipulation from human videos 2021 this paper focuses also on how to learn dexterous manipulation from human videos the reviewer understands that this literature paper uses simulated tasks while the authors focus on the real robot settings but it seems that similar pipelines are also used in this paper estimating the human hand retargeting and learning from retargeted hand pose 2 the surprising effectiveness of representation learning for visual imitation 2021 this paper also focuses on how to leverage the video data for better learning it also uses a gopro camera to collect a video of each trajectory which is the same as the ego4d dataset used in this paper it shows that by learning from this video data the final manipulation performance can be improved a lot these literature works use very similar methods to achieve robot learning the novelty claims of this paper can also be found in this literature missing details for retargeting wrist pose the detection module frankmocap is a 2d hand detector it is not clear how the author can get 3d keypoints from the hand model in the camera frame also this section is important in the whole technical approach it is better to provide visualization of the final retargeted robot a hand wrist pose and robot arm should also be visualized in figure 3 if they are used in the training if the wrist pose and arm joint pose is not used how to pretrain the action prior missing details about transforms in the equation it is not clear why the author uses t and m to denote pose simultaneously what are the differences if m is also a se3 transformation 
how to compute the position part of the mworldc1 besides the reviewer can not find any information about how the trobotworld is determined heuristically in both the main paper and supplementary docsepthe authors demonstrate a system in which they combine a few different components to get interesting supervisedlearned open loop behavior of real robot hands doing several different tasks in particular the most notable part of the approach is using videos of human hands as an action prior which informs their supervised mapping strengths good core idea the overall idea of using action priors from human videos via hand tracking to make robots work better is a good idea there are a lot of closely related works but i think they are well referenced in this paper good execution on several key parts the execution details of handling moving cameras with camera pose tracking together with perframe hand tracking seems to be well done i also like just using r3m features out of the box this is smart and interesting to see external validation results of real robots with hands doing a variety of things weaknesses there are various unscientific elements of this paper in its current form while the work is interesting i cant recommend a strong accept for a paper in this form hopefully the list below will help the authors improve both this work and their future work if the authors can address all of the following weaknesses in their rebuttal which i think is all doable and within scope to do in a rebuttal id be happy to move from weak accept to strong accept 1 it seems like the authors are not very upfront about the fact that this method does not produce closed loop policies only on the last page or two is it mentioned that the whole method is open loop this is fine to study the task of i inputting an image of a scene and ii outputting an open loop trajectory but it of course is very limiting the tasks are carefully chosen such that they dont require any closed loop feedback this aspect of their approach is not what most researchers in the field would expect so a common experience of a researcher would be to look over the first handful of pages of this paper and only at the last page or so realize that this is an open loop method please just make this clear up front 2 several false statements in the introduction to build such robotic agents that can operate anywhere we need access to a lot of successful robot interaction data in many environments not necessarily true this is a reasonable hypothesis but one that isnt tested in this paper and it cant be stated as a fact however deploying inexperienced real world robots to collect experience must require constant supervision which is in feasible also not necessarily true but also a very reasonable hypothesis just need to say may require instead most of the inefficiency in robot learning is due to the exponentially large action space an opinion and cant be stated as fact 3 ndps can produce safe and smooth trajectories yes but this is a meaningless statement they can also produce trajectories that are completely unsafe there is nothing about ndpsdmps that provides safety other than a bit of smoothness that may arguably help but there is nothing that helps here with the presence of obstacles in the environment or humans etc this statement probably only serves to confusemislead inexperienced readers please removefix 4 the paper mentions a physical prior as a key component but this is just that it uses dynamic movement primitives it seems im not sure this is the best 
way to communicate this line 191 also says physicallyaware ndps they dont know anything about contact physics maybe just say second order system or dynamical system or something maybe physicallyinspired but not physicallyaware and whenever it says for example line 269 baselines without a physical prior it should just be instead clear that this just means they dont use dmps 5 line 213 is videodex able to perform general purpose manipulation since the method is open loop the answer is no thats fine and the results are still impressive but should be clarified this is not something that needs to be empirically evaluated its just a result of the formulation 6 its very confusing that citation 44 is used open loop this isnt an intention of the method also is the rnn version closed loop over time its not clear and if its not im not sure how the rnn would be any different if its not used sequentially over time 7 please state exactly how many demonstrations were used for the different experiments 8 in the conclusion this is because training rl in the real world is difficult due to hardware limitations yes but this isnt reason to make the used behavior cloning method open loop instead of closed loop minor dont worry about these too much but i mention these as opportunities to improve the paper further ego4d is not cited on page 2 mentioned but not cited hr is not defined in an equation also i would recommend not using two letters for a math symbol it looks like a matrix h multiplied by a matrix r why use orbslam3 rather than colmap for the poses already running colmap for the calibration docsepvideodex pretrains a policy network with videos with gyroscope and accelerometer data of humans performing a task then finetunes with demonstrating trajectories collected by teleoperating the robot in order to train with the human data they use the approach from 49 for mapping human pose to robot pose and use orbslam355 to account for the camera motion they feed the image data labeled with the outputted pose into a resnet1815 backbone initialized with r3ms6 features and use a neural dynamic policy ndp 13 network to generate actions the paper demonstrates that using human data allows improved performance on 67 tasks pros the paper presents a theoretically simple method of learning from videos of humans the method is demonstrated on 7 different tasks outperforming the baselines without human data on 6 of them cons the writing of the paper is somewhat scattered the analysis of why the proposed approach using ndp rather than a mlp works better with human data could be stronger the paper needs to be much clearer that it relies on gyroscope and accelerometer data from the human videos which is a barrier to truly using internetscale data
### Summary: | this paper studies how to learn dexterous manipulation from human videos in the initial review the reviewer appreciated the direction and realworld experiment but also raised concerns about the need of special sensor for tracking during rebuttal the authors effectively addressed this concern by providing additional experiment results and reviewers were satisfied with the response ac would like to recommend acceptance for this paper | [
30003, 310, 1677, 2278, 273, ... (1,968 token IDs in total; the rest are elided here, as they appear to encode the concatenated Input and Output text shown above)
] | [
1, 1, 1, ... (1,968 ones; the attention_mask for this example is all ones)
] | [
30003, 310, 1677, 2278, 273, ... (1,968 token IDs matching the input_ids above; for this example, labels simply mirror input_ids)
] |
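In the row above, `labels` repeats `input_ids` and `attention_mask` is a run of ones, so a quick consistency check plus a decode is enough to confirm that the sequence columns are just the tokenized Input and Output strings. A minimal inspection sketch follows; the `row` dict and the `tok` tokenizer are carried over from the construction sketch near the top of this page and are assumptions, not artifacts of the dataset itself.

```python
# Hypothetical inspection of a single row like the example above.
# `row` is assumed to be a dict with the five columns shown; `tok` is the
# assumed tokenizer from the earlier sketch.
def inspect_row(row: dict, tok) -> None:
    ids = row["input_ids"]
    # The three sequence columns share one length per example (157 to ~2.05k here).
    assert len(ids) == len(row["attention_mask"]) == len(row["labels"])
    print("labels mirror input_ids:", row["labels"] == ids)
    print("attention_mask all ones:", all(m == 1 for m in row["attention_mask"]))
    # Decoding should reproduce the concatenated Input + Output strings.
    print(tok.decode(ids)[:300])
```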
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
summary of contributions this paper proposes a new framework to design new loss for gans the authors show that their framework is quite general and encompass a number of existing approaches eg the original gan formulation hinge loss etc they also propose a categorization in three different classes and derive new loss function they then compare experimentally the different existing loss and the new proposed loss that fall under their framework main comment the framework proposed in the paper is interesting since its quite general and the authors are able to derive a large number of existing as well as new loss from it however i think the framework has several limitations 1 the formulation is based on the likelihood ratio which is only defined if the support of g and f match this is known to not be the case in the context of gans 2 the benefit of the framework is not clear while it provides a way to derive new loss its not clear what are the advantages of the new loss theoretically the author argue that it is a hard question to answer and i agree the authors try to answer this question through experiments but i find the experiments not very convincing in particular the authors argue that subclass a objectives are more stable based on the celeba experiment however its not clear to me that the instability is due to a specific choice of objective function it might just be that the hyper parameters where slightly off for the other objectives i believe it would be interesting to understand better the results on celeba in particular maybe to show that some objectives are indeed more stable they can vary several hyperparameters and compare how often each objective is better than the other that would make the results and conclusion much more convincing minor comment the paper is overall clear but the clarity of some sections could be improved i think theorem 1 would be more clear if stated a bit differently simply saying that domegar maximize phidr psid and that r1 minimize phiomegarr psiomegar section 3 is a bit dense the subclasses also seem a bit arbitrary i believe section 5 could be improved by stating more clearly the different observations right now it looks more like a description of the figures than a clear statement of the question that the experiments try to answer and how they answer it docsepthis paper generalizes the minmax problem of gans to form a richer family of generative adversarial networks interestingly most of the wellknown variants of gans can be found in the spectrum of formulations covered by the family proposed in this work in terms of modeling it is evident that the family proposed in the paper is richer than that of fgan the family in this paper is shown to have a connection to wgan except that the lipschitz condition is omitted however under the light of existing works including fgan and other relevant works the obtained theoretical results are not surprising to me in addition apart from providing a richer family this work does not significantly influence the practical aspects of gans i have some following questions 1 if we solve the minmax problem in 2 subjected the fact that phi and psi satisfy eq 9 is it equivalent to minimizing any divergence between two distributions with pdfs f and g 2 dx is not a typical discriminator whose values between 01 providing the probability to distinguish true and fake data is not it d is more similar to a critique whose output values are realvalued is not itdocsepsummary in this paper the authors set out to find what scalar functions 
will make for a max part of the minmax gan objective they then find such a class of functions and show that only a ratio between two equal probabilities will be admitted as a solution pros the paper nicely introduces a different way of seeing gans not as a difference between the generated and real data but as a an integer of the ratio between generated and real distribution times the discriminator only if the ratio is 1 everywhere is the discriminator unable to maximize the max part of the gan objective further i liked the idea that the discriminator shouldnt just decide what class data belongs to but also estimate the probability ratio specifically in the formulation here the max part is maximized when dx omegarx so maximized iff omega1dx doesnt just classify but says the probability ratio between the two classes if this idea is expanded upon i think the authors could make a novel contribution cons unfortunately the authors have neglected to carefully explain how their contribution relates to previous work its telling that the paper cites only two papers from 2018 one from 2019 and none from 2020 all other citations are from previous years even though 20182020 has been a time of much gan research a key way in which the authors work hasnt been sufficiently compared to previous work is with their main claim we propose a simple methodology for constructing such minmax problems assuring at the same time consistency of the corresponding solution in liu they show a class of of functions where consistency is also guaranteed and the class shown by the authors here is a subset of the class in liu the details are at the bottom of my review further many of the techniques in this paper seem very similar to song where they also investigate the fgan divergence specifically the claims they make in theorem 1 seem very similar to prop 2 in song also the change of measure trick in the introduction can be found in song a detailed comparison of this work to that work would also be helpful since when reading this paper one simply doesnt know what is previous work which has already been done by others and what is the authors novel contribution once the authors address this and one is confident the contribution is indeed novel then the submission would be worth considering details of why this is a subset of whats already been shown in liu there they examine the difference between the target density d in this paper d is f but liu uses f for something else and the generated density g via supfinmathcal fmathbb exsim dysim gfxy so we find the function f in a class mathcal f which maximally separates the classes from d and g now this work proposes to do the same thing but with fxyphidx psidy where phiz intomega10z omega1tpt dt c1 and psizintomega10z pt dt c2 in liu they then split fxy up into two functions m and r such that fxymx y rxy where mxy has the form mxyvxvy this can be done in your case too resulting in here we drop the constants c1 and c2 for simplicity vx intomega10dx pt dt vy intomega10dy pt dt and rxy intomega10dx omega1t 1 ptdt since dx must be in mathcal jomega this integral has an infimum and theorem 4 from liu can be applied to achieve the same results as in this paper song song jiaming and stefano ermon bridging the gap between f gans and wasserstein gans arxiv preprint arxiv191009779 2019 liu liu shuang olivier bousquet and kamalika chaudhuri approximation and convergence properties of generative adversarial learning advances in neural information processing systems 2017 docsepoverall this paper 
provides impacts on understanding the core of generative models with adversarial optimization problems this paper shows the diverse possibilities of formulating the generative model optimization problems that the researchers can further investigate for better performances also this paper shows that generative models with unexplored losses achieve the best results in various datasets which demonstrates the possibilities of future improvements of generative models overall this paper is valuable to the machine learning community especially for generative models and adversarial training the below are some concerns for this paper but those concerns are not bigger than the advantages of this paper 1 quantitative experiments although the authors provided two tables table 2 and 3 there were not much analyses about the results i understand that it is not an easy problem to understand when should we use which function however it would be great if the authors can discover some trends in the results to demonstrate which type of functions work well with which type of datasets i think it would be great to use some synthetic data with known characteristics of distributions as the target distribution to analyze for understanding this point 2 other types of dataset generative models are widely utilized in computer vision however there are various other types of datasets that can get benefits of generative models such as tabular data and timeseries data it would be good if the authors can provide some simple experiments to demonstrate its generalizability 3 minor points it is not clear to transform between equation 3 and 4 i think this is a critical part in this paper thus it would be good to explain a little bit more for this part the authors explain the differences between fgan and this paper however it is not super clear to understand it would be good to clarify this point to highlight the novelty of this paper after reading other reviews are rebuttals after reading all the reviews from other reviewers and corresponding rebuttals i think this paper is a good paper and enough to be accepted in iclr 1 i think it has a clear difference from fgan it can provide a new loss function for the generative models which can further extend the success of generative models in the future 2 experiments are not super interesting but at least it has some intuitions corresponding to the authors claims 3 general theoretical results for the generative models such as when should we use which loss is a very difficult problem to solve maybe this paper can provide some intuitions for solving that large problem but it seems too much to ask this thing to the authors of this paper without that i think this paper is still worth to present to the iclr readers and participants therefore i am standing on my original score 7
### Summary: | this paper proposed a new family of losses for gans and showed that this family is quite general and encompasses a number of existing losses as well as some new loss functions the paper compared experimentally the existing losses and the new proposed losses but the benefit of this family is not clear theoretically and this work did not also provide the very helpful insights for the practical application of gans | [
2929, 432, 260, 2369, 1793, ... (token IDs for this example; the page cuts off partway through this array, so the remainder is not shown)
247,
1180,
273,
5368,
11655,
347,
973,
347,
690,
747,
2957,
3470,
253,
2929,
2429,
21657,
253,
5368,
11655,
285,
253,
747,
4081,
11655,
533,
253,
5649,
273,
436,
2021,
310,
417,
2590,
28055,
285,
436,
789,
858,
417,
671,
2085,
253,
1077,
9371,
16039,
323,
253,
8542,
2898,
273,
305,
507,
209
] | [
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1,
1
] | [
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
8774,
273,
9021,
436,
2929,
29328,
247,
747,
7792,
281,
2216,
747,
2957,
323,
305,
507,
253,
4477,
921,
326,
616,
7792,
310,
3240,
2087,
285,
18387,
247,
1180,
273,
5368,
7274,
24088,
253,
3236,
36827,
15895,
38864,
2957,
3966,
597,
671,
12661,
247,
13213,
1320,
275,
1264,
1027,
5971,
285,
15313,
747,
2957,
1159,
597,
840,
7277,
21657,
253,
1027,
5368,
2957,
285,
253,
747,
4081,
2957,
326,
2965,
762,
616,
7792,
50276,
7265,
4385,
253,
7792,
4081,
275,
253,
2929,
310,
4722,
1580,
697,
3240,
2087,
285,
253,
4477,
403,
2104,
281,
15313,
247,
1781,
1180,
273,
5368,
347,
973,
347,
747,
2957,
432,
352,
2299,
891,
1158,
253,
7792,
556,
2067,
7364,
337,
253,
15895,
310,
1754,
327,
253,
12177,
4313,
534,
310,
760,
2931,
604,
253,
1329,
273,
305,
285,
269,
3761,
436,
310,
1929,
281,
417,
320,
253,
1083,
275,
253,
3634,
273,
305,
507,
374,
253,
5649,
273,
253,
7792,
310,
417,
2590,
1223,
352,
3400,
247,
1039,
281,
15313,
747,
2957,
697,
417,
2590,
752,
403,
253,
11361,
273,
253,
747,
2957,
28055,
253,
2488,
9059,
326,
352,
310,
247,
1892,
1953,
281,
3662,
285,
891,
5194,
253,
4477,
1611,
281,
3662,
436,
1953,
949,
4679,
533,
891,
1089,
253,
4679,
417,
1077,
21414,
275,
1798,
253,
4477,
9059,
326,
35851,
247,
16566,
403,
625,
6474,
1754,
327,
253,
6076,
5830,
3368,
2299,
697,
417,
2590,
281,
479,
326,
253,
17620,
310,
1955,
281,
247,
2173,
4327,
273,
8103,
1159,
352,
1537,
816,
320,
326,
253,
4373,
3602,
835,
5777,
745,
323,
253,
643,
16566,
891,
2868,
352,
651,
320,
4722,
281,
2096,
1805,
253,
1543,
327,
6076,
5830,
275,
1798,
5046,
281,
921,
326,
690,
16566,
403,
6296,
625,
6474,
597,
476,
6889,
2067,
4373,
22041,
285,
7277,
849,
2223,
1016,
8103,
310,
1805,
685,
253,
643,
326,
651,
1056,
253,
1543,
285,
6452,
1199,
625,
21414,
50276,
37585,
4385,
253,
2929,
310,
4583,
2590,
533,
253,
19843,
273,
690,
7118,
812,
320,
5520,
891,
1158,
10012,
337,
651,
320,
625,
2590,
604,
4767,
247,
2372,
13359,
3365,
3981,
326,
34074,
5209,
22950,
815,
301,
83,
3714,
301,
285,
326,
391,
18,
15338,
815,
74,
485,
72,
3298,
3714,
74,
485,
5209,
2593,
495,
310,
247,
2372,
14086,
253,
749,
19770,
671,
1646,
247,
2372,
10341,
891,
2868,
2593,
608,
812,
320,
5520,
407,
14851,
625,
4518,
253,
1027,
7313,
987,
1024,
352,
4453,
625,
751,
247,
5740,
273,
253,
8442,
685,
247,
2590,
3908,
273,
253,
1953,
326,
253,
4679,
1611,
281,
3662,
285,
849,
597,
3662,
352,
50276,
7152,
33032,
2520,
2929,
2087,
4219,
253,
1054,
4090,
1895,
273,
305,
507,
281,
830,
247,
38539,
2021,
273,
1006,
800,
48960,
6928,
4722,
314,
954,
273,
253,
973,
4304,
11640,
273,
305,
507,
476,
320,
1119,
275,
253,
6637,
273,
26850,
6107,
407,
253,
2021,
4081,
275,
436,
789,
275,
2426,
273,
14053,
352,
310,
8943,
326,
253,
2021,
4081,
275,
253,
2929,
310,
38539,
685,
326,
273,
269,
1247,
253,
2021,
275,
436,
2929,
310,
2011,
281,
452,
247,
4602,
281,
259,
1247,
3707,
326,
253,
11233,
37913,
1617,
310,
11035,
50276,
35529,
762,
253,
1708,
273,
5368,
2987,
1690,
269,
1247,
285,
643,
4623,
2987,
253,
2797,
10527,
1543,
403,
417,
10084,
281,
479,
275,
1635,
7419,
432,
5277,
247,
38539,
2021,
436,
789,
1057,
417,
3012,
4833,
253,
8542,
7794,
273,
305,
507,
891,
452,
690,
1563,
3533,
337,
604,
359,
8415,
253,
1054,
4090,
1895,
275,
374,
12021,
253,
958,
326,
815,
74,
285,
3714,
74,
10517,
16186,
898,
310,
352,
6425,
281,
28699,
667,
23279,
875,
767,
10670,
342,
31385,
3671,
269,
285,
305,
374,
18747,
310,
417,
247,
6867,
7134,
12915,
3692,
2193,
875,
14805,
5277,
253,
5912,
281,
12129,
2032,
285,
15223,
941,
310,
417,
352,
277,
310,
625,
2074,
281,
247,
29254,
3692,
3453,
2193,
403,
1524,
24995,
310,
417,
352,
7152,
339,
793,
360,
3454,
50276,
249,
436,
2929,
253,
4477,
873,
562,
281,
1089,
752,
13434,
3470,
588,
1056,
323,
247,
2781,
629,
273,
253,
1054,
4090,
36827,
8103,
597,
840,
1089,
824,
247,
966,
273,
3470,
285,
921,
326,
760,
247,
4313,
875,
767,
4503,
20552,
588,
320,
8176,
347,
247,
2900,
50276,
856,
84,
50276,
783,
2929,
23395,
23970,
247,
1027,
1039,
273,
6523,
305,
507,
417,
347,
247,
3064,
875,
253,
4561,
285,
1524,
941,
533,
347,
247,
271,
7007,
273,
253,
4313,
875,
4561,
285,
1524,
3268,
2069,
253,
7134,
12915,
760,
604,
253,
4313,
310,
337,
11678,
310,
253,
7134,
12915,
7591,
281,
22950,
253,
2781,
629,
273,
253,
36827,
8103,
50276,
44295,
891,
10490,
253,
2934,
326,
253,
7134,
12915,
943,
2649,
816,
7617,
752,
966,
941,
14125,
281,
533,
671,
6642,
253,
5912,
4313,
5742,
275,
253,
15895,
1060,
253,
2781,
629,
310,
11903,
1025,
672,
18747,
7005,
909,
274,
89,
594,
11903,
1025,
36714,
40639,
18,
9665,
36908,
816,
30215,
533,
2296,
253,
5912,
4313,
875,
253,
767,
5971,
604,
436,
2934,
310,
11848,
2220,
891,
1158,
253,
4477,
812,
1056,
247,
4460,
7680,
50276,
5040,
50275,
328,
9520,
253,
4477,
452,
22459,
281,
9257,
5513,
849,
616,
7680,
7033,
281,
2045,
789,
697,
7746,
326,
253,
2929,
28070,
760,
767,
9380,
432,
4765,
581,
432,
6247,
285,
5293,
432,
9169,
512,
643,
30404,
403,
432,
2045,
1107,
1014,
2167,
4765,
14952,
556,
644,
247,
673,
273,
1199,
36827,
2561,
50276,
66,
2234,
1039,
275,
534,
253,
4477,
789,
556,
2649,
644,
10481,
2429,
281,
2045,
789,
310,
342,
616,
2022,
1750,
359,
12661,
247,
2969,
16182,
323,
26736,
824,
1054,
4090,
3237,
718,
981,
387,
253,
1072,
673,
15274,
273,
253,
3969,
2900,
275,
632,
86,
597,
921,
247,
966,
273,
273,
3470,
835,
15274,
310,
671,
16293,
285,
253,
966,
2011,
407,
253,
4477,
1060,
310,
247,
8578,
273,
253,
966,
275,
632,
86,
253,
4278,
403,
387,
253,
5004,
273,
619,
2278,
50276,
44295,
1142,
273,
253,
5609,
275,
436,
2929,
1646,
1077,
2074,
281,
4498,
835,
597,
671,
7409,
253,
269,
1247,
23279,
5742,
253,
3916,
597,
1056,
275,
10012,
337,
1646,
1077,
2074,
281,
4198,
374,
275,
4498,
671,
253,
1818,
273,
2557,
10480,
275,
253,
10199,
476,
320,
1119,
275,
4498,
247,
7000,
5301,
273,
436,
789,
281,
326,
789,
651,
671,
320,
9371,
50276,
17480,
672,
4361,
436,
2929,
581,
3365,
36908,
871,
752,
310,
2045,
789,
534,
556,
2168,
644,
2218,
407,
2571,
285,
752,
310,
253,
4477,
4460,
7680,
2378,
253,
4477,
2953,
436,
285,
581,
310,
13224,
253,
7680,
310,
6296,
4460,
840,
253,
19529,
651,
320,
4409,
7296,
50276,
23454,
273,
2139,
436,
310,
247,
8578,
273,
47515,
2168,
644,
2011,
275,
632,
86,
50276,
9088,
597,
9186,
253,
3064,
875,
253,
2303,
4038,
277,
275,
436,
2929,
277,
310,
269,
533,
632,
86,
4648,
269,
323,
1633,
2010,
285,
253,
4561,
4038,
305,
3066,
7018,
9750,
1588,
269,
1991,
385,
3549,
7915,
303,
305,
71,
5246,
594,
359,
1089,
253,
1159,
269,
275,
247,
966,
14168,
1179,
269,
534,
11903,
595,
36158,
253,
5971,
432,
277,
285,
305,
50276,
2666,
436,
789,
29328,
281,
513,
253,
1072,
2181,
533,
342,
269,
5246,
545,
16159,
50276,
793,
36706,
835,
815,
478,
50276,
565,
3151,
740,
91,
40639,
18,
85,
431,
19641,
50276,
68,
18,
50276,
395,
3714,
478,
565,
3151,
740,
91,
31048,
19641,
50276,
68,
19,
50276,
249,
632,
86,
597,
840,
8085,
269,
5246,
598,
715,
767,
3470,
278,
285,
391,
824,
326,
269,
37543,
89,
340,
50276,
83,
5246,
835,
278,
5246,
556,
253,
830,
278,
5246,
43563,
11170,
436,
476,
320,
2218,
275,
634,
1083,
1512,
4795,
275,
1060,
359,
5926,
253,
14637,
260,
18,
285,
260,
19,
323,
17647,
50276,
43563,
50276,
565,
3151,
740,
9665,
31048,
19641,
46643,
50276,
565,
3151,
740,
6421,
31048,
19641,
50276,
395,
50275,
83,
5246,
50276,
565,
3151,
740,
9665,
40639,
18,
85,
50276,
18,
268,
2851,
85,
50276,
17480,
18747,
1364,
320,
275,
14168,
1179,
480,
3151,
436,
9909,
556,
271,
2192,
2751,
285,
10012,
577,
432,
632,
86,
476,
320,
3732,
281,
5115,
253,
1072,
1543,
347,
275,
436,
2929,
50276,
32429,
4498,
480,
74,
6472,
285,
331,
832,
4692,
209,
693,
251,
49519,
253,
8037,
875,
50276,
71,
305,
507,
285,
369,
2152,
6339,
305,
507,
549,
32693,
638,
3845,
549,
32693,
746,
2313,
26,
33727,
6247,
50276,
965,
86,
632,
86,
439,
86,
606,
8919,
400,
1321,
270,
528,
21118,
285,
465,
312,
267,
11825,
448,
5353,
73,
11317,
11193,
285,
14940,
3607,
273,
1006,
800,
48960,
4715,
16424,
275,
11454,
1491,
5162,
2718,
4240,
50276,
7152,
33032,
1189,
455,
436,
2929,
3400,
16274,
327,
4685,
253,
5161,
273,
1006,
800,
3210,
342,
48960,
13757,
3237,
436,
2929,
2722,
253,
11117,
15018,
273,
830,
8287,
253,
1006,
800,
1566,
13757,
3237,
326,
253,
8607,
476,
2007,
7409,
323,
1805,
16226,
575,
671,
436,
2929,
2722,
326,
1006,
800,
3210,
342,
35021,
2149,
11655,
5115,
253,
1682,
1543,
275,
2710,
15302,
575,
4609,
14371,
253,
15018,
273,
2852,
11701,
273,
1006,
800,
3210,
4583,
436,
2929,
310,
9865,
281,
253,
5145,
4715,
3114,
3340,
323,
1006,
800,
3210,
285,
48960,
3733,
50276,
783,
2708,
403,
690,
7350,
323,
436,
2929,
533,
1110,
7350,
403,
417,
8750,
685,
253,
11361,
273,
436,
2929,
50276,
18,
11745,
4679,
50276,
20261,
253,
4477,
2530,
767,
7180,
2829,
374,
285,
495,
627,
497,
417,
1199,
6260,
670,
253,
1543,
50276,
74,
2096,
326,
352,
310,
417,
271,
3477,
1895,
281,
2096,
672,
943,
359,
897,
534,
1159,
2299,
352,
651,
320,
1270,
604,
253,
4477,
476,
9413,
690,
13554,
275,
253,
1543,
281,
7568,
534,
1511,
273,
3470,
789,
973,
342,
534,
1511,
273,
15302,
50276,
74,
1158,
352,
651,
320,
1270,
281,
897,
690,
13506,
941,
342,
1929,
5319,
273,
10670,
347,
253,
2303,
3268,
281,
12106,
323,
4685,
436,
1127,
50276,
19,
643,
3510,
273,
10895,
50276,
36749,
3210,
403,
7561,
12845,
275,
4382,
8113,
575,
50276,
35529,
627,
403,
2710,
643,
3510,
273,
15302,
326,
476,
755,
5373,
273,
1006,
800,
3210,
824,
347,
10334,
792,
941,
285,
2069,
12395,
941,
50276,
262,
651,
320,
1175,
604,
253,
4477,
476,
2085,
690,
2969,
4679,
281,
7568,
697,
2087,
50228,
50276,
20,
5884,
2792,
50276,
262,
310,
417,
2590,
281,
4979,
875,
5150,
495,
285,
577,
891,
1158,
436,
310,
247,
4619,
629,
275,
436,
2929,
3021,
352,
651,
320,
1175,
281,
5513,
247,
1652,
2372,
625,
323,
436,
629,
50276,
783,
4477,
5513,
253,
3910,
875,
269,
1247,
285,
436,
2929,
2299,
352,
310,
417,
2221,
2590,
281,
2096,
352,
651,
320,
1175,
281,
19148,
436,
1127,
281,
6780,
253,
38135,
273,
436,
2929,
50276,
6438,
4361,
643,
10123,
403,
30080,
85,
932,
50276,
6438,
4361,
512,
253,
10123,
432,
643,
30628,
285,
3969,
30080,
85,
932,
891,
1158,
436,
2929,
310,
247,
1175,
2929,
285,
2217,
281,
320,
7607,
275,
17857,
32888,
50276,
18,
891,
1158,
352,
556,
247,
2590,
3064,
432,
269,
1247,
352,
476,
2085,
247,
747,
2957,
1159,
323,
253,
1006,
800,
3210,
534,
476,
2007,
9017,
253,
2323,
273,
1006,
800,
3210,
275,
253,
2852,
374,
4679,
403,
417,
2221,
4722,
533,
387,
1878,
352,
556,
690,
16875,
4431,
3969,
281,
253,
4477,
3916,
495,
2087,
10527,
1543,
323,
253,
1006,
800,
3210,
824,
347,
672,
943,
359,
897,
534,
2957,
310,
247,
1077,
2834,
1895,
281,
8415,
5046,
436,
2929,
476,
2085,
690,
16875,
4431,
323,
16161,
326,
1781,
1895,
533,
352,
3133,
1512,
1199,
281,
1642,
436,
2181,
281,
253,
4477,
273,
436,
2929,
1293,
326,
891,
1158,
436,
2929,
310,
1335,
4409,
281,
1246,
281,
253,
17857,
32888,
10668,
285,
5014,
50276,
45230,
891,
717,
6306,
327,
619,
3236,
4868,
818,
187,
187,
4118,
18435,
27,
2520,
2929,
4081,
247,
747,
2021,
273,
11655,
323,
305,
507,
285,
2692,
326,
436,
2021,
310,
3240,
2087,
285,
37035,
247,
1180,
273,
5368,
11655,
347,
973,
347,
690,
747,
2957,
3470,
253,
2929,
2429,
21657,
253,
5368,
11655,
285,
253,
747,
4081,
11655,
533,
253,
5649,
273,
436,
2021,
310,
417,
2590,
28055,
285,
436,
789,
858,
417,
671,
2085,
253,
1077,
9371,
16039,
323,
253,
8542,
2898,
273,
305,
507,
209
] |
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the problem of moe routing under different network topologies by allocating another abstraction layer for the topology and designing an auxiliary objective to optimize. experiments show very good improvement in terms of speed compared to strong baselines.

strengths:
1. the paper offers an important contribution to the ai community at the system level, which is probably not difficult to approach for many people working in this field; in fact, in my humble opinion, not so many ai people have the opportunity to access detailed hardware information as cloud users, such as with azure or aws.
2. the experiments show very good improvement over strong baselines, and the system analysis is clearly presented.

weaknesses:
1. the paper addresses the system level; however, since it claims a significant boost of speed without sacrificing model accuracy, it needs to show the accuracy, e.g. at least the lm-related one with nlp-related metrics.
2. line 240, which claims "without loss of generality", is probably too strong. my suggestion: if the solution is good with the current hardware settings, the authors can run the current code on many other applications whose code is available, to further solidify their claims.
3. likewise, why not show the data dispatch distribution of the other ranks, rather than only rank 0? if space is limited, appendix space is always there.
4. in the era of gpus and large data, motivating the work by demonstrating on only 128mb of data is probably insufficient; at least some gbs, or even more, in combination with different types of data, would make a stronger motivation.
5. no code is provided (maybe not very relevant, since the paper addresses the system-related level and it is thus hard to judge those impacts).

docsep
the paper proposes a new algorithm to improve the training efficiency of mixture-of-experts models in a distributed training setting by exploiting network topology information. to achieve this, the authors propose a new auxiliary loss term incorporating communication bandwidth, to encourage tokens to be routed to closer nodes rather than further nodes. by applying this new algorithm, the authors claim that they could achieve faster throughput (1.01x to 4.77x) without losing accuracy on their several different clusters; as a result, they show faster wall-clock time to convergence. the communication overhead is one of the major issues for moe model training, and this paper proposes a new method to deal with this problem naturally. given the increased usage of moe model technology, this is a timely work. having a soft guidance seems like a good idea, since it does not hurt the original training dynamics while encouraging locality of token routing, and, as the authors mentioned, there has not been this kind of topology-aware loss term before, as far as i know. however, there are a few missing details about model configurations and algorithms (asked in the question section), and the overall speed gain is minor. this paper focuses on the computation algorithm itself, so it might not have direct societal impact.

docsep
sparsely gated mixture-of-experts (moe) plays a vital role in large-scale model training, but suffers from both load imbalance and global communication; in addition, the existing even-dispatch approach may cause network contention and worsen the previous challenges. this work proposed a topology-aware large-scale moe training method, called ta-moe, that can adapt the communication volume to fit the underlying network topology without interfering with model convergence. the key ideas are abstracting the dispatch problem as a communication cost optimization problem and then adding an auxiliary loss with pattern-related coefficients. experiments show that ta-moe provides up to 1.61x speedup over deepspeed-moe and 4.77x speedup over fastmoe without accuracy loss.

strengths: this work tried to tackle a very significant and interesting challenge in moe systems (network topology may worsen the communication and load-balance problems during dispatch in moe); the paper is well organized and easy to follow; the proposed ta-moe method is simple and effective; extensive experiments show that ta-moe is able to offer a noticeable speedup over the state of the art under different hardware and model configurations.

weaknesses: the experiments are mostly done with gpt models; it would be better to have models with different neural architectures in the evaluation benchmark, since it is unclear how ta-moe works on moe models other than gpts.

the authors have adequately addressed the limitations and potential negative societal impact of their work.
### Summary: | mixtureofexpert moe models have demonstrated a lot of success recently to further improve upon the existing literature this paper studies moe routing for different network topologies this is essentially to deal with the communication overhead of moe training the strategy is to add another layer on top for the topology along with a corresponding objective to optimize the authors also provide experiments demonstrating improved speed of convergence the reviewers were in general positive and liked the idea of the paper the reviewers did however raise issues about lack of clear demonstration that accuracy is not compromised lack of large data and a few other more technical concerns the reviewers concerns seem to be more or less addressed by the authors my overall assessment of the paper is positive i think the general premise of the paper is interesting and the paper has interesting ideas i do agree however that the experiments need to be more thorough i am recommending acceptance but request that the authors follow the reviewers comments to improve their experimental results | [
… (input_ids: numeric token-ID sequence not reproduced here) ] |
[ 1, 1, 1, … (attention_mask: all ones, same length as input_ids) …, 1 ] |
[ … (labels: numeric token-ID sequence not reproduced here) … ] |
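the ta-moe record above describes its core mechanism only in words: an auxiliary loss whose pattern-related coefficients reflect communication cost, nudging the gate to keep token dispatch close to the sending node. as a rough illustration of that general idea only (not the paper's actual implementation; the function name, the cost matrix, the weighting, and the tensor shapes below are all assumptions), a minimal sketch in pytorch could look like this:

```python
import torch

def topology_aware_aux_loss(router_probs, expert_to_node, comm_cost, my_node):
    """Hypothetical sketch of a topology-aware dispatch penalty for moe routing.

    router_probs   : (tokens, num_experts) softmax output of the moe gate
    expert_to_node : (num_experts,) long tensor, node id hosting each expert
    comm_cost      : (num_nodes, num_nodes) relative communication cost, e.g.
                     ~0 intra-node, small intra-rack, large across racks
    my_node        : node id of the worker that produced these tokens
    """
    # average routing probability mass sent to each expert
    load_per_expert = router_probs.mean(dim=0)             # (num_experts,)
    # how expensive each expert is to reach from this worker's node
    cost_per_expert = comm_cost[my_node, expert_to_node]   # (num_experts,)
    # expected dispatch cost; large when mass goes to far-away experts
    return (load_per_expert * cost_per_expert).sum()

# usage sketch (the weight and variable names are made up):
# aux = topology_aware_aux_loss(gate_probs, expert_to_node, comm_cost, rank_node)
# loss = task_loss + balance_loss + 0.01 * aux
```

the design point the reviewers highlight is that this is only a soft penalty, so the router can still send tokens to remote experts whenever the task loss favors it.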
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper discusses applications of variants of rnns and gated cnns to acoustic modeling in embedded speech recognition systems, and the main focus of the paper is computational and memory efficiency when we deploy the system. the paper describes well the problem of the current lstm, especially focusing on the recurrent connection matrix operations, which are a bottleneck in this scenario, and introduces variants of rnns (e.g. qrnn). also, these variants may not yield enough performance compared with lstm, but 1d convolution and/or a deep structure helps to avoid the degradation.

one of the biggest issues of this paper is that they use ctc as an acoustic model, while many real speech recognition applications and the major open-source toolkit kaldi still use hybrid hmm-dnn (tdnn, lstm, cnn, etc.) systems; therefore the paper's claim on ctc is not in line with current application trends. it may change in the near future, but hybrid systems are still dominant. for example, the wsj wer performance listed in table 3 is easily obtained by a simple feedforward dnn in the hybrid system; the latest lattice-free mmi with tdnn can achieve better performance (2x wer) and its decoding is quite fast compared with lstm. the authors should consider this current situation of state-of-the-art speech recognition. also, the techniques described in the paper are all based on existing techniques, and the paper lacks technical novelty.

other comments:
- in the abstract and the first part of the introduction, as i mentioned above, ctc-based character-prediction modeling is not a major acoustic model.
- the paper needs some discussion about tdnn, which is a major acoustic model (fast and accurate) in kaldi.
- p4, first line ("... represents elementwise multiplication"): the elementwise multiplication operation first appeared in eq 1, and it should be explained there.
- section 3.2: i actually dont fully understand the claims of this experiment based on timit, as it is phoneme recognition and not directly related to the real application, which is the main target of this paper, i think. my suggestion is to place these timit-based experiments as a preliminary experiment to investigate the variants of rnn or gated cnn, before the wsj experiments. i did not say that section 3.2 is useless; this analysis is actually valuable, and this suggested change of the position of the timit experiment can avoid some confusion about the main target of this paper.

docsep
this paper presents a study on efficient acoustic modeling using neural-network-based models. four approaches are presented and evaluated: diag lstm, qrnn, gated convnet, and adding a 1d convolution layer. the evaluation is done on an asr task using wsj and on a phoneme classification task using the timit corpus. the study shows that the inference speed is improved, with comparable or better performance than the standard lstm model.

the findings presented in this paper are interesting and quite useful when one wants to implement an lstm-based acoustic model on mobile devices. the paper is well written and easy to read. the main issue of this paper is the lack of novelty: the three evaluated approaches (diag lstm, qrnn and gated convnet) are not novel, and the only novelty is the addition of a 1d convolution, which is not enough for a conference like iclr.

minor comments on the experiments:
- the network quantization approach has been shown to lead to efficient neural networks; could the authors provide a comparison between their approach and the quantization approach?
- on the timit experiment, the authors could add a decoder and use the per metric instead of the frame accuracy, so they could provide a comparison with the literature.
- wsj and timit are quite small corpora compared to the available corpora; maybe the authors should consider using large corpora like librispeech. it could be interesting to see the performance of the presented approaches there.

overall, this paper feels more like a technical report. the findings could be useful, but its novelty is too limited for iclr; hence i argue for rejection and suggest that the authors consider submitting the paper to a speech conference like icassp.

docsep
this paper investigates a number of techniques and neural network architectures for embedded acoustic modeling. the goal is to reduce the memory access and make the computation efficient, and in the meantime to sustain good asr performance. overall, the paper is well motivated and well written; however, i have the following concerns:
1. it is not clear from the paper whether both the training and the inference are conducted on embedded devices, or only the inference. i assume it is the latter, but cant find it explicitly mentioned in the paper.
2. the exploration carried out in the paper is more on the system level, and the novelty is not overwhelmingly significant.
3. my major concern is that the reported wers on wsj and the phoneme classification accuracy are quite off. 20-30% wers for wsj do not seem to be usable in real applications; honestly, i dont even think this performance is better than well-trained gmm-hmm acoustic models using a viterbi decoder. furthermore, there is no clear winner across the investigated architectures in terms of performance; one question is, if one wants to deploy such an on-device system, which architecture shall be chosen?
4. a more general comment on the work explored in the paper: first of all, the on-device memory issue puts a heavy constraint on the capacity of acoustic models, which will significantly hurt the modeling capability of the dnn-based acoustic models; deep learning acoustic models can outperform gmm-hmm because they can use large model capacity with very deep and complex architectures when a large amount of training data is available. second, for ctc, when the training data is limited its performance is far worse than the hybrid dnn-hmm model, let alone a pure end-to-end fashion without using an external lm and dictionary; and if wfst-based decoders (composition of wfsts of the lm, dictionary, and deblank/repetition) are used, then the memory issue will surface again.
### Summary: | in this work the authors conduct experiments using variants of rnns and gated cnns on a speech recognition task motivated by the goal of reducing the computational requirements when deploying these models on mobile devices while this is an important concern for practical deployment of asr systems the main concerns expressed by the reviewers is that the work lacks novelty further the authors choice to investigate ctc based systems which predict characters these models are not stateoftheart for asr and as such it is hard to judge the impact of this work on a stateoftheart embedded asr system finally it would be beneficial to replicate results on a much larger corpus such as librispeech or switchboard based on the unanimous decision from the reviewers the ac agrees that the work in the present form should be rejected | [
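the reviews in this record keep contrasting the lstm's expensive recurrent matrix multiplications with qrnn-style layers, where the heavy work is a 1d convolution over time and the recurrence is only element-wise, but the record never shows what such a layer computes. the following is a minimal, generic sketch of a qrnn-like layer (f-pooling variant) written from that general description, not from the reviewed paper; the class name, layer sizes, and the use of pytorch are assumptions:

```python
import torch
import torch.nn as nn

class TinyQRNNLayer(nn.Module):
    """Illustrative QRNN-style layer: the heavy matrix work is a single 1d
    convolution over time; the only recurrence is element-wise gated pooling."""

    def __init__(self, input_size, hidden_size, kernel_size=2):
        super().__init__()
        # one causal convolution produces the candidate z and forget gate f
        # for every time step in parallel
        self.conv = nn.Conv1d(input_size, 2 * hidden_size, kernel_size,
                              padding=kernel_size - 1)
        self.hidden_size = hidden_size

    def forward(self, x):                             # x: (batch, time, input_size)
        t = x.size(1)
        y = self.conv(x.transpose(1, 2))[:, :, :t]    # trim overhang to stay causal
        z, f = y.split(self.hidden_size, dim=1)
        z, f = torch.tanh(z), torch.sigmoid(f)
        # element-wise recurrence (f-pooling): h_t = f_t * h_{t-1} + (1 - f_t) * z_t
        h, outputs = torch.zeros_like(z[:, :, 0]), []
        for step in range(t):
            h = f[:, :, step] * h + (1 - f[:, :, step]) * z[:, :, step]
            outputs.append(h)
        return torch.stack(outputs, dim=1)            # (batch, time, hidden_size)
```

the per-time-step work inside the loop is element-wise only, so the large matrix products can be computed for all time steps at once; that is the property the reviews describe as attractive for embedded inference, although the reviewed paper's actual variants (diag lstm, gated convnet, added 1d convolution) differ in detail.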
… (input_ids: numeric token-ID sequence not reproduced here) ] |
[ 1, 1, 1, … (attention_mask: all ones, same length as input_ids) …, 1 ] |
[
30003,
310,
1677,
2278,
273,
247,
2561,
2929,
432,
260,
2369,
1793,
6698,
15,
7764,
3630,
247,
6010,
253,
2278,
15,
187,
4118,
8439,
27,
187,
2520,
2929,
25339,
4893,
273,
11640,
273,
391,
79,
2224,
285,
305,
456,
260,
9866,
281,
19463,
14053,
275,
12691,
6519,
8981,
2718,
285,
253,
2022,
2770,
273,
253,
2929,
310,
15180,
3541,
6733,
672,
359,
8745,
253,
985,
253,
2929,
973,
8631,
253,
1895,
273,
253,
1655,
298,
296,
78,
3340,
13654,
327,
253,
18902,
4602,
4315,
5871,
534,
310,
247,
11996,
7623,
275,
436,
10076,
285,
23970,
11640,
273,
391,
79,
2224,
24088,
2805,
83,
9866,
671,
841,
11640,
778,
417,
4917,
2217,
3045,
2429,
342,
298,
296,
78,
533,
337,
69,
27311,
285,
263,
3676,
2605,
7729,
281,
3693,
253,
11961,
581,
273,
253,
5962,
3374,
273,
436,
2929,
310,
326,
597,
897,
260,
18038,
347,
271,
19463,
1566,
1223,
1335,
1142,
1524,
6519,
8981,
4893,
285,
2201,
1527,
2603,
465,
267,
5168,
897,
9769,
34746,
17915,
2649,
69,
9866,
298,
296,
78,
260,
9866,
3966,
2718,
3103,
253,
9380,
1750,
327,
260,
18038,
310,
417,
2112,
342,
253,
1655,
2898,
13554,
352,
778,
320,
4391,
2822,
2852,
533,
1335,
9769,
2718,
403,
11360,
323,
1650,
253,
37280,
75,
16640,
3045,
7117,
275,
2829,
495,
310,
4354,
2797,
407,
247,
2969,
3997,
10495,
277,
9866,
275,
253,
9769,
985,
253,
6323,
10979,
1959,
5823,
74,
342,
32989,
9866,
476,
5115,
1805,
3045,
374,
89,
16640,
285,
436,
28490,
310,
3240,
3809,
2429,
342,
298,
296,
78,
253,
4477,
943,
1908,
436,
1655,
4112,
273,
1375,
23037,
14387,
6519,
8981,
671,
253,
5609,
2529,
275,
253,
2929,
403,
512,
1754,
327,
5368,
5609,
285,
253,
2929,
19756,
253,
7681,
38135,
50276,
977,
5701,
50276,
249,
12002,
285,
253,
806,
629,
273,
10199,
347,
891,
5393,
1840,
260,
18038,
1754,
1894,
12787,
2474,
14053,
310,
417,
247,
2201,
19463,
1566,
50276,
783,
2929,
3198,
690,
11985,
670,
32989,
9866,
534,
310,
247,
2201,
19463,
14053,
3809,
285,
7899,
275,
465,
267,
5168,
50276,
81,
21,
806,
1386,
285,
209,
189,
6125,
3284,
3020,
25219,
253,
3284,
3020,
25219,
4254,
369,
806,
5420,
275,
16186,
337,
285,
352,
943,
320,
5544,
627,
50276,
4674,
4567,
891,
2686,
13414,
4751,
2096,
253,
3916,
273,
436,
3368,
1754,
327,
4522,
262,
347,
352,
310,
20445,
20867,
8981,
285,
417,
3587,
2905,
281,
253,
1524,
2898,
534,
310,
253,
2022,
2303,
273,
436,
2929,
891,
1158,
619,
14876,
310,
281,
1659,
841,
4522,
262,
1754,
4679,
347,
247,
12611,
3368,
281,
7409,
253,
11640,
273,
391,
9866,
390,
305,
456,
260,
9866,
1078,
253,
37280,
75,
4679,
891,
858,
417,
1333,
326,
2593,
4567,
310,
19437,
436,
1783,
310,
2686,
9865,
285,
436,
5125,
1818,
670,
253,
1899,
273,
436,
4522,
262,
3368,
476,
3693,
690,
13775,
273,
253,
2022,
2303,
273,
436,
2929,
50275,
7152,
33032,
2520,
2929,
1246,
247,
1263,
327,
5919,
19463,
14053,
970,
11454,
6928,
3169,
1566,
1740,
7274,
403,
3559,
285,
6760,
6687,
298,
296,
78,
2805,
83,
9866,
305,
456,
2410,
3024,
285,
6240,
247,
337,
69,
27311,
3828,
253,
7103,
310,
2218,
327,
347,
83,
4836,
970,
37280,
75,
285,
275,
20445,
20867,
9162,
4836,
970,
253,
4522,
262,
20689,
253,
1263,
921,
326,
253,
17032,
3885,
310,
5520,
342,
10870,
273,
1805,
3045,
685,
253,
2629,
298,
296,
78,
1566,
50276,
783,
4342,
3559,
275,
436,
2929,
403,
4722,
285,
3240,
4217,
672,
581,
5605,
281,
3359,
247,
298,
296,
1814,
833,
19463,
1566,
327,
6109,
4095,
253,
2929,
310,
973,
3542,
285,
3477,
281,
4704,
50275,
783,
2022,
2523,
273,
436,
2929,
310,
253,
3480,
273,
38135,
253,
1264,
6760,
7274,
6687,
298,
296,
78,
2805,
83,
9866,
285,
305,
456,
2410,
3024,
403,
417,
4460,
253,
760,
38135,
310,
253,
1635,
273,
247,
337,
69,
27311,
534,
310,
417,
2217,
323,
247,
8059,
751,
17857,
32888,
50275,
37585,
5701,
327,
253,
4679,
50276,
783,
2990,
36643,
2746,
556,
644,
2011,
281,
1421,
281,
5919,
11454,
6928,
812,
253,
4477,
2085,
247,
5301,
875,
616,
2746,
285,
253,
36643,
2746,
50275,
251,
253,
4522,
262,
3368,
253,
4477,
812,
823,
247,
29810,
285,
897,
253,
591,
7982,
3185,
273,
253,
3665,
7200,
594,
597,
812,
2085,
5301,
342,
253,
6239,
50275,
8819,
75,
285,
4522,
262,
403,
3240,
1355,
5944,
66,
2429,
281,
253,
2130,
5944,
66,
5046,
253,
4477,
943,
1908,
970,
1781,
5944,
66,
751,
40211,
261,
365,
5036,
352,
812,
320,
4722,
281,
923,
253,
3045,
273,
253,
3559,
7274,
50276,
1189,
455,
436,
2929,
9193,
625,
751,
247,
7681,
1304,
253,
4342,
812,
320,
4217,
533,
697,
38135,
310,
1512,
3710,
323,
17857,
32888,
7613,
891,
9059,
323,
18235,
285,
1804,
326,
253,
4477,
1908,
29315,
253,
2929,
281,
247,
6519,
8059,
751,
17857,
515,
81,
7152,
33032,
2520,
2929,
2340,
684,
247,
1180,
273,
5609,
285,
11454,
2990,
35615,
323,
12691,
19463,
14053,
50276,
783,
4736,
310,
281,
4796,
253,
3541,
2289,
285,
1056,
5919,
13782,
275,
253,
22849,
281,
10265,
1175,
347,
83,
3045,
50276,
1189,
455,
253,
2929,
310,
973,
17194,
285,
973,
3542,
50276,
35529,
891,
452,
1563,
7350,
50276,
18,
352,
310,
417,
2590,
432,
253,
2929,
1880,
1097,
253,
3733,
285,
17032,
403,
5196,
327,
12691,
4095,
390,
760,
253,
17032,
50276,
74,
5467,
352,
310,
253,
6158,
533,
16216,
1089,
352,
11120,
5393,
275,
253,
2929,
50274,
19,
253,
17947,
4824,
562,
275,
253,
2929,
310,
625,
327,
253,
985,
1268,
285,
253,
38135,
310,
417,
42935,
1534,
50276,
20,
619,
2201,
4468,
310,
326,
253,
2361,
259,
398,
327,
37280,
75,
285,
20445,
20867,
9162,
7200,
403,
3240,
745,
50276,
938,
1229,
259,
398,
323,
37280,
75,
50276,
3088,
417,
1646,
281,
320,
31998,
275,
1524,
4893,
50276,
25224,
27459,
891,
13414,
1014,
1158,
436,
3045,
310,
1805,
685,
973,
32927,
305,
2188,
44264,
19463,
3210,
970,
247,
362,
2562,
4193,
29810,
50276,
44295,
3062,
627,
310,
642,
2590,
20721,
2439,
253,
6949,
35615,
50276,
249,
2426,
273,
3045,
50276,
531,
1953,
310,
604,
581,
5605,
281,
8745,
824,
271,
327,
10933,
985,
534,
10336,
3091,
320,
6777,
50274,
21,
247,
625,
2087,
4385,
327,
253,
789,
14859,
50276,
249,
253,
2929,
50276,
7053,
273,
512,
253,
327,
10933,
3541,
2523,
12516,
247,
5536,
7658,
327,
253,
5350,
273,
19463,
3210,
534,
588,
3012,
8513,
253,
14053,
14603,
323,
253,
277,
9866,
3169,
19463,
3210,
50276,
22412,
4715,
19463,
3210,
476,
562,
32231,
305,
2188,
44264,
984,
597,
476,
897,
1781,
1566,
5350,
342,
1077,
3676,
285,
2570,
35615,
672,
247,
1781,
2408,
273,
3733,
941,
310,
2130,
50276,
9815,
323,
260,
18038,
672,
253,
3733,
941,
310,
3710,
50276,
953,
3045,
310,
2080,
7197,
685,
253,
9769,
277,
9866,
44264,
1566,
1339,
3815,
247,
6313,
990,
936,
423,
8142,
1293,
970,
6024,
298,
78,
285,
19034,
50276,
338,
259,
71,
296,
3169,
1086,
351,
398,
5889,
273,
259,
71,
7752,
273,
298,
78,
19034,
285,
372,
22473,
4762,
2930,
403,
908,
840,
253,
3541,
2523,
588,
2553,
969,
50276,
187,
187,
4118,
18435,
27,
249,
436,
789,
253,
4477,
2589,
4679,
970,
11640,
273,
391,
79,
2224,
285,
305,
456,
260,
79,
2224,
327,
247,
6519,
8981,
4836,
17194,
407,
253,
4736,
273,
8493,
253,
15180,
6095,
672,
45021,
841,
3210,
327,
6109,
4095,
1223,
436,
310,
271,
1774,
4468,
323,
8542,
19007,
273,
347,
83,
2718,
253,
2022,
7350,
4469,
407,
253,
30628,
310,
326,
253,
789,
19756,
38135,
2007,
253,
4477,
4327,
281,
7409,
260,
18038,
1754,
2718,
534,
3283,
5810,
841,
3210,
403,
417,
1375,
23037,
14387,
323,
347,
83,
285,
347,
824,
352,
310,
1892,
281,
5963,
253,
3486,
273,
436,
789,
327,
247,
1375,
23037,
14387,
12691,
347,
83,
985,
4720,
352,
651,
320,
12912,
281,
25464,
1543,
327,
247,
1199,
4067,
20689,
824,
347,
40211,
261,
365,
5036,
390,
5234,
4697,
1754,
327,
253,
42293,
3061,
432,
253,
30628,
253,
913,
18726,
326,
253,
789,
275,
253,
1246,
830,
943,
320,
10945,
209
] |
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
the authors introduce the problem of telegraphic summarization given a sentence we want to reduce its size while retaining its meaning with no penalty for grammatical mistakes the main application presented by the author is that of summarizing fictional stories and plays the setting proposed by the author prescribes that the summarized sentence can be obtained from the input sentence by dropping some words so for example the simplest baseline for this problem would consist of simply dropping stop words the approach proposed is basically an autoencoder consisting of a 2step encoderdecoder network in the first step the sentence is encoded into a vector which is in turn decoded to a smooth indicator vector to mask words in the sentence in the second step the masked sentence is encoded into a vector which is in turn decoded into the output summarized sentence the optimization is a tradeoff between recoverability of the input sentence and norm of the indicator vector how many words are dropped in order for the network not to learn repetitive masking patterns eg drop first half of the sentence or drop every other word an additional loss is introduced that penalizes keeping easily inferable words or dropping hardtoinfer words

concerns the problem doesnt seem to be wellmotivated also the length of the obtained summarized sentences is 70% that of the original sentences which makes the summaries seem not very useful the proposed complex architecture seems not to justify the goal especially considering that simply dropping stop words works already quite well in order for the presented architecture to beat the simple stopwords baseline an additional loss l4 linkage loss with retention weights which need to be tuned manually as hyperparameters is required theres not enough discussion about the related work by malireddy et al which is extremely similar to this paper a good part of that work overlaps with this paper comparison with literature about abstractive summarization is completely missing

minor comments figure 1 indicator encoder should be indicator decoder are negations part of your stop words from your discussion you should make sure that not dont doesnt do not belong to your stop word set how did you optimize the hyperparameters r desired compression the regularization weights and the retention weights were pretrained word embeddings used as initialization whats the average compression of golden sentences

the authors consider the problem of telegraphic sentence compression they train a system in an unsupervised fashion to predict which words can be dropped from a sentence without drastic loss of information to that end they propose a new autoencoding type architecture which uses the extracted words as latent code and most importantly a linkage loss which relates a words perplexity given the summary of its left context to its likelihood of being retained the model itself is sober and well motivated and the linkage loss is to the best of my knowledge original the authors show that their method outperforms some simple baselines in terms of rouge and compression on a small humanannotated test set

the paper is generally well written although the initial presentation of the model could be made a little clearer it is not obvious from the text that the decoder takes the text as input figure 2 helps but comes a couple pages later however the authors fail to appropriately justify the choice of their hyperparameters eg the optimum value of r for our experiments was found to be 0.65 the best value of b was found to be 5 the weights 1 2 3 and 4 have been set to 3 2 50 and 3 respectively for our experiments how is best measured on the validation set which does not have gold references the choice of the specific sparsity constraint one could as well imagine using a simple l1 regularization for the binarization loss and of chii why not simply use the likelihood could also be better motivated the model also relies on handcrafted rules section 3.3 whose effect needs to be made more evident what weights are used in practice how were they chosen we observed that needs to be further developed the authors claim that the quantitative scores are not affected significantly but that is presumably only the rouge score what about annotators preferences

most importantly however the task of telegraphic sentence compression whose usefulness is not a priori obvious is barely motivated the authors refer to malireddy et al 2018 for a justification but it is important to note that the latter provides a telegraphic summary of a whole document with a compression factor of 0.37 the claim is that the concatenation of the telegraphic sentence compression can act as a summary of a whole document but given the fact that compression for individual sentences is closer to 0.69 this is yet to be demonstrated and even if that were true it is unclear whether the cognitive load of reading a sequence of telegraphic sentences would be that much lower than that of reading the original text

this paper presents some interesting ideas and is well written but the content is not quite sufficient for publication in addition to the clarifications and justifications requested above the authors are encouraged to apply their methods to full length documents which would make for a more substantial contribution

the paper explores an unsupervised deep learning model for extractive telegraphic summaries which extracts text fragments eg fragments of a sentence as summaries the paper is in general well structured and is easy to follow however i think the submission does not have enough content to be accepted to the conference first in terms of methodology as described in section 3 the paper has little novelty there has been intensive study using various deep learning models on summarization the models described in the paper contain little novelty compared with previous work using autoencoders and lstms for both extractive and abstractive summarization

second the paper claims contributions on using deep learning models on telegraphic summarization but the advantage is not well demonstrated for example the advantage of the resulting summary is not compared with stateoftheart sentence compression models with intrinsic evaluation or probably better with extrinsic evaluation by the way it is interesting that the paper argues the advantage of using telegraphic summaries for fictional stories but actually gives an example the earthquake tokyo 12 dead example which looks also very typical in news articles third there has been much work on speech summarization that summarizes with the telegraphic style this is natural considering speech transcripts are often nongrammatical and telegraphic style summaries focusing on choosing informative fragments actually result in usable summaries the authors may consider discussing such work and compare the proposed methods to it
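editorial aside: for readers skimming these reviews, here is a minimal hypothetical sketch of the two-step masking autoencoder objective they describe (reconstruction of the sentence traded off against the norm of the word-keep indicator vector). it is an illustration only, not the authors' implementation; the module names, sizes, and the target compression ratio are assumptions.

```python
import torch
import torch.nn as nn

class TelegraphicCompressor(nn.Module):
    """Step 1 predicts a soft keep/drop indicator per word; step 2
    reconstructs the sentence from the masked embeddings."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.enc1 = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.indicator_head = nn.Linear(hid_dim, 1)    # smooth indicator
        self.enc2 = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)      # reconstruct words

    def forward(self, tokens):                         # tokens: (batch, seq)
        emb = self.embed(tokens)
        h1, _ = self.enc1(emb)
        keep = torch.sigmoid(self.indicator_head(h1))  # (batch, seq, 1)
        h2, _ = self.enc2(emb * keep)                  # soft word dropping
        return self.out(h2), keep.squeeze(-1)

def compression_loss(logits, tokens, keep, target_ratio=0.65, lam=1.0):
    """Recoverability of the input traded off against sparsity of the
    indicator vector (the r / regularization-weight trade-off above)."""
    rec = nn.functional.cross_entropy(logits.transpose(1, 2), tokens)
    sparsity = (keep.mean() - target_ratio).abs()
    return rec + lam * sparsity
```

the linkage loss highlighted by the second reviewer would add a further term tying a word's keep probability to how hard the word is to infer from the summary of its left context; it is omitted here for brevity.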
### Summary: | this paper presents methods for telegraphic summarization a task that generates extremely short summaries there are concerns about the utility of the task in general and also the novelty of the modeling framework there is overall consensus between reviewers regarding the papers assessment the feedback is lukewarm
[input_ids | attention_mask | labels for this example: token-ID sequences omitted; the attention_mask is a run of 1s of the same length]
Below is a review of a research paper from a conference or journal. Please write a summary of the review.
### Review:
this work tackles the task of forecasting dynamics in different domains simultaneously using an encoder which is trained to determine the task the inferred latent vector is then used to adapt a forecasting network to the task at hand experiments on three datasets linked to fluid dynamics are then conducted to assess the proposed model

pros this is an interesting problem which is quite timely given the development of the field of forecasting physical dynamics using neural networks the proposed solution seems sound and principled moreover it is well motivated and the writing was quite clear the different additions made to the forecaster network are also quite interesting i especially liked the adapad solution to deal with boundary conditions conducting an ablation study also considerably strengthens the paper

cons all experiments are conducted on somewhat similar datasets which are based on fluid dynamics pdes it would be nice to see how the model deals with other families of dynamics especially given the fact that the contributions of this work seem geared towards practical considerations the setting of the experiments should be more precise and additional details should be given how are the different datasets constructed what supervision is there exactly regarding the different tasks how many domains are there in each dataset and what are the differences how is the balance between the different domains etc

this is a good work on a timely subject the contribution is not groundbreaking but should be significant enough to warrant acceptance

this paper addresses the problem of learning a deep learning model for dynamics forecasting which generalizes to changes in dynamics these changes can be induced by different parameters boundary conditions or external forces the proposed model takes a metalearning approach and proposes to partition data into different heterogeneous domains it consists of two components an encoder which infers timeinvariant features given observed domain data and a forecaster which predicts the dynamics given these features the paper evaluates the proposed approach on several datasets and provides some theoretical insights this paper addresses a new and interesting generalization problem for dynamics forecasting it proposes a model to address different changes in the dynamics evaluation is done on relevant datasets with several baselines and some ablation studies

the applicability of the proposed approach is restricted to problems where relevant weak supervision from task parameters is available this seems like an important limitation in realworld applications how valid is this scenario the question of choosing relevant parameters for weak supervision is important for applying this model to other datasets yet the definition of these parameters is unclear how robust is the model when chosen parameters are not useful the performance of wrongenc table 2 tends to say that this model will then fail it is unclear why the model can adapt to changing boundary conditions with adapad as it generates them from features hatzc extracted from data inside the domain and weakly supervised by quantities unrelated to the boundary condition eg mean vorticity or season the theoretical analysis inspired by existing work in multitask learning domain adaptation has some limitations and does not add much value to the paper i have some concerns with the domain adaptation upperbound to the target error in theorem 3.4 and proposition 3.5 this upperbound is not minimized thus the target risk can be high ie the model is not guaranteed to adapt well moreover the validity of the theoretical analysis is unclear as several assumptions may not be verified eg bounded loss in theorem 3.1 proposition 3.3 lipschitz continuity in proposition 3.5 theorem 3.4 requires that the assumptions in theorem 2 in redko et al 2017 are verified yet these assumptions are not mentioned in the paper some ablation studies are missing 1 the contribution of each term in equation 2 and 2 the dimensionality of hatzc which is fixed arbitrarily

other questions it would be good to better explain how the experiments include changing boundary conditions between domains the testing scenarios only mention different initial conditions or external forces why do the baselines resnetc and unetc not adapt well despite having access to relevant weak supervision p8 this is the same information used by the proposed model to adapt how redundant is the time invariance term 3rd term in equation 2 with the invariances enforced in the architecture of the encoder

this paper tackles a new generalization problem for dynamics forecasting and proposes a model supported by experimental results however this model can only be applied to problems with relevant weak supervision which may not always be available in practice moreover the definition of relevant parameters is unclear and the robustness of the model to the choice of these parameters is not measured which may restrict its application to other datasets there are also unclarities on the ability of the model to adapt to changing boundary conditions with adapad some ablation studies are missing and i have concerns on the theoretical analysis which brings limited value to the paper for this reason i am giving this paper a weak reject

postrebuttal comments i thank the authors for their response after studying it the theoretical results still have some major issues and feel disconnected from the model in particular key assumptions are not enforced in the model eg lipschitz continuity and the generalization error of the model in theorem 3.3 is uncontrolled as the upperbound is not minimized by the model the wasserstein distance between domains is fixed and is high in all generality its use for the model is thus not very convincing on practical aspects the capability of handling boundary conditions should be better justified and evaluated for this reason i keep my score unchanged and recommend rejecting this paper

the paper suggests a remediation for a common problem for dynamics forecasting which is the lack of generalization to other domains or tasks the authors suggest to tackle this via a 2 component architecture one for learning the task and one for forecasting in empirical experiments the authors show the practical feasibility of their approach

as a caveat im not an expert in the area so my review consequently remains on a superficial level for which i apologize i overall liked the paper quite a bit the question discussed is relevant the empirical evaluation is very good the theoretical results seem as relevant as they would get and the related work discussed is crisply presented and relevant

one question i would have is that results in table 1 are overwhelmingly good with only unetc coming close do we know for these tasks what the theoretical upper bound eg by the right pde system would be is it computationally even possible to compute this upper bound im wondering how much of a gap there still is to close in a similar vein what is the intuition behind dyad resnet being better than dyad unet mostly are there some complementary strengths between dyad and resnet that this combination can exploit better than dyad unet

this is a good paper that id like to see accepted for its combination of theoretical results empirical results and methodological novelty

this paper is interested in learning general forecasting models for physical dynamical processes the paper proposes a decomposition of such a model into an encoder that captures the innate properties of the system and a forecaster that autoregressively makes predictions conditioned on the encoded properties this is framed as a metalearning approach and is shown to substantially outperform singletask approaches and offtheshelf metalearning approaches across multiple datasets the paper provides some theoretical analysis and qualitative analysis of what is learned overall the paper shows that learning shared models across domains is an important and fruitful way forward for modeling physical processes with machine learning

strengths the problem statement is wellmotivated learning generalizable deep learning models across diverse settings is an important open problem experiments use interesting and realworld problems results are strong and appear reliable adapad is an interesting idea specialized to the case of physical complex systems since it is designed to address boundary condition issues visualizations show the model is behaving essentially as expected although there are many design choices that go into the model each such design choice is wellmotivated aside from some aspects of the theory section the exposition is generally quite clear and wellorganized assumptions are made clear the fact that the encoder can be trained first and independently of the forecaster should be very useful for further rapid developments great to see ese metric used as a complement to raw error table in appendix showing alternatives to adain is very useful in increasing confidence in adain for this application

weaknesses the biggest concern is the theory section the multitask learning and domain adaptation results are general results that are not adequately connected back to the specific model and problem the paper is considering yes it is widely accepted that multitask learning and domain adaptation can work well especially when tasks are related in some measurable way and it can be a useful exercise to restate existing theory in the language of your framework but what if any novel claims is the theory implying are there any predictions the theory makes about the particular approach which can be validated in experiments the theoretical bound on error that decomposes the error of the encoder and forecaster is similarly lacking in its interpretation yes it can be a useful exercise to show that the error can be decomposed along the lines of the model but does this bound somehow suggest that the decomposition results in lower error than a monolithic model or is it showing that you can work independently on improving either part of the model and improve the overall error where is there potential for practical value in this theorem for example one place there could be potential to validate the theory is to check in experiments that task pairs with lower wasserstein distance actually support better domain adaptation however in the introduction of the paper it acknowledges that even the slightest change in these features may lead to vastly different phenomena but doesnt that suggest that wasserstein distance may not be a useful metric here for measuring task similarity couldnt turbulence limit the usefulness of such a metric proposition 3.3 says the bound is strictly looser than the bound in theorem 3.1 for clarity it would be very helpful to combine the bounds into an inequality showing this strictlylooser property it is not immediately apparent from the statement of the theorems since the inequalities contain different terms as is the theory doesnt really hurt the paper but for the amount of space dedicated to it it doesnt add much the paper could be substantially improved by either 1 adding interpretation / predictions / validation of the theory that connect it back to the approach in the paper or 2 removing some of the less useful parts of the theory from the main paper to free up space for more of the interesting analysis of what the model actually learns also it is interesting but a bit counterintuitive that the theory section relies on results in multitask learning and domain adaptation instead of theoretical results from the metalearning literature as is since the paper relies on multitask learning so much it is missing references to related work in multitask learning ie related work outside of modeling physical dynamical systems similarly it would be helpful to mention why there are no comparisons to multitask learning or domain adaptation methods in the experiments why do they not apply here the three terms in the loss function of the encoder are wellmotivated but it is not clear how important each term is ablations on these terms would be very informative for the reader to understand whats generally required to train an encoder in section 5 it says varsepnet employs separation of variables through different loss terms what are these loss terms and how are they different from the ones in the paper in the ablations with no encoder how do adain and adapad work dont they require some z where does this come from if not from the encoder unet does seem it could be at a qualitative disadvantage compared to dyad in terms of number of parameters especially since unet c is one of the more competitive baselines it would be useful to see results for a larger unet c or at least some evidence that the unet is not underfitting the training data

additional question of interest overall this is a very important and potentially deep line of research the most exciting promise of such work is the potential of revealing shared regularities across vastly disparate dynamic systems that is across complex physical processes and it seems the approach in the paper could be particularly wellsuited to such research for example the authors could train a single encoderforecaster model across all the datasets in the paper and analyze relationships in the learned encodings across datasets training models across highly diverse domains has been tried in multitask learning eg pretrained transformers as universal computation engines arxiv 2021 the traveling observer model iclr 2021 modular universal reparameterization neurips 2019 one model to learn them all arxiv 2017 is such a generalization part of the longer term vision for this line of work

minor comments in section 2.4 some references would be useful in the sentence ending with the combined force equation there are several inconsistencies in the use of parentheses in citations throughout the paper correcting these would improve readability in last sentence of first paragraph of section 4 the word task could be changed to something like problem since task has another meaning in the paper should the 726 for unetc on ocean currents future be bolded in the last paragraph of section 5.1 we tried to vary we tried varying or we varied appendix a.2.1 footnote for phiflow is on the wrong page appendix a.2.1 the last paragraph seems like it should be the first paragraph of a.2.2 in proof of proposition b.5 there is an extra or missing set of norm bars in the first inequality

overall this is very interesting and useful work the problem is wellmotivated and the approach and experiments are carefully designed and generally convincing if the concerns about the theory are addressed i would be happy to increase my score adding the additional info and experiments requested could increase it further and make this a particularly strong paper
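editorial aside: to make the encoder / forecaster conditioning these reviews keep referring to more concrete, here is a minimal hypothetical sketch of an adain-style conditioned forecaster with an autoregressive rollout. it illustrates the general mechanism only, not the authors' code; all class names, layer sizes, and the latent dimension are assumptions.

```python
import torch
import torch.nn as nn

class AdaIN2d(nn.Module):
    """Adaptive instance norm: re-scale and shift the forecaster's feature
    statistics using a per-domain context vector z_c from the encoder."""
    def __init__(self, z_dim, channels):
        super().__init__()
        self.norm = nn.InstanceNorm2d(channels, affine=False)
        self.to_scale_shift = nn.Linear(z_dim, 2 * channels)

    def forward(self, feat, z_c):                  # feat: (B, C, H, W)
        gamma, beta = self.to_scale_shift(z_c).chunk(2, dim=-1)
        gamma = gamma[:, :, None, None]
        beta = beta[:, :, None, None]
        return (1 + gamma) * self.norm(feat) + beta

class ConditionedForecaster(nn.Module):
    """One-step predictor u_t -> u_{t+1}, modulated by the domain code."""
    def __init__(self, in_ch=2, hid=64, z_dim=16):
        super().__init__()
        self.inp = nn.Conv2d(in_ch, hid, 3, padding=1)
        self.adain = AdaIN2d(z_dim, hid)
        self.body = nn.Sequential(
            nn.ReLU(), nn.Conv2d(hid, hid, 3, padding=1),
            nn.ReLU(), nn.Conv2d(hid, in_ch, 3, padding=1))

    def forward(self, u_t, z_c):
        h = self.adain(self.inp(u_t), z_c)
        return u_t + self.body(h)                  # residual one-step update

def rollout(model, u0, z_c, steps):
    """Autoregressive multi-step forecast conditioned on z_c."""
    preds, u = [], u0
    for _ in range(steps):
        u = model(u, z_c)
        preds.append(u)
    return torch.stack(preds, dim=1)               # (B, steps, C, H, W)
```

in this sketch z_c plays the role of the per-domain context the encoder infers; the rollout is where the reviewers' questions about error accumulation, weak supervision on z_c, and boundary handling would show up in practice.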
### Summary: | the paper addresses the problem of domain generalization for learning spatiotemporal dynamics it proposes a solution where an encoder captures some characteristics of a given environment and a forecaster autoregressively predicts future dynamics conditioned on the characteristics learned by the encoder said otherwise the forecaster learns the general form of dynamics parameterized by an environment representation extracted by the encoder the conditioning is implemented via an adaptive instance normalization mechanism a form of padding is also introduced in order to take into account boundary conditions the two components encoder and forecaster are trained sequentially this approach is cast in a metalearning framework theoretical results inspired by multitask learning and domain adaptation are also demonstrated the model is evaluated and compared to different baselines on three problems and for two different settings varying initial conditions with a given dynamics and dynamics with varying parameters

this is a borderline paper it targets a timely and important problem of domain generalization for dynamic environments the proposed solution is original and compares well experimentally to several baselines it allows for better generalization performance for the two test settings considered in the current version the paper however suffers from different weaknesses first there is the imprecision of the arguments and the description of the experiments some of the arguments and claims are vague and sometimes abusive not backed up by evidence for example a central claim is that the encoder learns time invariant quantities characterizing the environment when the learned representations indeed change with a time shift in the input for any environment the same goes for the argument developed for the padding construction it is claimed to model boundary conditions but this is not supported by any theoretical or empirical evidence as noted by the reviewers the theoretical analysis is disconnected from the algorithmic and experimental developments and does not bring much additional value to the paper what is more embarrassing is that some of the claims in this section are overstated and induce incorrect conclusions from theorem 3.1 and proposition 3.3 the authors suggest that multitask learning leads to better generalization than learning independently while this is not formally guaranteed by the results this is acknowledged by the authors in a later comment besides the conditions of validity are not discussed while they seem to only cover situations for which the train and the test distributions are the same the same holds for the second theoretical result theorem 3.4 it is claimed that this result supports the authors idea of training encoder and forecaster sequentially while it does not besides the bounds in this result cannot be controlled as noted by the reviewers and are not useful in practice

overall the paper addresses an important topic and proposes new solutions the results are promising and it is indeed an interesting contribution however inaccuracies and incorrect or exaggerated claims make it difficult to accept the current version of the article the article would make a strong and innovative contribution if it were written as a purely experimental article with a detailed description of the experiments and comparisons
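editorial aside: the padding mechanism questioned in this meta-review (boundary values generated from the inferred context rather than zero padding) can be pictured with a small hypothetical sketch like the one below; it is not the paper's adapad implementation, and the names and shapes are assumptions.

```python
import torch
import torch.nn as nn

class ContextPad2d(nn.Module):
    """Pad a feature map with values predicted from the domain code z_c
    instead of zeros, so border cells carry learned boundary behaviour."""
    def __init__(self, z_dim, channels, pad=1):
        super().__init__()
        self.pad = pad
        self.boundary_head = nn.Linear(z_dim, channels)

    def forward(self, feat, z_c):                      # feat: (B, C, H, W)
        b = self.boundary_head(z_c)[:, :, None, None]  # (B, C, 1, 1)
        padded = nn.functional.pad(feat, [self.pad] * 4)   # zero-pad first
        border = torch.ones_like(padded)
        border[:, :, self.pad:-self.pad, self.pad:-self.pad] = 0.0
        return padded + border * b                     # fill only the border
```

whether such generated borders actually capture physical boundary conditions is exactly the point the meta-review says is not demonstrated.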
| [input_ids token-id array (...TRUNCATED)] | [attention_mask array of all 1s (...TRUNCATED)] | [labels token-id array mirroring input_ids (...TRUNCATED)] |
"Below is given review of a research paper from cnoference journal. Please write a summary the revie(...TRUNCATED) | "the paper provides a theoretical analysis of sparsely activated neural networks they introduce lsh (...TRUNCATED) | [30003,310,1677,2278,273,247,2561,2929,432,260,2369,1793,6698,15,7764,3630,247,6010,253,2278,15,187,(...TRUNCATED) | [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1(...TRUNCATED) | [30003,310,1677,2278,273,247,2561,2929,432,260,2369,1793,6698,15,7764,3630,247,6010,253,2278,15,187,(...TRUNCATED) |
"Below is given review of a research paper from cnoference journal. Please write a summary the revie(...TRUNCATED) | "the paper augments pretrained language models by introducing adapter where each adapter is another (...TRUNCATED) | [31644,1566,41731,13015,687,589,893,3210,275,1027,8892,50276,249,2593,4562,417,2217,2317,556,644,185(...TRUNCATED) | [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1(...TRUNCATED) | [31644,1566,41731,13015,687,589,893,3210,275,1027,8892,50276,249,2593,4562,417,2217,2317,556,644,185(...TRUNCATED) |
"Below is given review of a research paper from cnoference journal. Please write a summary the revie(...TRUNCATED) | "this paper studies how to order inmemory sequences for graph embedding there was a positive consens(...TRUNCATED) | [30003,310,1677,2278,273,247,2561,2929,432,260,2369,1793,6698,15,7764,3630,247,6010,253,2278,15,187,(...TRUNCATED) | [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1(...TRUNCATED) | [30003,310,1677,2278,273,247,2561,2929,432,260,2369,1793,6698,15,7764,3630,247,6010,253,2278,15,187,(...TRUNCATED) |
"Below is given review of a research paper from cnoference journal. Please write a summary the revie(...TRUNCATED) | "this paper revisits the information bottleneck principle but in terms of the compression inherent i(...TRUNCATED) | [253,2929,2319,285,1347,4679,281,921,604,253,13260,403,3588,275,253,4679,253,4477,1408,50276,44295,3(...TRUNCATED) | [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1(...TRUNCATED) | [253,2929,2319,285,1347,4679,281,921,604,253,13260,403,3588,275,253,4679,253,4477,1408,50276,44295,3(...TRUNCATED) |