hysts (HF staff) committed
Commit: 56c614d
Parent: f80cc5f

Upload folder using huggingface_hub

Files changed (12)
  1. 0.codes.pt +2 -2
  2. 0.metadata.json +2 -2
  3. 0.residuals.pt +2 -2
  4. avg_residual.pt +1 -1
  5. buckets.pt +1 -1
  6. centroids.pt +1 -1
  7. collection.json +0 -2
  8. doclens.0.json +1 -1
  9. ivf.pid.pt +2 -2
  10. metadata.json +4 -4
  11. pid_docid_map.json +1073 -1075
  12. plan.json +4 -4
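
The commit message above indicates these index files were pushed with the huggingface_hub library. As a rough sketch of how such an upload is typically performed (the repo id, repo type, and local folder path below are placeholders, not values taken from this commit):

# Hypothetical upload sketch; repo_id, repo_type, and folder_path are assumptions.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path=".ragatouille/colbert/indexes/my_index",  # local index folder (illustrative path)
    repo_id="username/my-colbert-index",                  # placeholder target repo
    repo_type="dataset",                                  # assumption; a model repo also works
    commit_message="Upload folder using huggingface_hub",
)

Binary files such as the .pt tensors are stored through Git LFS on the Hub, which is why the diffs below show only LFS pointer changes (oid and size) rather than file contents.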
0.codes.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c377244d8c9597e23f59998ce2976953f0f97d12fd0db77c2cb4547de879ec53
- size 2191452
+ oid sha256:f1c65a606623614fe302b76aadeb608c567bd0c58440011c536c03a7428c7d14
+ size 2190172
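
Only the LFS pointer (oid and size) changes here; the tensor itself lives in LFS storage. If one wanted to inspect the updated file, a sketch along these lines would work (the repo id and repo type are placeholders, and loading assumes the file is a standard torch-serialized object):

# Hypothetical download-and-inspect sketch; repo_id and repo_type are assumptions.
from huggingface_hub import hf_hub_download
import torch

path = hf_hub_download(
    repo_id="username/my-colbert-index",  # placeholder
    filename="0.codes.pt",
    repo_type="dataset",                  # assumption
)
codes = torch.load(path, map_location="cpu")
print(type(codes), getattr(codes, "shape", None))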
0.metadata.json CHANGED
@@ -1,6 +1,6 @@
  {
  "passage_offset": 0,
- "num_passages": 3150,
- "num_embeddings": 547570,
+ "num_passages": 3148,
+ "num_embeddings": 547258,
  "embedding_offset": 0
  }
0.residuals.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2a01374756d947aa5dc1edff475128331996540caf7c170a5b11e71a700622ba
- size 70090160
+ oid sha256:1b64f810aea583bdc26250a47d1bf048210b2423b874295a890889c55d34e91e
+ size 70050224
avg_residual.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:85547f2aeba6b59b0b140a8dd9211dc9738c7c5ff590dab46093f362244c60c7
+ oid sha256:0815c221ae0afd4d721aec947578950449bde0364569458e9d707cee879d6445
  size 1205
buckets.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c178e01facb8d5c0022c69d77e34d3172719ea40d8fffd45eb1207f9f43a1f78
+ oid sha256:91aae4afb4f5495325aac7dcf79f568b24981263eaeeedc8f728e265d4e784cb
  size 2904
centroids.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:16cd5463bdb2bc4e3f41fb18610d44a637c252c32e32fd8035b4470bc41631b0
+ oid sha256:b2c81f0b56958c55d500ee0956ede8727fb3f05a61cb9682b844a5b2d93f330c
  size 2098342
collection.json CHANGED
@@ -2060,8 +2060,6 @@
  "We demonstrate the empirical effectiveness of our method by turning the open-source unconditional protein diffusion model Genie into the conditional model with no retraining. Generated proteins exhibit the desired dynamical and structural properties while still being biologically plausible. Our work represents a first step towards incorporating dynamical behaviour in protein design and may open the door to designing more flexible and functional proteins in the future.",
  "Machine learning has demonstrated remarkable performance over finite datasets, yet whether the scores over the fixed benchmarks can sufficiently indicate the model\u2019s performance in the real world is still in discussion. In reality, an ideal robust model will probably behave similarly to the oracle (e.g., the human users), thus a good evaluation protocol is probably to evaluate the models\u2019 behaviors in comparison to the oracle. In this paper, we introduce a new robustness measurement that directly measures the image classification model\u2019s performance compared with a surrogate oracle (i.e., a zoo of foundation models). Besides, we design a simple method that can accomplish the evaluation beyond the scope of the benchmarks. Our method extends the image datasets with new samples that are sufficiently perturbed to be distinct from the ones in the original sets, but are still bounded within the same image-label structure the original test image represents, constrained by a zoo of foundation models pretrained with a large amount of samples. As a result, our new method will offer us a new way to evaluate the models\u2019 robustness performance, free of limitations of fixed benchmarks or constrained perturbations, although scoped by the power of the oracle. In addition to the evaluation results, we also leverage our generated data to understand the behaviors of the model and our new evaluation strategies.",
  "We introduce Clifford Group Equivariant Simplicial Message Passing Networks, a method for steerable $\\mathrm{E}(n)$-equivariant message passing on simplicial complexes. Our method integrates the expressivity of Clifford group-equivariant layers with simplicial message passing, which is topologically more intricate than regular graph message passing. Clifford algebras include higher-order objects such as bivectors and trivectors, which express geometric features (e.g., areas, volumes) derived from vectors. Using this knowledge, we represent simplex features through geometric products of their vertices. To achieve efficient simplicial message passing, we share the parameters of the message network across different dimensions. Additionally, we restrict the final message to an aggregation of the incoming messages from different dimensions, leading to what we term *shared* simplicial message passing. Experimental results show that our method is able to outperform both equivariant and simplicial graph neural networks on a variety of geometric tasks.",
- "We study adversarial bandit problems with potentially heavy-tailed losses. Unlike standard settings with non-negative and bounded losses, managing negative and unbounded losses introduces a unique challenge in controlling the ``stability'' of the algorithm and hence the regret. To tackle this challenge, we propose a Follow-the-Perturbed-Leader (FTPL) based learning algorithm. Notably, our method achieves (nearly) optimal worst-case regret, eliminating the need for an undesired assumption inherent in the Follow-the-Regularized-Leader (FTRL) based approach. Thanks to this distinctive advantage, our algorithmic framework finds novel applications in two important scenarios with unbounded heavy-tailed losses. For adversarial bandits with heavy-tailed losses and Huber contamination, which we call the robust setting, our algorithm is the first to match the lower bound (up to a $\\polylog(K)$ factor, where $K$ is the number of actions).",
- "For adversarial bandits with heavy-tailed losses and Huber contamination, which we call the robust setting, our algorithm is the first to match the lower bound (up to a $\\polylog(K)$ factor, where $K$ is the number of actions). In the private setting, where true losses are in a bounded range (e.g., $[0,1]$) but with additional Local Differential Privacy (LDP) guarantees, our algorithm achieves an improvement of a $\\polylog(T)$ factor in the regret bound compared to the best-known results, where $T$ is the total number of rounds. Furthermore, when compared to state-of-the-art FTRL-based algorithms, our FTPL-based algorithm has a more streamlined design. It eliminates the need for additional explicit exploration and solely maintains the absolute value of loss estimates below a predetermined threshold.",
  "While large language models (LLMs) equipped with techniques like chain-of-thought prompting have demonstrated impressive capabilities, they still fall short in their ability to reason robustly in complex settings. However, evaluating LLM reasoning is challenging because system capabilities continue to grow while benchmark datasets for tasks like logical deduction have remained static. We introduce MuSR, a dataset for evaluating language models on multistep soft reasoning tasks specified in a natural language narrative. This dataset has two crucial features. First, it is created through a novel neurosymbolic synthetic-to-natural generation algorithm, enabling the construction of complex reasoning instances that challenge GPT-4 (e.g., murder mysteries roughly 1000 words in length) and which can be scaled further as more capable LLMs are released. Second, our data instances are free text narratives corresponding to real-world domains of reasoning; this makes it simultaneously much more challenging than other synthetically-crafted benchmarks while remaining realistic and tractable for human annotators to solve with high accuracy. We evaluate a range of LLMs and prompting techniques on this dataset and characterize the gaps that remain for techniques like chain-of-thought to perform robust reasoning.",
  "Multi-modal language models (LM) have recently shown promising performance in high-level reasoning tasks on videos. However, existing methods still fall short in tasks like causal or compositional spatiotemporal reasoning over actions, in which model predictions need to be grounded in fine-grained low-level details, such as object motions and object interactions.In this work, we propose training an LM end-to-end on low-level surrogate tasks, including object detection, re-identification, and tracking, to endow the model with the required low-level visual capabilities. We show that a two-stream video encoder with spatiotemporal attention is effective at capturing the required static and motion-based cues in the video. By leveraging the LM's ability to perform the low-level surrogate tasks, we can cast reasoning in videos as the three-step process of *Look, Remember, Reason*, wherein visual information is extracted using low-level visual skills step-by-step and then integrated to arrive at a final answer. We demonstrate the effectiveness of our framework on diverse visual reasoning tasks from the ACRE, CATER, and Something-Else datasets. Our approach is trainable end-to-end and surpasses state-of-the-art task-specific methods across these tasks by a large margin.",
  "While training large language models (LLMs) from scratch can generate models with distinct functionalities and strengths, it comes at significant costs and may result in redundant capabilities. Alternatively, a cost-effective and compelling approach is to merge existing pre-trained LLMs into a more potent model. However, due to the varying architectures of these LLMs, directly blending their weights is impractical. In this paper, we introduce the notion of knowledge fusion for LLMs, aimed at combining the capabilities of existing LLMs and transferring them into a single LLM. By leveraging the generative distributions of source LLMs, we externalize their collective knowledge and unique strengths, thereby potentially elevating the capabilities of the target model beyond those of any individual source LLM. We validate our approach using three popular LLMs with different architectures\u2014Llama-2, MPT, and OpenLLaMA\u2014across various benchmarks and tasks. Our findings confirm that the fusion of LLMs can improve the performance of the target model across a range of capabilities such as reasoning, commonsense, and code generation. Our code, model weights, and data are public at \\url{https://github.com/fanqiwan/FuseLLM}.",
 
doclens.0.json CHANGED
@@ -1 +1 @@
- [231,87,176,106,217,227,206,108,211,148,227,93,192,189,69,224,185,228,221,193,157,223,100,229,191,158,182,227,226,155,210,86,166,204,104,223,218,77,160,171,232,227,217,125,212,62,205,223,178,226,210,116,202,169,198,109,205,98,185,228,235,199,210,209,121,214,210,174,193,121,214,93,217,73,215,101,224,197,143,229,231,200,164,130,225,93,229,203,126,200,224,183,225,81,225,115,199,155,142,157,217,109,222,214,140,202,214,93,179,91,171,225,195,133,196,205,196,226,233,135,217,226,212,227,236,203,159,198,150,204,219,222,56,171,211,153,224,208,226,106,200,218,201,108,220,220,89,227,225,79,233,200,65,214,146,180,200,191,217,132,167,213,146,201,193,161,216,169,190,85,180,96,156,86,194,103,226,189,118,225,68,207,102,198,81,112,205,85,215,210,72,235,91,195,157,197,201,123,211,209,156,193,207,220,157,210,214,213,70,215,112,145,207,175,121,228,175,172,191,232,196,104,215,212,201,165,208,211,225,154,136,203,100,206,154,216,99,178,212,119,218,228,76,129,231,211,89,190,225,79,213,139,218,215,208,104,187,214,223,129,183,140,197,159,191,118,162,227,174,162,224,206,43,203,176,232,170,185,18,230,90,206,207,181,222,62,207,213,62,227,203,192,175,195,216,225,103,159,216,194,223,194,90,190,150,216,205,216,179,193,84,193,192,209,160,222,70,228,68,155,221,227,195,206,147,171,225,106,147,148,208,109,227,104,194,213,199,125,204,148,196,101,218,81,197,152,219,170,76,108,206,210,172,238,218,188,75,226,208,160,211,229,73,115,224,224,209,46,207,216,223,194,212,101,213,222,52,209,210,175,178,209,158,200,74,146,181,204,81,214,120,190,213,153,218,115,217,86,203,219,76,222,85,227,92,207,225,210,220,187,190,201,195,196,193,190,165,195,219,190,175,194,86,185,220,113,194,65,201,213,219,131,218,221,136,223,209,157,186,115,215,105,190,107,224,137,199,73,188,111,166,114,141,205,239,222,218,168,201,111,225,81,204,212,180,226,90,183,194,138,110,168,208,87,196,71,203,225,225,226,92,106,168,196,227,210,222,133,212,138,230,214,103,235,76,218,197,206,142,239,88,200,208,183,173,221,175,220,93,219,53,221,111,202,92,224,109,216,211,81,181,216,150,211,202,227,75,233,227,92,199,178,218,208,105,210,211,48,227,191,221,115,209,212,187,220,101,225,211,196,214,119,209,39,189,81,205,225,85,233,192,213,112,214,141,213,202,91,211,226,111,195,229,201,68,202,163,218,155,201,191,131,199,229,230,208,154,214,225,228,214,133,186,86,208,226,225,147,203,205,100,177,216,85,206,224,162,191,191,200,207,88,213,128,217,220,90,223,85,185,146,194,112,189,204,185,201,216,76,211,218,129,229,232,206,151,199,117,210,229,92,223,97,215,63,217,185,145,158,193,124,222,178,224,113,208,196,171,198,142,212,218,80,193,82,224,105,161,215,137,176,203,216,128,200,79,226,130,169,189,163,192,166,209,167,222,197,75,220,74,213,97,193,94,197,54,230,209,210,162,192,210,75,225,221,217,96,189,202,220,106,215,121,229,200,163,229,81,219,226,224,186,93,225,214,219,151,233,182,144,221,221,221,214,110,179,142,222,200,110,218,206,196,214,92,216,47,208,221,76,225,141,194,128,219,208,159,218,201,220,195,101,202,157,190,137,167,218,212,108,214,228,224,217,74,226,77,231,102,219,112,215,94,181,226,118,171,211,112,208,163,200,155,208,124,221,179,226,229,100,229,221,79,180,219,213,69,224,19,226,224,188,202,202,80,202,67,210,138,215,69,228,77,208,190,199,156,218,79,224,88,229,53,218,180,112,156,173,218,82,207,196,227,201,222,224,36,212,181,137,124,207,223,171,160,228,121,218,212,207,227,77,198,185,211,172,177,229,213,182,213,220,87,150,202,154,231,111,231,229,77,195,170,168,163,221,139,126,206,179,98,164,225,219,102,180,223,75,182,227,199,174,191,131,210,190,231,161,167,215,200,109,212,82,208,212,166,19
0,207,150,218,138,230,160,163,145,185,211,67,228,205,111,154,237,78,185,212,134,191,202,181,71,239,161,205,95,191,216,200,129,222,112,222,211,82,173,161,197,160,195,166,221,236,206,26,164,232,214,111,210,78,212,108,177,145,211,187,203,171,201,218,45,211,100,226,210,80,219,118,147,206,194,226,144,219,219,111,224,115,234,210,189,133,198,176,211,203,64,223,76,217,123,196,191,214,193,121,203,212,198,197,138,226,113,208,176,187,70,226,142,193,86,180,214,210,207,212,76,181,193,187,130,173,119,206,124,207,230,85,226,185,194,158,148,219,234,70,210,71,218,217,194,227,37,202,230,223,180,199,92,227,74,234,128,211,117,190,182,230,209,84,129,226,148,209,223,176,209,219,141,187,128,193,224,133,222,193,92,187,84,230,29,205,209,227,171,227,220,90,116,180,209,210,65,206,200,73,218,204,171,202,158,210,182,193,212,135,150,198,101,227,111,230,177,225,227,223,141,225,127,207,224,227,153,190,126,134,197,117,233,104,178,209,175,195,81,214,219,60,225,217,171,200,31,225,185,166,221,150,145,196,215,73,202,220,215,140,209,87,219,71,171,157,213,232,51,225,138,204,231,215,168,196,215,104,200,222,232,110,199,59,213,201,101,219,103,171,215,199,220,204,159,186,186,147,226,181,163,231,82,222,84,204,113,218,145,183,204,208,226,72,212,156,177,171,220,214,173,222,122,214,168,176,187,81,210,132,206,108,234,220,160,208,47,194,150,122,206,184,210,207,199,95,230,154,212,205,167,168,222,191,100,203,167,215,161,222,84,210,78,237,162,193,108,160,209,59,201,148,167,205,185,212,149,213,56,190,108,217,177,132,190,192,132,201,219,222,82,210,165,184,219,169,181,147,216,103,220,101,184,224,195,230,102,196,196,192,196,222,173,212,218,212,209,134,212,76,209,117,220,52,206,147,164,229,106,196,217,200,203,117,219,230,114,184,57,221,91,226,150,182,202,103,180,216,172,129,193,214,217,153,184,85,228,40,216,83,184,200,226,100,163,176,208,189,219,222,229,122,209,157,176,195,218,100,161,220,221,121,191,66,195,206,118,215,216,123,221,188,50,208,78,230,200,207,95,195,85,194,225,75,201,228,50,213,200,71,176,92,213,84,189,203,127,184,201,166,199,106,222,123,185,114,217,211,218,67,214,75,214,145,224,213,200,95,187,82,195,215,113,153,166,202,97,198,215,49,188,216,215,136,191,131,190,218,217,224,175,218,198,97,171,221,229,214,229,181,167,225,120,179,227,222,220,69,210,69,160,233,188,197,190,215,193,147,169,154,207,197,137,148,136,171,227,93,149,175,228,98,207,103,225,229,112,195,127,197,101,211,217,229,116,223,229,212,223,119,202,89,207,227,78,211,205,127,224,194,214,77,152,145,157,223,231,151,145,212,140,220,221,197,213,110,226,191,175,82,222,145,150,217,206,207,137,226,94,221,68,60,227,200,218,193,208,204,111,200,89,188,135,152,167,205,223,221,169,164,212,95,170,93,116,147,115,166,185,219,183,223,230,200,231,232,220,89,183,122,193,188,216,143,154,161,227,228,116,221,216,219,211,230,163,205,188,111,161,218,224,231,117,170,183,175,223,213,176,200,110,128,225,212,50,206,208,63,191,173,170,195,90,209,166,144,195,176,224,222,68,228,132,207,201,135,167,209,166,123,225,213,190,154,195,222,230,212,88,158,208,110,212,127,226,121,213,70,198,146,221,95,193,164,221,157,232,182,219,52,231,203,49,213,224,76,151,223,88,202,113,190,201,144,158,180,201,182,215,164,30,207,70,195,104,202,194,116,217,80,229,162,159,216,212,113,214,149,198,134,224,75,184,166,224,91,158,221,78,170,223,228,77,221,221,218,201,222,78,178,200,113,204,190,185,202,198,141,205,124,187,227,223,164,114,190,117,223,46,198,136,165,181,64,231,166,182,214,63,220,62,210,57,220,192,189,215,224,182,191,208,71,228,151,214,97,208,229,132,199,197,172,142,209,200,215,232,226,184,155,177,208,83,225,67,159,134,11
7,227,114,172,217,222,148,187,194,82,198,126,167,221,60,126,122,164,184,111,129,194,156,176,211,157,231,83,226,228,130,209,111,187,224,196,213,79,130,192,83,188,221,192,154,179,228,108,201,120,222,141,168,172,172,142,225,193,218,116,223,137,175,77,214,211,127,227,111,216,72,199,97,226,190,209,203,179,126,197,124,116,226,189,202,37,214,53,207,202,196,161,207,168,159,198,65,186,164,207,134,184,215,94,207,220,68,207,102,225,151,220,224,94,208,230,192,201,189,151,157,176,199,219,106,176,155,215,93,186,176,64,188,98,222,108,214,187,137,208,215,224,95,226,220,215,207,219,220,94,222,45,170,169,200,213,88,177,197,131,177,126,229,172,88,213,220,117,231,212,99,156,200,177,188,216,222,127,220,163,209,97,134,207,178,232,99,159,220,168,192,210,113,225,217,223,233,183,186,82,216,221,79,187,195,180,111,162,66,182,199,92,204,79,230,232,232,205,99,143,211,112,207,232,77,229,196,163,149,221,214,222,236,181,230,215,80,180,84,214,122,224,186,223,190,214,211,84,164,130,206,51,216,36,233,222,204,138,208,178,134,213,203,149,211,93,174,228,67,210,50,206,150,227,221,215,66,222,122,226,145,226,196,191,222,205,218,230,162,155,210,216,137,208,118,231,218,206,100,188,108,167,209,81,165,173,220,158,215,80,217,141,214,233,225,212,155,185,167,168,152,186,212,79,230,230,192,208,58,203,102,196,88,189,206,191,201,134,166,104,227,112,211,193,140,174,159,222,82,191,225,67,212,114,169,97,188,146,225,135,214,58,213,103,238,228,205,211,218,216,138,215,139,215,100,217,201,80,205,164,224,169,182,105,181,222,228,194,149,213,96,209,218,101,202,211,139,228,157,192,197,131,176,198,88,221,194,131,152,170,220,62,224,58,198,201,197,181,230,84,219,88,230,206,135,210,108,213,82,203,207,139,217,95,213,81,207,189,160,230,208,134,226,217,83,166,190,117,209,226,226,109,221,147,219,206,214,65,216,193,225,222,223,119,200,217,191,147,223,215,206,93,223,111,218,213,141,158,229,118,192,102,218,227,166,191,200,198,71,141,196,163,185,169,123,143,187,170,95,182,220,119,190,99,225,169,184,132,223,157,205,66,222,156,164,204,158,201,70,222,233,116,179,53,219,132,196,181,224,196,10,196,142,217,164,216,95,122,206,62,209,141,211,94,151,200,202,169,173,174,204,173,209,152,206,105,194,99,161,197,204,73,200,116,213,90,175,171,154,196,216,221,210,57,222,57,212,200,229,196,220,165,166,225,228,203,207,201,232,207,141,170,132,214,195,175,153,220,88,197,191,180,161,198,178,191,222,200,213,140,174,229,50,185,212,128,219,47,200,100,177,188,190,105,191,184,71,190,206,169,187,218,142,218,225,208,134,121,232,177,214,121,183,171,202,76,220,68,192,206,53,219,200,63,175,198,160,224,196,97,198,231,224,86,224,74,137,189,207,91,229,130,160,159,213,90,82,175,172,194,165,158,38,157,216,187,204,90,181,219,199,143,155,129,191,219,184,228,199,186,155,210,208,91,203,192,215,98,177,224,71,225,133,199,234,226,109,201,208,191,209,228,207,205,225,193,215,113,221,141,131,194,85,162,217,101,195,109,227,223,213,195,228,101,168,111,221,197,48,202,171,163,191,209,111,229,225,211,223,207,202,57,232,226,202,227,167,172,228,196,78,222,198,160,212,108,202,136,181,207,171,223,76,211,73,223,174,144,178,184,197,210,77,210,183,177,205,220,204,59,214,109,203,237,203,173,198,115,217,179,176,199,180,152,159,161,85,183,97,227,135,189,154,142,156,217,90,232,188,200,118,222,87,179,170,216,195,126,184,91,188,148,166,199,206,96,182,222,111,202,154,199,192,122,192,220,223,202,215,47,158,168,204,134,153,102,190,165,189,163,223,200,77,180,98,207,124,231,147,230,74,139,170,222,237,167,159,215,232,40,207,138,215,215,219,104,206,142,216,80,205,97,175,120,185,197,224,184,209,81,184,120,174,206,142,231,226,73,217,2
6,222,94,207,130,151,156,183,212,228,194,151,214,88,227,218,212,174,220,167,149,192,192,213,56,151,220,219,118,181,210,204,64,203,114,217,119,204,210,227,58,231,216,176,227,60,220,205,171,159,226,94,153,210,83,158,172,127,223,185,166,206,224,212,61,207,197,156,178,202,28,228,180,198,148,154,197,211,142,164,212,66,195,139,229,234,229,73,207,203,224,135,195,142,206,202,181,90,188,200,128,201,178,213,225,224,115,198,97,197,208,209,221,70,191,133,118,194,196,85,131,188,39,194,216,198,85,229,133,197,109,232,124,209,171,156,203,199,219,188,189,182,196,66,234,184,221,215,217,101,218,108,199,118,226,137,222,106,215,198,133,216,94,205,162,226,121,224,104,217,111,228,119,196,194,134,211,207,207,202,199,168,129,225,75,177,216,212,85,174,227,84,148,199,160,212,224,195,219,141,186,191,206,215,113,195,211,56,186,209,219,148,212,229,157,203,171,227,72,193,199,191,171,215,223,220,117,198,209,101,213,89,205,201,209,144,178,124,218,218,115,196,167,229,54,209,194,111,206,74,202,222,213,225,184,227,69,207,54,212,214,141,212,176,193,105,211,128,187,213,190,215,89,207,96,220,225,64,200,104,215,78,226,182,191,144,215,126,219,83,179,202,132,193,187,224,191,178,102,229,217,188,165,198,45,232,205,91,173,232,196,163,137,177,177,166,211,77,211,103,216,139,146,208,96,211,188,219,202,195,164,229,132,156,161,155,227,207,217,184,124,153,225,160,189,112,197,203,218,177,168,219,130,197,202,223,217,191,223,109,191,92,217,136,216,20,219,89,181,128,215,207,233,212,205,137,166,186,116,161,140,232,215,168,135,187,54,220,120,171,199,220,198,203,96,212,117]
 
+ [231,87,176,106,217,227,206,108,211,148,227,93,192,189,69,224,185,228,221,193,157,223,100,229,191,158,182,227,226,155,210,86,166,204,104,223,218,77,160,171,232,227,217,125,212,62,205,223,178,226,210,116,202,169,198,109,205,98,185,228,235,199,210,209,121,214,210,174,193,121,214,93,217,73,215,101,224,197,143,229,231,200,164,130,225,93,229,203,126,200,224,183,225,81,225,115,199,155,142,157,217,109,222,214,140,202,214,93,179,91,171,225,195,133,196,205,196,226,233,135,217,226,212,227,236,203,159,198,150,204,219,222,56,171,211,153,224,208,226,106,200,218,201,108,220,220,89,227,225,79,233,200,65,214,146,180,200,191,217,132,167,213,146,201,193,161,216,169,190,85,180,96,156,86,194,103,226,189,118,225,68,207,102,198,81,112,205,85,215,210,72,235,91,195,157,197,201,123,211,209,156,193,207,220,157,210,214,213,70,215,112,145,207,175,121,228,175,172,191,232,196,104,215,212,201,165,208,211,225,154,136,203,100,206,154,216,99,178,212,119,218,228,76,129,231,211,89,190,225,79,213,139,218,215,208,104,187,214,223,129,183,140,197,159,191,118,162,227,174,162,224,206,43,203,176,232,170,185,18,230,90,206,207,181,222,62,207,213,62,227,203,192,175,195,216,225,103,159,216,194,223,194,90,190,150,216,205,216,179,193,84,193,192,209,160,222,70,228,68,155,221,227,195,206,147,171,225,106,147,148,208,109,227,104,194,213,199,125,204,148,196,101,218,81,197,152,219,170,76,108,206,210,172,238,218,188,75,226,208,160,211,229,73,115,224,224,209,46,207,216,223,194,212,101,213,222,52,209,210,175,178,209,158,200,74,146,181,204,81,214,120,190,213,153,218,115,217,86,203,219,76,222,85,227,92,207,225,210,220,187,190,201,195,196,193,190,165,195,219,190,175,194,86,185,220,113,194,65,201,213,219,131,218,221,136,223,209,157,186,115,215,105,190,107,224,137,199,73,188,111,166,114,141,205,239,222,218,168,201,111,225,81,204,212,180,226,90,183,194,138,110,168,208,87,196,71,203,225,225,226,92,106,168,196,227,210,222,133,212,138,230,214,103,235,76,218,197,206,142,239,88,200,208,183,173,221,175,220,93,219,53,221,111,202,92,224,109,216,211,81,181,216,150,211,202,227,75,233,227,92,199,178,218,208,105,210,211,48,227,191,221,115,209,212,187,220,101,225,211,196,214,119,209,39,189,81,205,225,85,233,192,213,112,214,141,213,202,91,211,226,111,195,229,201,68,202,163,218,155,201,191,131,199,229,230,208,154,214,225,228,214,133,186,86,208,226,225,147,203,205,100,177,216,85,206,224,162,191,191,200,207,88,213,128,217,220,90,223,85,185,146,194,112,189,204,185,201,216,76,211,218,129,229,232,206,151,199,117,210,229,92,223,97,215,63,217,185,145,158,193,124,222,178,224,113,208,196,171,198,142,212,218,80,193,82,224,105,161,215,137,176,203,216,128,200,79,226,130,169,189,163,192,166,209,167,222,197,75,220,74,213,97,193,94,197,54,230,209,210,162,192,210,75,225,221,217,96,189,202,220,106,215,121,229,200,163,229,81,219,226,224,186,93,225,214,219,151,233,182,144,221,221,221,214,110,179,142,222,200,110,218,206,196,214,92,216,47,208,221,76,225,141,194,128,219,208,159,218,201,220,195,101,202,157,190,137,167,218,212,108,214,228,224,217,74,226,77,231,102,219,112,215,94,181,226,118,171,211,112,208,163,200,155,208,124,221,179,226,229,100,229,221,79,180,219,213,69,224,19,226,224,188,202,202,80,202,67,210,138,215,69,228,77,208,190,199,156,218,79,224,88,229,53,218,180,112,156,173,218,82,207,196,227,201,222,224,36,212,181,137,124,207,223,171,160,228,121,218,212,207,227,77,198,185,211,172,177,229,213,182,213,220,87,150,202,154,231,111,231,229,77,195,170,168,163,221,139,126,206,179,98,164,225,219,102,180,223,75,182,227,199,174,191,131,210,190,231,161,167,215,200,109,212,82,208,212,166,19
0,207,150,218,138,230,160,163,145,185,211,67,228,205,111,154,237,78,185,212,134,191,202,181,71,239,161,205,95,191,216,200,129,222,112,222,211,82,173,161,197,160,195,166,221,236,206,26,164,232,214,111,210,78,212,108,177,145,211,187,203,171,201,218,45,211,100,226,210,80,219,118,147,206,194,226,144,219,219,111,224,115,234,210,189,133,198,176,211,203,64,223,76,217,123,196,191,214,193,121,203,212,198,197,138,226,113,208,176,187,70,226,142,193,86,180,214,210,207,212,76,181,193,187,130,173,119,206,124,207,230,85,226,185,194,158,148,219,234,70,210,71,218,217,194,227,37,202,230,223,180,199,92,227,74,234,128,211,117,190,182,230,209,84,129,226,148,209,223,176,209,219,141,187,128,193,224,133,222,193,92,187,84,230,29,205,209,227,171,227,220,90,116,180,209,210,65,206,200,73,218,204,171,202,158,210,182,193,212,135,150,198,101,227,111,230,177,225,227,223,141,225,127,207,224,227,153,190,126,134,197,117,233,104,178,209,175,195,81,214,219,60,225,217,171,200,31,225,185,166,221,150,145,196,215,73,202,220,215,140,209,87,219,71,171,157,213,232,51,225,138,204,231,215,168,196,215,104,200,222,232,110,199,59,213,201,101,219,103,171,215,199,220,204,159,186,186,147,226,181,163,231,82,222,84,204,113,218,145,183,204,208,226,72,212,156,177,171,220,214,173,222,122,214,168,176,187,81,210,132,206,108,234,220,160,208,47,194,150,122,206,184,210,207,199,95,230,154,212,205,167,168,222,191,100,203,167,215,161,222,84,210,78,237,162,193,108,160,209,59,201,148,167,205,185,212,149,213,56,190,108,217,177,132,190,192,132,201,219,222,82,210,165,184,219,169,181,147,216,103,220,101,184,224,195,230,102,196,196,192,196,222,173,212,218,212,209,134,212,76,209,117,220,52,206,147,164,229,106,196,217,200,203,117,219,230,114,184,57,221,91,226,150,182,202,103,180,216,172,129,193,214,217,153,184,85,228,40,216,83,184,200,226,100,163,176,208,189,219,222,229,122,209,157,176,195,218,100,161,220,221,121,191,66,195,206,118,215,216,123,221,188,50,208,78,230,200,207,95,195,85,194,225,75,201,228,50,213,200,71,176,92,213,84,189,203,127,184,201,166,199,106,222,123,185,114,217,211,218,67,214,75,214,145,224,213,200,95,187,82,195,215,113,153,166,202,97,198,215,49,188,216,215,136,191,131,190,218,217,224,175,218,198,97,171,221,229,214,229,181,167,225,120,179,227,222,220,69,210,69,160,233,188,197,190,215,193,147,169,154,207,197,137,148,136,171,227,93,149,175,228,98,207,103,225,229,112,195,127,197,101,211,217,229,116,223,229,212,223,119,202,89,207,227,78,211,205,127,224,194,214,77,152,145,157,223,231,151,145,212,140,220,221,197,213,110,226,191,175,82,222,145,150,217,206,207,137,226,94,221,68,60,227,200,218,193,208,204,111,200,89,188,135,152,167,205,223,221,169,164,212,95,170,93,116,147,115,166,185,219,183,223,230,200,231,232,220,89,183,122,193,188,216,143,154,161,227,228,116,221,216,219,211,230,163,205,188,111,161,218,224,231,117,170,183,175,223,213,176,200,110,128,225,212,50,206,208,63,191,173,170,195,90,209,166,144,195,176,224,222,68,228,132,207,201,135,167,209,166,123,225,213,190,154,195,222,230,212,88,158,208,110,212,127,226,121,213,70,198,146,221,95,193,164,221,157,232,182,219,52,231,203,49,213,224,76,151,223,88,202,113,190,201,144,158,180,201,182,215,164,30,207,70,195,104,202,194,116,217,80,229,162,159,216,212,113,214,149,198,134,224,75,184,166,224,91,158,221,78,170,223,228,77,221,221,218,201,222,78,178,200,113,204,190,185,202,198,141,205,124,187,227,223,164,114,190,117,223,46,198,136,165,181,64,231,166,182,214,63,220,62,210,57,220,192,189,215,224,182,191,208,71,228,151,214,97,208,229,132,199,197,172,142,209,200,215,232,226,184,155,177,208,83,225,67,159,134,11
7,227,114,172,217,222,148,187,194,82,198,126,167,221,60,126,122,164,184,111,129,194,156,176,211,157,231,83,226,228,130,209,111,187,224,196,213,79,130,192,83,188,221,192,154,179,228,108,201,120,222,141,168,172,172,142,225,193,218,116,223,137,175,77,214,211,127,227,111,216,72,199,97,226,190,209,203,179,126,197,124,116,226,189,202,37,214,53,207,202,196,161,207,168,159,198,65,186,164,207,134,184,215,94,207,220,68,207,102,225,151,220,224,94,208,230,192,201,189,151,157,176,199,219,106,176,155,215,93,186,176,64,188,98,222,108,214,187,137,208,215,224,95,226,220,215,207,219,220,94,222,45,170,169,200,213,88,177,197,131,177,126,229,172,88,213,220,117,231,212,99,156,200,177,188,216,222,127,220,163,209,97,134,207,178,232,99,159,220,168,192,210,113,225,217,223,233,183,186,82,216,221,79,187,195,180,111,162,66,182,199,92,204,79,230,232,232,205,99,143,211,112,207,232,77,229,196,221,214,222,236,181,230,215,80,180,84,214,122,224,186,223,190,214,211,84,164,130,206,51,216,36,233,222,204,138,208,178,134,213,203,149,211,93,174,228,67,210,50,206,150,227,221,215,66,222,122,226,145,226,196,191,222,205,218,230,162,155,210,216,137,208,118,231,218,206,100,188,108,167,209,81,165,173,220,158,215,80,217,141,214,233,225,212,155,185,167,168,152,186,212,79,230,230,192,208,58,203,102,196,88,189,206,191,201,134,166,104,227,112,211,193,140,174,159,222,82,191,225,67,212,114,169,97,188,146,225,135,214,58,213,103,238,228,205,211,218,216,138,215,139,215,100,217,201,80,205,164,224,169,182,105,181,222,228,194,149,213,96,209,218,101,202,211,139,228,157,192,197,131,176,198,88,221,194,131,152,170,220,62,224,58,198,201,197,181,230,84,219,88,230,206,135,210,108,213,82,203,207,139,217,95,213,81,207,189,160,230,208,134,226,217,83,166,190,117,209,226,226,109,221,147,219,206,214,65,216,193,225,222,223,119,200,217,191,147,223,215,206,93,223,111,218,213,141,158,229,118,192,102,218,227,166,191,200,198,71,141,196,163,185,169,123,143,187,170,95,182,220,119,190,99,225,169,184,132,223,157,205,66,222,156,164,204,158,201,70,222,233,116,179,53,219,132,196,181,224,196,10,196,142,217,164,216,95,122,206,62,209,141,211,94,151,200,202,169,173,174,204,173,209,152,206,105,194,99,161,197,204,73,200,116,213,90,175,171,154,196,216,221,210,57,222,57,212,200,229,196,220,165,166,225,228,203,207,201,232,207,141,170,132,214,195,175,153,220,88,197,191,180,161,198,178,191,222,200,213,140,174,229,50,185,212,128,219,47,200,100,177,188,190,105,191,184,71,190,206,169,187,218,142,218,225,208,134,121,232,177,214,121,183,171,202,76,220,68,192,206,53,219,200,63,175,198,160,224,196,97,198,231,224,86,224,74,137,189,207,91,229,130,160,159,213,90,82,175,172,194,165,158,38,157,216,187,204,90,181,219,199,143,155,129,191,219,184,228,199,186,155,210,208,91,203,192,215,98,177,224,71,225,133,199,234,226,109,201,208,191,209,228,207,205,225,193,215,113,221,141,131,194,85,162,217,101,195,109,227,223,213,195,228,101,168,111,221,197,48,202,171,163,191,209,111,229,225,211,223,207,202,57,232,226,202,227,167,172,228,196,78,222,198,160,212,108,202,136,181,207,171,223,76,211,73,223,174,144,178,184,197,210,77,210,183,177,205,220,204,59,214,109,203,237,203,173,198,115,217,179,176,199,180,152,159,161,85,183,97,227,135,189,154,142,156,217,90,232,188,200,118,222,87,179,170,216,195,126,184,91,188,148,166,199,206,96,182,222,111,202,154,199,192,122,192,220,223,202,215,47,158,168,204,134,153,102,190,165,189,163,223,200,77,180,98,207,124,231,147,230,74,139,170,222,237,167,159,215,232,40,207,138,215,215,219,104,206,142,216,80,205,97,175,120,185,197,224,184,209,81,184,120,174,206,142,231,226,73,217,26,222,94
,207,130,151,156,183,212,228,194,151,214,88,227,218,212,174,220,167,149,192,192,213,56,151,220,219,118,181,210,204,64,203,114,217,119,204,210,227,58,231,216,176,227,60,220,205,171,159,226,94,153,210,83,158,172,127,223,185,166,206,224,212,61,207,197,156,178,202,28,228,180,198,148,154,197,211,142,164,212,66,195,139,229,234,229,73,207,203,224,135,195,142,206,202,181,90,188,200,128,201,178,213,225,224,115,198,97,197,208,209,221,70,191,133,118,194,196,85,131,188,39,194,216,198,85,229,133,197,109,232,124,209,171,156,203,199,219,188,189,182,196,66,234,184,221,215,217,101,218,108,199,118,226,137,222,106,215,198,133,216,94,205,162,226,121,224,104,217,111,228,119,196,194,134,211,207,207,202,199,168,129,225,75,177,216,212,85,174,227,84,148,199,160,212,224,195,219,141,186,191,206,215,113,195,211,56,186,209,219,148,212,229,157,203,171,227,72,193,199,191,171,215,223,220,117,198,209,101,213,89,205,201,209,144,178,124,218,218,115,196,167,229,54,209,194,111,206,74,202,222,213,225,184,227,69,207,54,212,214,141,212,176,193,105,211,128,187,213,190,215,89,207,96,220,225,64,200,104,215,78,226,182,191,144,215,126,219,83,179,202,132,193,187,224,191,178,102,229,217,188,165,198,45,232,205,91,173,232,196,163,137,177,177,166,211,77,211,103,216,139,146,208,96,211,188,219,202,195,164,229,132,156,161,155,227,207,217,184,124,153,225,160,189,112,197,203,218,177,168,219,130,197,202,223,217,191,223,109,191,92,217,136,216,20,219,89,181,128,215,207,233,212,205,137,166,186,116,161,140,232,215,168,135,187,54,220,120,171,199,220,198,203,96,212,117]
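
In a ColBERT/PLAID index, doclens.0.json holds the per-passage embedding counts for chunk 0, so the list should have num_passages entries and sum to num_embeddings from 0.metadata.json, with its mean matching avg_doclen in metadata.json. A small consistency-check sketch (file paths are assumed to be relative to the downloaded index folder):

# Sketch of a consistency check between doclens.0.json and 0.metadata.json.
import json

with open("doclens.0.json") as f:
    doclens = json.load(f)      # list of per-passage embedding counts
with open("0.metadata.json") as f:
    meta = json.load(f)

assert len(doclens) == meta["num_passages"]      # 3148 after this commit
assert sum(doclens) == meta["num_embeddings"]    # 547258 after this commit
print(sum(doclens) / len(doclens))               # about 173.84, the avg_doclen in metadata.json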
ivf.pid.pt CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:27ca6198055302a7ef73cfcbe3a98b07ab2e4d7d3de3a94d3a8bd58897fdcce7
- size 1388248
+ oid sha256:eeddafcbee4bf3aa00230f753fc9dfdc985ca0667b4f3813b302938aec68d447
+ size 1389080
metadata.json CHANGED
@@ -37,7 +37,7 @@
  "checkpoint":"colbert-ir/colbertv2.0",
  "triples":"/future/u/okhattab/root/unit/experiments/2021.10/downstream.distillation.round2.2_score/round2.nway6.cosine.ib/examples.64.json",
  "collection":[
- "list with 3150 elements starting with...",
  [
  "Image restoration poses a garners substantial interest due to the exponential surge in demands for recovering high-quality images from diverse mobile camera devices, adverse lighting conditions, suboptimal shooting environments, and frequent image compression for efficient transmission purposes. Yet this problem gathers significant challenges as people are blind to the type of restoration the images suffer, which, is usually the case in real-day scenarios and is most urgent to solve for this field. Current research, however, heavily relies on prior knowledge of the restoration type, either explicitly through rules or implicitly through the availability of degraded-clean image pairs to define the restoration process, and consumes considerable effort to collect image pairs of vast degradation types. This paper introduces DreamClean, a training-free method that needs no degradation prior knowledge but yields high-fidelity and generality towards various types of image degradation. DreamClean embeds the degraded image back to the latent of pre-trained diffusion models and re-sample it through a carefully designed diffusion process that mimics those generating clean images. Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors.",
  "Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors. DreamClean relies on elegant theoretical supports to assure its convergence to clean image when VPS has appropriate parameters, and also enjoys superior experimental performance over various challenging tasks that could be overwhelming for previous methods when degradation prior is unavailable.",
@@ -50,7 +50,7 @@
  "root":".ragatouille/",
  "experiment":"colbert",
  "index_root":null,
- "name":"2024-04/04/04.50.55",
  "rank":0,
  "nranks":1,
  "amp":true,
@@ -59,8 +59,8 @@
  },
  "num_chunks":1,
  "num_partitions":8192,
- "num_embeddings":547570,
63
- "avg_doclen":173.8317460317,
  "RAGatouille":{
  "index_config":{
  "index_type":"PLAID",
 
pid_docid_map.json CHANGED
@@ -2060,1093 +2060,1091 @@
  "2058":"18018",
  "2059":"18017",
  "2060":"18381",
- "2061":"18016",
- "2062":"18016",
- "2063":"18015",
- "2064":"18014",
- "2065":"18013",
- "2066":"18379",
- "2067":"18375",
- "2068":"18012",
- "2069":"18011",
- "2070":"18011",
- "2071":"18348",
- "2072":"18348",
- "2073":"18006",
- "2074":"18006",
- "2075":"18003",
- "2076":"18312",
- "2077":"18308",
- "2078":"18002",
  "2079":"18305",
- "2080":"18305",
- "2081":"18305",
- "2082":"18298",
- "2083":"18298",
- "2084":"18297",
- "2085":"18297",
- "2086":"18000",
- "2087":"18000",
- "2088":"17999",
- "2089":"17996",
- "2090":"18292",
- "2091":"18292",
- "2092":"18285",
- "2093":"18285",
- "2094":"18284",
- "2095":"17995",
- "2096":"17994",
- "2097":"17994",
- "2098":"18277",
- "2099":"18277",
- "2100":"17993",
- "2101":"17991",
- "2102":"17991",
- "2103":"18269",
- "2104":"18269",
- "2105":"17990",
- "2106":"17989",
- "2107":"17988",
- "2108":"18256",
- "2109":"18254",
- "2110":"18254",
- "2111":"17984",
- "2112":"17984",
- "2113":"18242",
- "2114":"18242",
- "2115":"17983",
- "2116":"17981",
- "2117":"17979",
- "2118":"17978",
- "2119":"17975",
- "2120":"17974",
- "2121":"17973",
- "2122":"17972",
- "2123":"18222",
- "2124":"17970",
- "2125":"17969",
- "2126":"17969",
- "2127":"17968",
- "2128":"17968",
- "2129":"18199",
- "2130":"18197",
- "2131":"17740",
- "2132":"17740",
- "2133":"17966",
- "2134":"17966",
- "2135":"17965",
- "2136":"17964",
- "2137":"17964",
- "2138":"17961",
- "2139":"17962",
- "2140":"18186",
- "2141":"18186",
- "2142":"18179",
- "2143":"18179",
- "2144":"17960",
- "2145":"18177",
- "2146":"18165",
- "2147":"18163",
  "2148":"17959",
- "2149":"17959",
- "2150":"17959",
- "2151":"18152",
- "2152":"17958",
- "2153":"18143",
- "2154":"17957",
- "2155":"17956",
- "2156":"17955",
- "2157":"17955",
- "2158":"17954",
- "2159":"18131",
- "2160":"17952",
- "2161":"17951",
- "2162":"17951",
- "2163":"17950",
- "2164":"17950",
- "2165":"18124",
- "2166":"18124",
- "2167":"17949",
- "2168":"17945",
- "2169":"17945",
- "2170":"17946",
- "2171":"17946",
- "2172":"17944",
- "2173":"17944",
- "2174":"17941",
- "2175":"17941",
- "2176":"17942",
- "2177":"17942",
- "2178":"17940",
- "2179":"17940",
- "2180":"17939",
- "2181":"18103",
- "2182":"18103",
- "2183":"17938",
- "2184":"17937",
- "2185":"17937",
- "2186":"17935",
- "2187":"17935",
- "2188":"18096",
- "2189":"18096",
- "2190":"17933",
- "2191":"17932",
- "2192":"17930",
- "2193":"18084",
- "2194":"18079",
- "2195":"18079",
- "2196":"17929",
- "2197":"17929",
- "2198":"17928",
- "2199":"17927",
- "2200":"17925",
- "2201":"17924",
- "2202":"17922",
- "2203":"17920",
- "2204":"17920",
- "2205":"18051",
- "2206":"17917",
- "2207":"18048",
- "2208":"18048",
- "2209":"18047",
- "2210":"17915",
- "2211":"17915",
- "2212":"18044",
- "2213":"18044",
- "2214":"18043",
- "2215":"17914",
- "2216":"17913",
- "2217":"17913",
- "2218":"17912",
- "2219":"17911",
- "2220":"17910",
- "2221":"17739",
- "2222":"17739",
- "2223":"17909",
- "2224":"17909",
- "2225":"17908",
- "2226":"17906",
- "2227":"17906",
- "2228":"18009",
- "2229":"17905",
- "2230":"17905",
- "2231":"18008",
- "2232":"17901",
- "2233":"17997",
- "2234":"17986",
- "2235":"17986",
- "2236":"17985",
- "2237":"17980",
- "2238":"17980",
- "2239":"17900",
- "2240":"17977",
- "2241":"17977",
- "2242":"17976",
- "2243":"17899",
- "2244":"17898",
- "2245":"17898",
- "2246":"17971",
- "2247":"17971",
- "2248":"17963",
- "2249":"17897",
- "2250":"17896",
- "2251":"17896",
- "2252":"17895",
- "2253":"17895",
- "2254":"17893",
- "2255":"17893",
- "2256":"17892",
- "2257":"17948",
- "2258":"17948",
- "2259":"17943",
- "2260":"17943",
- "2261":"17931",
- "2262":"17931",
- "2263":"17890",
- "2264":"17921",
- "2265":"17887",
- "2266":"17886",
- "2267":"17886",
- "2268":"17885",
- "2269":"17885",
- "2270":"17883",
- "2271":"17881",
- "2272":"17880",
- "2273":"17916",
- "2274":"17907",
- "2275":"17907",
- "2276":"17904",
- "2277":"17903",
- "2278":"17903",
- "2279":"17879",
- "2280":"17888",
- "2281":"17888",
- "2282":"17884",
- "2283":"17882",
- "2284":"17878",
- "2285":"17878",
- "2286":"17876",
- "2287":"17876",
- "2288":"17877",
- "2289":"17875",
- "2290":"17874",
- "2291":"17874",
- "2292":"17873",
- "2293":"17872",
- "2294":"17871",
- "2295":"17870",
- "2296":"17869",
- "2297":"17869",
- "2298":"17868",
- "2299":"17866",
- "2300":"17863",
- "2301":"17862",
- "2302":"17861",
- "2303":"17860",
- "2304":"17858",
- "2305":"17858",
- "2306":"17857",
- "2307":"17857",
- "2308":"17856",
- "2309":"17855",
- "2310":"17855",
- "2311":"17854",
- "2312":"17853",
- "2313":"17853",
- "2314":"17852",
- "2315":"17852",
- "2316":"17851",
- "2317":"17850",
- "2318":"17849",
- "2319":"17848",
  "2320":"17845",
- "2321":"17845",
- "2322":"17845",
- "2323":"17844",
- "2324":"17844",
- "2325":"17843",
- "2326":"17842",
- "2327":"17840",
- "2328":"17839",
- "2329":"17835",
- "2330":"17834",
- "2331":"17833",
- "2332":"17833",
- "2333":"17832",
- "2334":"17831",
- "2335":"17831",
- "2336":"17830",
- "2337":"17829",
- "2338":"17828",
- "2339":"17828",
- "2340":"17826",
- "2341":"17826",
- "2342":"17825",
- "2343":"17825",
- "2344":"17824",
- "2345":"17824",
- "2346":"17823",
- "2347":"17823",
- "2348":"17822",
- "2349":"17820",
- "2350":"17819",
- "2351":"17818",
- "2352":"17818",
- "2353":"17816",
- "2354":"17815",
- "2355":"17815",
- "2356":"17814",
- "2357":"17814",
- "2358":"17813",
- "2359":"17812",
- "2360":"17811",
- "2361":"17810",
  "2362":"17809",
- "2363":"17809",
- "2364":"17809",
- "2365":"17808",
- "2366":"17807",
- "2367":"17806",
- "2368":"19725",
- "2369":"17804",
- "2370":"17804",
- "2371":"17803",
- "2372":"17801",
- "2373":"17801",
- "2374":"17800",
- "2375":"17800",
- "2376":"17799",
- "2377":"17799",
- "2378":"17798",
- "2379":"17795",
- "2380":"17792",
- "2381":"17791",
- "2382":"17789",
- "2383":"17788",
- "2384":"17787",
- "2385":"17787",
- "2386":"17786",
- "2387":"17786",
- "2388":"17785",
- "2389":"17785",
- "2390":"17784",
- "2391":"17784",
- "2392":"17783",
- "2393":"17782",
- "2394":"17781",
- "2395":"17781",
- "2396":"17780",
- "2397":"17780",
- "2398":"17778",
- "2399":"17778",
- "2400":"17777",
- "2401":"17776",
- "2402":"17775",
- "2403":"17774",
- "2404":"17773",
- "2405":"17772",
- "2406":"17770",
- "2407":"17770",
- "2408":"17769",
- "2409":"17769",
- "2410":"17790",
- "2411":"17768",
- "2412":"17767",
- "2413":"17766",
- "2414":"17765",
- "2415":"17793",
- "2416":"17793",
- "2417":"17764",
- "2418":"17794",
- "2419":"17762",
- "2420":"17762",
- "2421":"17761",
- "2422":"17796",
- "2423":"17797",
- "2424":"17797",
- "2425":"17760",
- "2426":"17760",
- "2427":"17759",
- "2428":"17758",
- "2429":"17757",
- "2430":"17757",
- "2431":"17755",
- "2432":"17754",
- "2433":"17753",
- "2434":"17752",
- "2435":"17752",
- "2436":"17751",
- "2437":"17749",
- "2438":"17750",
- "2439":"17750",
- "2440":"17748",
- "2441":"17738",
- "2442":"17737",
- "2443":"17737",
- "2444":"17736",
- "2445":"17735",
- "2446":"17735",
- "2447":"17734",
- "2448":"17733",
- "2449":"17733",
- "2450":"17732",
- "2451":"17732",
- "2452":"17731",
- "2453":"17731",
- "2454":"17730",
  "2455":"17729",
- "2456":"17729",
- "2457":"17729",
- "2458":"17728",
- "2459":"17727",
- "2460":"17727",
- "2461":"17726",
- "2462":"17725",
- "2463":"17724",
- "2464":"17722",
- "2465":"17721",
- "2466":"17721",
- "2467":"17720",
- "2468":"17719",
- "2469":"17718",
- "2470":"17717",
- "2471":"17716",
- "2472":"17715",
- "2473":"17714",
- "2474":"17713",
- "2475":"17713",
- "2476":"17712",
- "2477":"17712",
- "2478":"17711",
- "2479":"17711",
- "2480":"17710",
- "2481":"17710",
- "2482":"17709",
- "2483":"17708",
- "2484":"17707",
- "2485":"17707",
- "2486":"17706",
- "2487":"17706",
- "2488":"17705",
- "2489":"17704",
- "2490":"17704",
- "2491":"17703",
- "2492":"17702",
- "2493":"17702",
- "2494":"17701",
- "2495":"17699",
- "2496":"17698",
- "2497":"17698",
- "2498":"17697",
- "2499":"17697",
- "2500":"17696",
- "2501":"17695",
- "2502":"17694",
- "2503":"17694",
- "2504":"17692",
- "2505":"17692",
- "2506":"17691",
- "2507":"17690",
- "2508":"17689",
- "2509":"17689",
- "2510":"17688",
- "2511":"17688",
- "2512":"17687",
- "2513":"17686",
  "2514":"17685",
- "2515":"17685",
- "2516":"17685",
- "2517":"17684",
- "2518":"17683",
- "2519":"17682",
- "2520":"17680",
- "2521":"17680",
- "2522":"17679",
- "2523":"17678",
- "2524":"17677",
- "2525":"17677",
- "2526":"17676",
- "2527":"17674",
- "2528":"17673",
- "2529":"17827",
- "2530":"17672",
- "2531":"17671",
- "2532":"17670",
- "2533":"17669",
- "2534":"17668",
- "2535":"17667",
- "2536":"17666",
- "2537":"17666",
- "2538":"17665",
- "2539":"17664",
- "2540":"17663",
- "2541":"17663",
- "2542":"17661",
- "2543":"17659",
- "2544":"17659",
- "2545":"17658",
- "2546":"17658",
- "2547":"17656",
- "2548":"17655",
- "2549":"17654",
- "2550":"17654",
- "2551":"17653",
- "2552":"17652",
- "2553":"17651",
- "2554":"17650",
- "2555":"17649",
- "2556":"17648",
- "2557":"17647",
- "2558":"17646",
- "2559":"17645",
- "2560":"17637",
- "2561":"17637",
- "2562":"17644",
- "2563":"17644",
- "2564":"17625",
- "2565":"17624",
- "2566":"17624",
- "2567":"17623",
- "2568":"17622",
- "2569":"17622",
- "2570":"17621",
- "2571":"17621",
- "2572":"17620",
- "2573":"17619",
- "2574":"17618",
- "2575":"17617",
- "2576":"17616",
- "2577":"17616",
- "2578":"17615",
- "2579":"17615",
- "2580":"17613",
- "2581":"17612",
- "2582":"17612",
- "2583":"17611",
- "2584":"17609",
- "2585":"17609",
  "2586":"17608",
- "2587":"17608",
- "2588":"17608",
- "2589":"17607",
- "2590":"17606",
- "2591":"17606",
- "2592":"17605",
- "2593":"17604",
2596
- "2594":"17603",
2597
- "2595":"17603",
2598
- "2596":"17602",
2599
- "2597":"17601",
2600
- "2598":"17600",
2601
- "2599":"17598",
2602
- "2600":"17597",
2603
- "2601":"17596",
2604
- "2602":"17595",
2605
- "2603":"17594",
2606
- "2604":"17594",
2607
- "2605":"17593",
2608
- "2606":"17592",
2609
- "2607":"17591",
2610
- "2608":"17590",
2611
- "2609":"17590",
2612
- "2610":"17589",
2613
- "2611":"17589",
2614
- "2612":"17588",
2615
- "2613":"17587",
2616
- "2614":"17585",
2617
- "2615":"17586",
2618
- "2616":"17586",
2619
- "2617":"17584",
2620
- "2618":"17584",
2621
- "2619":"17583",
2622
- "2620":"17582",
2623
- "2621":"17581",
2624
- "2622":"17580",
2625
- "2623":"17579",
2626
- "2624":"17578",
2627
- "2625":"17577",
2628
- "2626":"17577",
2629
- "2627":"17576",
2630
- "2628":"17576",
2631
- "2629":"17574",
2632
- "2630":"17573",
2633
- "2631":"17572",
2634
- "2632":"17571",
2635
- "2633":"17571",
2636
- "2634":"17570",
2637
- "2635":"17570",
2638
- "2636":"17569",
2639
- "2637":"17568",
2640
- "2638":"17565",
2641
- "2639":"17564",
2642
- "2640":"17563",
2643
- "2641":"17563",
2644
- "2642":"17561",
2645
- "2643":"17560",
2646
- "2644":"18364",
2647
- "2645":"17559",
2648
- "2646":"17558",
2649
- "2647":"17556",
2650
  "2648":"17555",
2651
  "2649":"17555",
2652
  "2650":"17555",
2653
- "2651":"17555",
2654
- "2652":"17555",
2655
- "2653":"17554",
2656
- "2654":"17554",
2657
- "2655":"17552",
2658
- "2656":"17551",
2659
- "2657":"17551",
2660
- "2658":"17548",
2661
- "2659":"17547",
2662
- "2660":"17547",
2663
- "2661":"17546",
2664
- "2662":"17549",
2665
- "2663":"17544",
2666
- "2664":"17544",
2667
- "2665":"17543",
2668
- "2666":"17543",
2669
- "2667":"17542",
2670
- "2668":"17541",
2671
- "2669":"17540",
2672
- "2670":"17539",
2673
  "2671":"17537",
2674
- "2672":"17537",
2675
- "2673":"17537",
2676
- "2674":"17536",
2677
- "2675":"17535",
2678
- "2676":"17534",
2679
- "2677":"17532",
2680
- "2678":"17531",
2681
- "2679":"17531",
2682
- "2680":"17530",
2683
- "2681":"17529",
2684
- "2682":"17529",
2685
- "2683":"17528",
2686
- "2684":"17527",
2687
- "2685":"17526",
2688
- "2686":"17525",
2689
- "2687":"17525",
2690
- "2688":"17523",
2691
- "2689":"17522",
2692
- "2690":"17520",
2693
- "2691":"17521",
2694
- "2692":"17519",
2695
- "2693":"17519",
2696
- "2694":"17518",
2697
- "2695":"17518",
2698
- "2696":"17517",
2699
  "2697":"17516",
2700
- "2698":"17516",
2701
- "2699":"17516",
2702
- "2700":"17515",
2703
- "2701":"17515",
2704
- "2702":"17514",
2705
- "2703":"17513",
2706
- "2704":"17512",
2707
- "2705":"17511",
2708
- "2706":"17511",
2709
- "2707":"17510",
2710
- "2708":"17510",
2711
- "2709":"17509",
2712
- "2710":"17509",
2713
- "2711":"17507",
2714
- "2712":"17506",
2715
- "2713":"17505",
2716
- "2714":"17505",
2717
- "2715":"17504",
2718
- "2716":"17503",
2719
- "2717":"17502",
2720
- "2718":"17501",
2721
- "2719":"17500",
2722
- "2720":"17499",
2723
- "2721":"17498",
2724
- "2722":"17497",
2725
- "2723":"17497",
2726
- "2724":"17495",
2727
- "2725":"17495",
2728
- "2726":"17494",
2729
- "2727":"17493",
2730
- "2728":"17492",
2731
- "2729":"17492",
2732
- "2730":"17491",
2733
- "2731":"17491",
2734
- "2732":"17490",
2735
- "2733":"17490",
2736
- "2734":"17553",
2737
- "2735":"17553",
2738
- "2736":"17489",
2739
- "2737":"17488",
2740
- "2738":"17486",
2741
- "2739":"17485",
2742
- "2740":"17487",
2743
- "2741":"17484",
2744
- "2742":"17483",
2745
- "2743":"17483",
2746
- "2744":"17557",
2747
- "2745":"17557",
2748
- "2746":"17482",
2749
- "2747":"17481",
2750
- "2748":"17481",
2751
- "2749":"17480",
2752
- "2750":"17479",
2753
- "2751":"17479",
2754
- "2752":"17478",
2755
- "2753":"17478",
2756
- "2754":"17475",
2757
- "2755":"17475",
2758
- "2756":"17474",
2759
- "2757":"17473",
2760
- "2758":"17473",
2761
- "2759":"17472",
2762
- "2760":"17471",
2763
- "2761":"17470",
2764
- "2762":"17469",
2765
- "2763":"17468",
2766
- "2764":"17467",
2767
- "2765":"17465",
2768
- "2766":"17465",
2769
- "2767":"17464",
2770
- "2768":"17463",
2771
- "2769":"17463",
2772
- "2770":"17462",
2773
- "2771":"17461",
2774
  "2772":"17460",
2775
- "2773":"17460",
2776
- "2774":"17460",
2777
- "2775":"17459",
2778
- "2776":"17458",
2779
- "2777":"17458",
2780
- "2778":"17457",
2781
- "2779":"17456",
2782
- "2780":"17455",
2783
- "2781":"17455",
2784
- "2782":"17453",
2785
- "2783":"17452",
2786
- "2784":"17451",
2787
- "2785":"17451",
2788
- "2786":"17450",
2789
- "2787":"17450",
2790
- "2788":"18550",
2791
- "2789":"18550",
2792
- "2790":"17449",
2793
- "2791":"17448",
2794
- "2792":"17447",
2795
- "2793":"17447",
2796
- "2794":"17446",
2797
- "2795":"17444",
2798
- "2796":"17445",
2799
- "2797":"17566",
2800
- "2798":"17566",
2801
- "2799":"17443",
2802
- "2800":"17567",
2803
- "2801":"17442",
2804
- "2802":"17441",
2805
- "2803":"17440",
2806
- "2804":"17440",
2807
- "2805":"17439",
2808
- "2806":"17437",
2809
- "2807":"17437",
2810
- "2808":"17436",
2811
- "2809":"17435",
2812
- "2810":"17435",
2813
- "2811":"17433",
2814
- "2812":"17431",
2815
- "2813":"17431",
2816
- "2814":"17430",
2817
- "2815":"17429",
2818
- "2816":"17428",
2819
- "2817":"17428",
2820
- "2818":"17425",
2821
- "2819":"17426",
2822
- "2820":"17424",
2823
- "2821":"17423",
2824
- "2822":"17422",
2825
- "2823":"17422",
2826
- "2824":"17421",
2827
- "2825":"17420",
2828
- "2826":"17419",
2829
- "2827":"17418",
2830
- "2828":"17418",
2831
- "2829":"17417",
2832
- "2830":"17415",
2833
- "2831":"17415",
2834
- "2832":"17413",
2835
- "2833":"17412",
2836
- "2834":"17412",
2837
- "2835":"17411",
2838
- "2836":"17411",
2839
- "2837":"17410",
2840
- "2838":"17846",
2841
- "2839":"17409",
2842
- "2840":"17409",
2843
- "2841":"17407",
2844
- "2842":"17408",
2845
- "2843":"17406",
2846
- "2844":"17406",
2847
- "2845":"17405",
2848
- "2846":"17405",
2849
- "2847":"17402",
2850
- "2848":"17400",
2851
- "2849":"17399",
2852
- "2850":"17399",
2853
- "2851":"17398",
2854
- "2852":"17397",
2855
- "2853":"17397",
2856
- "2854":"17396",
2857
- "2855":"17394",
2858
- "2856":"17393",
2859
- "2857":"17392",
2860
- "2858":"17390",
2861
- "2859":"17390",
2862
- "2860":"17389",
2863
- "2861":"17389",
2864
- "2862":"17385",
2865
- "2863":"17384",
2866
- "2864":"17382",
2867
- "2865":"17383",
2868
- "2866":"17383",
2869
- "2867":"17380",
2870
- "2868":"17380",
2871
- "2869":"17378",
2872
- "2870":"17376",
2873
- "2871":"17373",
2874
- "2872":"17373",
2875
- "2873":"17372",
2876
- "2874":"17371",
2877
- "2875":"17371",
2878
- "2876":"17370",
2879
- "2877":"17369",
2880
- "2878":"17368",
2881
- "2879":"17368",
2882
- "2880":"17367",
2883
- "2881":"17367",
2884
- "2882":"17366",
2885
- "2883":"17366",
2886
- "2884":"18268",
2887
- "2885":"18268",
2888
- "2886":"17859",
2889
- "2887":"17867",
2890
- "2888":"17867",
2891
- "2889":"17889",
2892
- "2890":"17889",
2893
- "2891":"17894",
2894
- "2892":"17923",
2895
- "2893":"17936",
2896
- "2894":"17947",
2897
- "2895":"19331",
2898
- "2896":"19331",
2899
- "2897":"17982",
2900
- "2898":"17998",
2901
- "2899":"18001",
2902
- "2900":"18063",
2903
- "2901":"18792",
2904
- "2902":"18792",
2905
- "2903":"18109",
2906
- "2904":"18109",
2907
- "2905":"18111",
2908
- "2906":"18111",
2909
- "2907":"18149",
2910
- "2908":"18149",
2911
- "2909":"18159",
2912
- "2910":"18159",
2913
- "2911":"18166",
2914
- "2912":"18171",
2915
- "2913":"18171",
2916
- "2914":"18180",
2917
- "2915":"18180",
2918
- "2916":"18210",
2919
- "2917":"18213",
2920
- "2918":"18219",
2921
- "2919":"18219",
2922
- "2920":"18233",
2923
- "2921":"18233",
2924
- "2922":"18484",
2925
- "2923":"18484",
2926
- "2924":"18249",
2927
- "2925":"18249",
2928
- "2926":"18262",
2929
- "2927":"18280",
2930
- "2928":"18280",
2931
- "2929":"18332",
2932
- "2930":"18342",
2933
- "2931":"18362",
2934
- "2932":"18365",
2935
- "2933":"18376",
2936
- "2934":"18387",
2937
- "2935":"18387",
2938
- "2936":"18391",
2939
- "2937":"18391",
2940
- "2938":"18392",
2941
- "2939":"18395",
2942
- "2940":"18417",
2943
- "2941":"18417",
2944
- "2942":"18428",
2945
- "2943":"18438",
2946
- "2944":"18438",
2947
- "2945":"18456",
2948
- "2946":"18457",
2949
- "2947":"18463",
2950
- "2948":"18464",
2951
- "2949":"18467",
2952
- "2950":"18487",
2953
- "2951":"18489",
2954
- "2952":"18489",
2955
- "2953":"18501",
2956
- "2954":"18533",
2957
- "2955":"18536",
2958
- "2956":"18560",
2959
- "2957":"18560",
2960
- "2958":"18584",
2961
- "2959":"18613",
2962
- "2960":"18613",
2963
- "2961":"18620",
2964
- "2962":"18630",
2965
- "2963":"18697",
2966
- "2964":"18701",
2967
- "2965":"18701",
2968
- "2966":"18710",
2969
- "2967":"18722",
2970
- "2968":"18727",
2971
- "2969":"18774",
2972
- "2970":"18788",
2973
- "2971":"18788",
2974
- "2972":"18790",
2975
- "2973":"18839",
2976
- "2974":"18840",
2977
- "2975":"18845",
2978
- "2976":"18858",
2979
- "2977":"18868",
2980
- "2978":"18913",
2981
- "2979":"18913",
2982
- "2980":"18923",
2983
- "2981":"18960",
2984
- "2982":"18960",
2985
- "2983":"18961",
2986
- "2984":"18961",
2987
- "2985":"18966",
2988
- "2986":"18966",
2989
- "2987":"18974",
2990
- "2988":"18981",
2991
- "2989":"18995",
2992
- "2990":"18995",
2993
- "2991":"18997",
2994
- "2992":"19011",
2995
- "2993":"19011",
2996
- "2994":"19022",
2997
- "2995":"19030",
2998
- "2996":"19045",
2999
- "2997":"19045",
3000
- "2998":"19056",
3001
- "2999":"19070",
3002
- "3000":"19070",
3003
- "3001":"19096",
3004
- "3002":"19096",
3005
- "3003":"19131",
3006
- "3004":"19140",
3007
- "3005":"19195",
3008
- "3006":"19204",
3009
- "3007":"19222",
3010
- "3008":"19227",
3011
- "3009":"19227",
3012
- "3010":"19265",
3013
- "3011":"19265",
3014
- "3012":"19278",
3015
- "3013":"17401",
3016
- "3014":"17401",
3017
- "3015":"19307",
3018
- "3016":"19320",
3019
- "3017":"19328",
3020
- "3018":"19328",
3021
- "3019":"19337",
3022
- "3020":"19337",
3023
- "3021":"19351",
3024
- "3022":"19365",
3025
- "3023":"19368",
3026
- "3024":"19420",
3027
- "3025":"19420",
3028
- "3026":"19426",
3029
- "3027":"19426",
3030
- "3028":"19440",
3031
- "3029":"19451",
3032
- "3030":"19451",
3033
- "3031":"19487",
3034
- "3032":"19487",
3035
- "3033":"19504",
3036
- "3034":"19504",
3037
- "3035":"19507",
3038
- "3036":"19516",
3039
- "3037":"17374",
3040
- "3038":"17387",
3041
- "3039":"17391",
3042
- "3040":"17391",
3043
- "3041":"17403",
3044
- "3042":"17403",
3045
- "3043":"17414",
3046
- "3044":"17432",
3047
- "3045":"17432",
3048
- "3046":"17476",
3049
- "3047":"17476",
3050
- "3048":"17477",
3051
- "3049":"17496",
3052
- "3050":"18819",
3053
- "3051":"18819",
3054
- "3052":"19188",
3055
- "3053":"19439",
3056
- "3054":"17538",
3057
- "3055":"17538",
3058
- "3056":"18599",
3059
- "3057":"18599",
3060
- "3058":"19437",
3061
- "3059":"17747",
3062
- "3060":"17747",
3063
- "3061":"19435",
3064
- "3062":"19434",
3065
- "3063":"19434",
3066
- "3064":"19433",
3067
- "3065":"19432",
3068
- "3066":"19432",
3069
- "3067":"19431",
3070
- "3068":"19430",
3071
- "3069":"19429",
3072
- "3070":"19429",
3073
- "3071":"19428",
3074
- "3072":"19428",
3075
- "3073":"19137",
3076
- "3074":"19137",
3077
- "3075":"18168",
3078
- "3076":"19427",
3079
- "3077":"19427",
3080
- "3078":"17746",
3081
- "3079":"19425",
3082
- "3080":"19424",
3083
- "3081":"19423",
3084
- "3082":"19421",
3085
- "3083":"19421",
3086
- "3084":"19793",
3087
- "3085":"19793",
3088
- "3086":"19418",
3089
- "3087":"19417",
3090
- "3088":"17533",
3091
- "3089":"19275",
3092
- "3090":"19350",
3093
- "3091":"19415",
3094
- "3092":"18873",
3095
- "3093":"18873",
3096
- "3094":"19414",
3097
- "3095":"19413",
3098
- "3096":"19412",
3099
- "3097":"19033",
3100
- "3098":"19033",
3101
- "3099":"19411",
3102
- "3100":"19410",
3103
- "3101":"18875",
3104
- "3102":"19408",
3105
- "3103":"19407",
3106
- "3104":"19406",
3107
- "3105":"19406",
3108
- "3106":"19405",
3109
- "3107":"17643",
3110
- "3108":"19496",
3111
- "3109":"19404",
3112
- "3110":"19403",
3113
- "3111":"17836",
3114
- "3112":"17836",
3115
- "3113":"18077",
3116
- "3114":"18077",
3117
- "3115":"18556",
3118
- "3116":"18556",
3119
- "3117":"19526",
3120
- "3118":"19526",
3121
- "3119":"17802",
3122
- "3120":"17802",
3123
- "3121":"19616",
3124
- "3122":"19616",
3125
- "3123":"17365",
3126
- "3124":"19082",
3127
- "3125":"19443",
3128
  "3126":"18058",
3129
- "3127":"18058",
3130
- "3128":"18058",
3131
- "3129":"19778",
3132
- "3130":"17427",
3133
- "3131":"17427",
3134
- "3132":"19250",
3135
- "3133":"17629",
3136
- "3134":"18691",
3137
- "3135":"18144",
3138
- "3136":"18144",
3139
- "3137":"18327",
3140
- "3138":"17700",
3141
- "3139":"17700",
3142
- "3140":"19449",
3143
- "3141":"19559",
3144
- "3142":"18643",
3145
- "3143":"19546",
3146
- "3144":"19470",
3147
- "3145":"18917",
3148
- "3146":"19773",
3149
- "3147":"19773",
3150
- "3148":"18921",
3151
- "3149":"18921"
3152
  }
 
2060
  "2058":"18018",
2061
  "2059":"18017",
2062
  "2060":"18381",
2063
+ "2061":"18015",
2064
+ "2062":"18014",
2065
+ "2063":"18013",
2066
+ "2064":"18379",
2067
+ "2065":"18375",
2068
+ "2066":"18012",
2069
+ "2067":"18011",
2070
+ "2068":"18011",
2071
+ "2069":"18348",
2072
+ "2070":"18348",
2073
+ "2071":"18006",
2074
+ "2072":"18006",
2075
+ "2073":"18003",
2076
+ "2074":"18312",
2077
+ "2075":"18308",
2078
+ "2076":"18002",
2079
+ "2077":"18305",
2080
+ "2078":"18305",
2081
  "2079":"18305",
2082
+ "2080":"18298",
2083
+ "2081":"18298",
2084
+ "2082":"18297",
2085
+ "2083":"18297",
2086
+ "2084":"18000",
2087
+ "2085":"18000",
2088
+ "2086":"17999",
2089
+ "2087":"17996",
2090
+ "2088":"18292",
2091
+ "2089":"18292",
2092
+ "2090":"18285",
2093
+ "2091":"18285",
2094
+ "2092":"18284",
2095
+ "2093":"17995",
2096
+ "2094":"17994",
2097
+ "2095":"17994",
2098
+ "2096":"18277",
2099
+ "2097":"18277",
2100
+ "2098":"17993",
2101
+ "2099":"17991",
2102
+ "2100":"17991",
2103
+ "2101":"18269",
2104
+ "2102":"18269",
2105
+ "2103":"17990",
2106
+ "2104":"17989",
2107
+ "2105":"17988",
2108
+ "2106":"18256",
2109
+ "2107":"18254",
2110
+ "2108":"18254",
2111
+ "2109":"17984",
2112
+ "2110":"17984",
2113
+ "2111":"18242",
2114
+ "2112":"18242",
2115
+ "2113":"17983",
2116
+ "2114":"17981",
2117
+ "2115":"17979",
2118
+ "2116":"17978",
2119
+ "2117":"17975",
2120
+ "2118":"17974",
2121
+ "2119":"17973",
2122
+ "2120":"17972",
2123
+ "2121":"18222",
2124
+ "2122":"17970",
2125
+ "2123":"17969",
2126
+ "2124":"17969",
2127
+ "2125":"17968",
2128
+ "2126":"17968",
2129
+ "2127":"18199",
2130
+ "2128":"18197",
2131
+ "2129":"17740",
2132
+ "2130":"17740",
2133
+ "2131":"17966",
2134
+ "2132":"17966",
2135
+ "2133":"17965",
2136
+ "2134":"17964",
2137
+ "2135":"17964",
2138
+ "2136":"17961",
2139
+ "2137":"17962",
2140
+ "2138":"18186",
2141
+ "2139":"18186",
2142
+ "2140":"18179",
2143
+ "2141":"18179",
2144
+ "2142":"17960",
2145
+ "2143":"18177",
2146
+ "2144":"18165",
2147
+ "2145":"18163",
2148
+ "2146":"17959",
2149
+ "2147":"17959",
2150
  "2148":"17959",
2151
+ "2149":"18152",
2152
+ "2150":"17958",
2153
+ "2151":"18143",
2154
+ "2152":"17957",
2155
+ "2153":"17956",
2156
+ "2154":"17955",
2157
+ "2155":"17955",
2158
+ "2156":"17954",
2159
+ "2157":"18131",
2160
+ "2158":"17952",
2161
+ "2159":"17951",
2162
+ "2160":"17951",
2163
+ "2161":"17950",
2164
+ "2162":"17950",
2165
+ "2163":"18124",
2166
+ "2164":"18124",
2167
+ "2165":"17949",
2168
+ "2166":"17945",
2169
+ "2167":"17945",
2170
+ "2168":"17946",
2171
+ "2169":"17946",
2172
+ "2170":"17944",
2173
+ "2171":"17944",
2174
+ "2172":"17941",
2175
+ "2173":"17941",
2176
+ "2174":"17942",
2177
+ "2175":"17942",
2178
+ "2176":"17940",
2179
+ "2177":"17940",
2180
+ "2178":"17939",
2181
+ "2179":"18103",
2182
+ "2180":"18103",
2183
+ "2181":"17938",
2184
+ "2182":"17937",
2185
+ "2183":"17937",
2186
+ "2184":"17935",
2187
+ "2185":"17935",
2188
+ "2186":"18096",
2189
+ "2187":"18096",
2190
+ "2188":"17933",
2191
+ "2189":"17932",
2192
+ "2190":"17930",
2193
+ "2191":"18084",
2194
+ "2192":"18079",
2195
+ "2193":"18079",
2196
+ "2194":"17929",
2197
+ "2195":"17929",
2198
+ "2196":"17928",
2199
+ "2197":"17927",
2200
+ "2198":"17925",
2201
+ "2199":"17924",
2202
+ "2200":"17922",
2203
+ "2201":"17920",
2204
+ "2202":"17920",
2205
+ "2203":"18051",
2206
+ "2204":"17917",
2207
+ "2205":"18048",
2208
+ "2206":"18048",
2209
+ "2207":"18047",
2210
+ "2208":"17915",
2211
+ "2209":"17915",
2212
+ "2210":"18044",
2213
+ "2211":"18044",
2214
+ "2212":"18043",
2215
+ "2213":"17914",
2216
+ "2214":"17913",
2217
+ "2215":"17913",
2218
+ "2216":"17912",
2219
+ "2217":"17911",
2220
+ "2218":"17910",
2221
+ "2219":"17739",
2222
+ "2220":"17739",
2223
+ "2221":"17909",
2224
+ "2222":"17909",
2225
+ "2223":"17908",
2226
+ "2224":"17906",
2227
+ "2225":"17906",
2228
+ "2226":"18009",
2229
+ "2227":"17905",
2230
+ "2228":"17905",
2231
+ "2229":"18008",
2232
+ "2230":"17901",
2233
+ "2231":"17997",
2234
+ "2232":"17986",
2235
+ "2233":"17986",
2236
+ "2234":"17985",
2237
+ "2235":"17980",
2238
+ "2236":"17980",
2239
+ "2237":"17900",
2240
+ "2238":"17977",
2241
+ "2239":"17977",
2242
+ "2240":"17976",
2243
+ "2241":"17899",
2244
+ "2242":"17898",
2245
+ "2243":"17898",
2246
+ "2244":"17971",
2247
+ "2245":"17971",
2248
+ "2246":"17963",
2249
+ "2247":"17897",
2250
+ "2248":"17896",
2251
+ "2249":"17896",
2252
+ "2250":"17895",
2253
+ "2251":"17895",
2254
+ "2252":"17893",
2255
+ "2253":"17893",
2256
+ "2254":"17892",
2257
+ "2255":"17948",
2258
+ "2256":"17948",
2259
+ "2257":"17943",
2260
+ "2258":"17943",
2261
+ "2259":"17931",
2262
+ "2260":"17931",
2263
+ "2261":"17890",
2264
+ "2262":"17921",
2265
+ "2263":"17887",
2266
+ "2264":"17886",
2267
+ "2265":"17886",
2268
+ "2266":"17885",
2269
+ "2267":"17885",
2270
+ "2268":"17883",
2271
+ "2269":"17881",
2272
+ "2270":"17880",
2273
+ "2271":"17916",
2274
+ "2272":"17907",
2275
+ "2273":"17907",
2276
+ "2274":"17904",
2277
+ "2275":"17903",
2278
+ "2276":"17903",
2279
+ "2277":"17879",
2280
+ "2278":"17888",
2281
+ "2279":"17888",
2282
+ "2280":"17884",
2283
+ "2281":"17882",
2284
+ "2282":"17878",
2285
+ "2283":"17878",
2286
+ "2284":"17876",
2287
+ "2285":"17876",
2288
+ "2286":"17877",
2289
+ "2287":"17875",
2290
+ "2288":"17874",
2291
+ "2289":"17874",
2292
+ "2290":"17873",
2293
+ "2291":"17872",
2294
+ "2292":"17871",
2295
+ "2293":"17870",
2296
+ "2294":"17869",
2297
+ "2295":"17869",
2298
+ "2296":"17868",
2299
+ "2297":"17866",
2300
+ "2298":"17863",
2301
+ "2299":"17862",
2302
+ "2300":"17861",
2303
+ "2301":"17860",
2304
+ "2302":"17858",
2305
+ "2303":"17858",
2306
+ "2304":"17857",
2307
+ "2305":"17857",
2308
+ "2306":"17856",
2309
+ "2307":"17855",
2310
+ "2308":"17855",
2311
+ "2309":"17854",
2312
+ "2310":"17853",
2313
+ "2311":"17853",
2314
+ "2312":"17852",
2315
+ "2313":"17852",
2316
+ "2314":"17851",
2317
+ "2315":"17850",
2318
+ "2316":"17849",
2319
+ "2317":"17848",
2320
+ "2318":"17845",
2321
+ "2319":"17845",
2322
  "2320":"17845",
2323
+ "2321":"17844",
2324
+ "2322":"17844",
2325
+ "2323":"17843",
2326
+ "2324":"17842",
2327
+ "2325":"17840",
2328
+ "2326":"17839",
2329
+ "2327":"17835",
2330
+ "2328":"17834",
2331
+ "2329":"17833",
2332
+ "2330":"17833",
2333
+ "2331":"17832",
2334
+ "2332":"17831",
2335
+ "2333":"17831",
2336
+ "2334":"17830",
2337
+ "2335":"17829",
2338
+ "2336":"17828",
2339
+ "2337":"17828",
2340
+ "2338":"17826",
2341
+ "2339":"17826",
2342
+ "2340":"17825",
2343
+ "2341":"17825",
2344
+ "2342":"17824",
2345
+ "2343":"17824",
2346
+ "2344":"17823",
2347
+ "2345":"17823",
2348
+ "2346":"17822",
2349
+ "2347":"17820",
2350
+ "2348":"17819",
2351
+ "2349":"17818",
2352
+ "2350":"17818",
2353
+ "2351":"17816",
2354
+ "2352":"17815",
2355
+ "2353":"17815",
2356
+ "2354":"17814",
2357
+ "2355":"17814",
2358
+ "2356":"17813",
2359
+ "2357":"17812",
2360
+ "2358":"17811",
2361
+ "2359":"17810",
2362
+ "2360":"17809",
2363
+ "2361":"17809",
2364
  "2362":"17809",
2365
+ "2363":"17808",
2366
+ "2364":"17807",
2367
+ "2365":"17806",
2368
+ "2366":"19725",
2369
+ "2367":"17804",
2370
+ "2368":"17804",
2371
+ "2369":"17803",
2372
+ "2370":"17801",
2373
+ "2371":"17801",
2374
+ "2372":"17800",
2375
+ "2373":"17800",
2376
+ "2374":"17799",
2377
+ "2375":"17799",
2378
+ "2376":"17798",
2379
+ "2377":"17795",
2380
+ "2378":"17792",
2381
+ "2379":"17791",
2382
+ "2380":"17789",
2383
+ "2381":"17788",
2384
+ "2382":"17787",
2385
+ "2383":"17787",
2386
+ "2384":"17786",
2387
+ "2385":"17786",
2388
+ "2386":"17785",
2389
+ "2387":"17785",
2390
+ "2388":"17784",
2391
+ "2389":"17784",
2392
+ "2390":"17783",
2393
+ "2391":"17782",
2394
+ "2392":"17781",
2395
+ "2393":"17781",
2396
+ "2394":"17780",
2397
+ "2395":"17780",
2398
+ "2396":"17778",
2399
+ "2397":"17778",
2400
+ "2398":"17777",
2401
+ "2399":"17776",
2402
+ "2400":"17775",
2403
+ "2401":"17774",
2404
+ "2402":"17773",
2405
+ "2403":"17772",
2406
+ "2404":"17770",
2407
+ "2405":"17770",
2408
+ "2406":"17769",
2409
+ "2407":"17769",
2410
+ "2408":"17790",
2411
+ "2409":"17768",
2412
+ "2410":"17767",
2413
+ "2411":"17766",
2414
+ "2412":"17765",
2415
+ "2413":"17793",
2416
+ "2414":"17793",
2417
+ "2415":"17764",
2418
+ "2416":"17794",
2419
+ "2417":"17762",
2420
+ "2418":"17762",
2421
+ "2419":"17761",
2422
+ "2420":"17796",
2423
+ "2421":"17797",
2424
+ "2422":"17797",
2425
+ "2423":"17760",
2426
+ "2424":"17760",
2427
+ "2425":"17759",
2428
+ "2426":"17758",
2429
+ "2427":"17757",
2430
+ "2428":"17757",
2431
+ "2429":"17755",
2432
+ "2430":"17754",
2433
+ "2431":"17753",
2434
+ "2432":"17752",
2435
+ "2433":"17752",
2436
+ "2434":"17751",
2437
+ "2435":"17749",
2438
+ "2436":"17750",
2439
+ "2437":"17750",
2440
+ "2438":"17748",
2441
+ "2439":"17738",
2442
+ "2440":"17737",
2443
+ "2441":"17737",
2444
+ "2442":"17736",
2445
+ "2443":"17735",
2446
+ "2444":"17735",
2447
+ "2445":"17734",
2448
+ "2446":"17733",
2449
+ "2447":"17733",
2450
+ "2448":"17732",
2451
+ "2449":"17732",
2452
+ "2450":"17731",
2453
+ "2451":"17731",
2454
+ "2452":"17730",
2455
+ "2453":"17729",
2456
+ "2454":"17729",
2457
  "2455":"17729",
2458
+ "2456":"17728",
2459
+ "2457":"17727",
2460
+ "2458":"17727",
2461
+ "2459":"17726",
2462
+ "2460":"17725",
2463
+ "2461":"17724",
2464
+ "2462":"17722",
2465
+ "2463":"17721",
2466
+ "2464":"17721",
2467
+ "2465":"17720",
2468
+ "2466":"17719",
2469
+ "2467":"17718",
2470
+ "2468":"17717",
2471
+ "2469":"17716",
2472
+ "2470":"17715",
2473
+ "2471":"17714",
2474
+ "2472":"17713",
2475
+ "2473":"17713",
2476
+ "2474":"17712",
2477
+ "2475":"17712",
2478
+ "2476":"17711",
2479
+ "2477":"17711",
2480
+ "2478":"17710",
2481
+ "2479":"17710",
2482
+ "2480":"17709",
2483
+ "2481":"17708",
2484
+ "2482":"17707",
2485
+ "2483":"17707",
2486
+ "2484":"17706",
2487
+ "2485":"17706",
2488
+ "2486":"17705",
2489
+ "2487":"17704",
2490
+ "2488":"17704",
2491
+ "2489":"17703",
2492
+ "2490":"17702",
2493
+ "2491":"17702",
2494
+ "2492":"17701",
2495
+ "2493":"17699",
2496
+ "2494":"17698",
2497
+ "2495":"17698",
2498
+ "2496":"17697",
2499
+ "2497":"17697",
2500
+ "2498":"17696",
2501
+ "2499":"17695",
2502
+ "2500":"17694",
2503
+ "2501":"17694",
2504
+ "2502":"17692",
2505
+ "2503":"17692",
2506
+ "2504":"17691",
2507
+ "2505":"17690",
2508
+ "2506":"17689",
2509
+ "2507":"17689",
2510
+ "2508":"17688",
2511
+ "2509":"17688",
2512
+ "2510":"17687",
2513
+ "2511":"17686",
2514
+ "2512":"17685",
2515
+ "2513":"17685",
2516
  "2514":"17685",
2517
+ "2515":"17684",
2518
+ "2516":"17683",
2519
+ "2517":"17682",
2520
+ "2518":"17680",
2521
+ "2519":"17680",
2522
+ "2520":"17679",
2523
+ "2521":"17678",
2524
+ "2522":"17677",
2525
+ "2523":"17677",
2526
+ "2524":"17676",
2527
+ "2525":"17674",
2528
+ "2526":"17673",
2529
+ "2527":"17827",
2530
+ "2528":"17672",
2531
+ "2529":"17671",
2532
+ "2530":"17670",
2533
+ "2531":"17669",
2534
+ "2532":"17668",
2535
+ "2533":"17667",
2536
+ "2534":"17666",
2537
+ "2535":"17666",
2538
+ "2536":"17665",
2539
+ "2537":"17664",
2540
+ "2538":"17663",
2541
+ "2539":"17663",
2542
+ "2540":"17661",
2543
+ "2541":"17659",
2544
+ "2542":"17659",
2545
+ "2543":"17658",
2546
+ "2544":"17658",
2547
+ "2545":"17656",
2548
+ "2546":"17655",
2549
+ "2547":"17654",
2550
+ "2548":"17654",
2551
+ "2549":"17653",
2552
+ "2550":"17652",
2553
+ "2551":"17651",
2554
+ "2552":"17650",
2555
+ "2553":"17649",
2556
+ "2554":"17648",
2557
+ "2555":"17647",
2558
+ "2556":"17646",
2559
+ "2557":"17645",
2560
+ "2558":"17637",
2561
+ "2559":"17637",
2562
+ "2560":"17644",
2563
+ "2561":"17644",
2564
+ "2562":"17625",
2565
+ "2563":"17624",
2566
+ "2564":"17624",
2567
+ "2565":"17623",
2568
+ "2566":"17622",
2569
+ "2567":"17622",
2570
+ "2568":"17621",
2571
+ "2569":"17621",
2572
+ "2570":"17620",
2573
+ "2571":"17619",
2574
+ "2572":"17618",
2575
+ "2573":"17617",
2576
+ "2574":"17616",
2577
+ "2575":"17616",
2578
+ "2576":"17615",
2579
+ "2577":"17615",
2580
+ "2578":"17613",
2581
+ "2579":"17612",
2582
+ "2580":"17612",
2583
+ "2581":"17611",
2584
+ "2582":"17609",
2585
+ "2583":"17609",
2586
+ "2584":"17608",
2587
+ "2585":"17608",
2588
  "2586":"17608",
2589
+ "2587":"17607",
2590
+ "2588":"17606",
2591
+ "2589":"17606",
2592
+ "2590":"17605",
2593
+ "2591":"17604",
2594
+ "2592":"17603",
2595
+ "2593":"17603",
2596
+ "2594":"17602",
2597
+ "2595":"17601",
2598
+ "2596":"17600",
2599
+ "2597":"17598",
2600
+ "2598":"17597",
2601
+ "2599":"17596",
2602
+ "2600":"17595",
2603
+ "2601":"17594",
2604
+ "2602":"17594",
2605
+ "2603":"17593",
2606
+ "2604":"17592",
2607
+ "2605":"17591",
2608
+ "2606":"17590",
2609
+ "2607":"17590",
2610
+ "2608":"17589",
2611
+ "2609":"17589",
2612
+ "2610":"17588",
2613
+ "2611":"17587",
2614
+ "2612":"17585",
2615
+ "2613":"17586",
2616
+ "2614":"17586",
2617
+ "2615":"17584",
2618
+ "2616":"17584",
2619
+ "2617":"17583",
2620
+ "2618":"17582",
2621
+ "2619":"17581",
2622
+ "2620":"17580",
2623
+ "2621":"17579",
2624
+ "2622":"17578",
2625
+ "2623":"17577",
2626
+ "2624":"17577",
2627
+ "2625":"17576",
2628
+ "2626":"17576",
2629
+ "2627":"17574",
2630
+ "2628":"17573",
2631
+ "2629":"17572",
2632
+ "2630":"17571",
2633
+ "2631":"17571",
2634
+ "2632":"17570",
2635
+ "2633":"17570",
2636
+ "2634":"17569",
2637
+ "2635":"17568",
2638
+ "2636":"17565",
2639
+ "2637":"17564",
2640
+ "2638":"17563",
2641
+ "2639":"17563",
2642
+ "2640":"17561",
2643
+ "2641":"17560",
2644
+ "2642":"18364",
2645
+ "2643":"17559",
2646
+ "2644":"17558",
2647
+ "2645":"17556",
2648
+ "2646":"17555",
2649
+ "2647":"17555",
2650
  "2648":"17555",
2651
  "2649":"17555",
2652
  "2650":"17555",
2653
+ "2651":"17554",
2654
+ "2652":"17554",
2655
+ "2653":"17552",
2656
+ "2654":"17551",
2657
+ "2655":"17551",
2658
+ "2656":"17548",
2659
+ "2657":"17547",
2660
+ "2658":"17547",
2661
+ "2659":"17546",
2662
+ "2660":"17549",
2663
+ "2661":"17544",
2664
+ "2662":"17544",
2665
+ "2663":"17543",
2666
+ "2664":"17543",
2667
+ "2665":"17542",
2668
+ "2666":"17541",
2669
+ "2667":"17540",
2670
+ "2668":"17539",
2671
+ "2669":"17537",
2672
+ "2670":"17537",
2673
  "2671":"17537",
2674
+ "2672":"17536",
2675
+ "2673":"17535",
2676
+ "2674":"17534",
2677
+ "2675":"17532",
2678
+ "2676":"17531",
2679
+ "2677":"17531",
2680
+ "2678":"17530",
2681
+ "2679":"17529",
2682
+ "2680":"17529",
2683
+ "2681":"17528",
2684
+ "2682":"17527",
2685
+ "2683":"17526",
2686
+ "2684":"17525",
2687
+ "2685":"17525",
2688
+ "2686":"17523",
2689
+ "2687":"17522",
2690
+ "2688":"17520",
2691
+ "2689":"17521",
2692
+ "2690":"17519",
2693
+ "2691":"17519",
2694
+ "2692":"17518",
2695
+ "2693":"17518",
2696
+ "2694":"17517",
2697
+ "2695":"17516",
2698
+ "2696":"17516",
2699
  "2697":"17516",
2700
+ "2698":"17515",
2701
+ "2699":"17515",
2702
+ "2700":"17514",
2703
+ "2701":"17513",
2704
+ "2702":"17512",
2705
+ "2703":"17511",
2706
+ "2704":"17511",
2707
+ "2705":"17510",
2708
+ "2706":"17510",
2709
+ "2707":"17509",
2710
+ "2708":"17509",
2711
+ "2709":"17507",
2712
+ "2710":"17506",
2713
+ "2711":"17505",
2714
+ "2712":"17505",
2715
+ "2713":"17504",
2716
+ "2714":"17503",
2717
+ "2715":"17502",
2718
+ "2716":"17501",
2719
+ "2717":"17500",
2720
+ "2718":"17499",
2721
+ "2719":"17498",
2722
+ "2720":"17497",
2723
+ "2721":"17497",
2724
+ "2722":"17495",
2725
+ "2723":"17495",
2726
+ "2724":"17494",
2727
+ "2725":"17493",
2728
+ "2726":"17492",
2729
+ "2727":"17492",
2730
+ "2728":"17491",
2731
+ "2729":"17491",
2732
+ "2730":"17490",
2733
+ "2731":"17490",
2734
+ "2732":"17553",
2735
+ "2733":"17553",
2736
+ "2734":"17489",
2737
+ "2735":"17488",
2738
+ "2736":"17486",
2739
+ "2737":"17485",
2740
+ "2738":"17487",
2741
+ "2739":"17484",
2742
+ "2740":"17483",
2743
+ "2741":"17483",
2744
+ "2742":"17557",
2745
+ "2743":"17557",
2746
+ "2744":"17482",
2747
+ "2745":"17481",
2748
+ "2746":"17481",
2749
+ "2747":"17480",
2750
+ "2748":"17479",
2751
+ "2749":"17479",
2752
+ "2750":"17478",
2753
+ "2751":"17478",
2754
+ "2752":"17475",
2755
+ "2753":"17475",
2756
+ "2754":"17474",
2757
+ "2755":"17473",
2758
+ "2756":"17473",
2759
+ "2757":"17472",
2760
+ "2758":"17471",
2761
+ "2759":"17470",
2762
+ "2760":"17469",
2763
+ "2761":"17468",
2764
+ "2762":"17467",
2765
+ "2763":"17465",
2766
+ "2764":"17465",
2767
+ "2765":"17464",
2768
+ "2766":"17463",
2769
+ "2767":"17463",
2770
+ "2768":"17462",
2771
+ "2769":"17461",
2772
+ "2770":"17460",
2773
+ "2771":"17460",
2774
  "2772":"17460",
2775
+ "2773":"17459",
2776
+ "2774":"17458",
2777
+ "2775":"17458",
2778
+ "2776":"17457",
2779
+ "2777":"17456",
2780
+ "2778":"17455",
2781
+ "2779":"17455",
2782
+ "2780":"17453",
2783
+ "2781":"17452",
2784
+ "2782":"17451",
2785
+ "2783":"17451",
2786
+ "2784":"17450",
2787
+ "2785":"17450",
2788
+ "2786":"18550",
2789
+ "2787":"18550",
2790
+ "2788":"17449",
2791
+ "2789":"17448",
2792
+ "2790":"17447",
2793
+ "2791":"17447",
2794
+ "2792":"17446",
2795
+ "2793":"17444",
2796
+ "2794":"17445",
2797
+ "2795":"17566",
2798
+ "2796":"17566",
2799
+ "2797":"17443",
2800
+ "2798":"17567",
2801
+ "2799":"17442",
2802
+ "2800":"17441",
2803
+ "2801":"17440",
2804
+ "2802":"17440",
2805
+ "2803":"17439",
2806
+ "2804":"17437",
2807
+ "2805":"17437",
2808
+ "2806":"17436",
2809
+ "2807":"17435",
2810
+ "2808":"17435",
2811
+ "2809":"17433",
2812
+ "2810":"17431",
2813
+ "2811":"17431",
2814
+ "2812":"17430",
2815
+ "2813":"17429",
2816
+ "2814":"17428",
2817
+ "2815":"17428",
2818
+ "2816":"17425",
2819
+ "2817":"17426",
2820
+ "2818":"17424",
2821
+ "2819":"17423",
2822
+ "2820":"17422",
2823
+ "2821":"17422",
2824
+ "2822":"17421",
2825
+ "2823":"17420",
2826
+ "2824":"17419",
2827
+ "2825":"17418",
2828
+ "2826":"17418",
2829
+ "2827":"17417",
2830
+ "2828":"17415",
2831
+ "2829":"17415",
2832
+ "2830":"17413",
2833
+ "2831":"17412",
2834
+ "2832":"17412",
2835
+ "2833":"17411",
2836
+ "2834":"17411",
2837
+ "2835":"17410",
2838
+ "2836":"17846",
2839
+ "2837":"17409",
2840
+ "2838":"17409",
2841
+ "2839":"17407",
2842
+ "2840":"17408",
2843
+ "2841":"17406",
2844
+ "2842":"17406",
2845
+ "2843":"17405",
2846
+ "2844":"17405",
2847
+ "2845":"17402",
2848
+ "2846":"17400",
2849
+ "2847":"17399",
2850
+ "2848":"17399",
2851
+ "2849":"17398",
2852
+ "2850":"17397",
2853
+ "2851":"17397",
2854
+ "2852":"17396",
2855
+ "2853":"17394",
2856
+ "2854":"17393",
2857
+ "2855":"17392",
2858
+ "2856":"17390",
2859
+ "2857":"17390",
2860
+ "2858":"17389",
2861
+ "2859":"17389",
2862
+ "2860":"17385",
2863
+ "2861":"17384",
2864
+ "2862":"17382",
2865
+ "2863":"17383",
2866
+ "2864":"17383",
2867
+ "2865":"17380",
2868
+ "2866":"17380",
2869
+ "2867":"17378",
2870
+ "2868":"17376",
2871
+ "2869":"17373",
2872
+ "2870":"17373",
2873
+ "2871":"17372",
2874
+ "2872":"17371",
2875
+ "2873":"17371",
2876
+ "2874":"17370",
2877
+ "2875":"17369",
2878
+ "2876":"17368",
2879
+ "2877":"17368",
2880
+ "2878":"17367",
2881
+ "2879":"17367",
2882
+ "2880":"17366",
2883
+ "2881":"17366",
2884
+ "2882":"18268",
2885
+ "2883":"18268",
2886
+ "2884":"17859",
2887
+ "2885":"17867",
2888
+ "2886":"17867",
2889
+ "2887":"17889",
2890
+ "2888":"17889",
2891
+ "2889":"17894",
2892
+ "2890":"17923",
2893
+ "2891":"17936",
2894
+ "2892":"17947",
2895
+ "2893":"19331",
2896
+ "2894":"19331",
2897
+ "2895":"17982",
2898
+ "2896":"17998",
2899
+ "2897":"18001",
2900
+ "2898":"18063",
2901
+ "2899":"18792",
2902
+ "2900":"18792",
2903
+ "2901":"18109",
2904
+ "2902":"18109",
2905
+ "2903":"18111",
2906
+ "2904":"18111",
2907
+ "2905":"18149",
2908
+ "2906":"18149",
2909
+ "2907":"18159",
2910
+ "2908":"18159",
2911
+ "2909":"18166",
2912
+ "2910":"18171",
2913
+ "2911":"18171",
2914
+ "2912":"18180",
2915
+ "2913":"18180",
2916
+ "2914":"18210",
2917
+ "2915":"18213",
2918
+ "2916":"18219",
2919
+ "2917":"18219",
2920
+ "2918":"18233",
2921
+ "2919":"18233",
2922
+ "2920":"18484",
2923
+ "2921":"18484",
2924
+ "2922":"18249",
2925
+ "2923":"18249",
2926
+ "2924":"18262",
2927
+ "2925":"18280",
2928
+ "2926":"18280",
2929
+ "2927":"18332",
2930
+ "2928":"18342",
2931
+ "2929":"18362",
2932
+ "2930":"18365",
2933
+ "2931":"18376",
2934
+ "2932":"18387",
2935
+ "2933":"18387",
2936
+ "2934":"18391",
2937
+ "2935":"18391",
2938
+ "2936":"18392",
2939
+ "2937":"18395",
2940
+ "2938":"18417",
2941
+ "2939":"18417",
2942
+ "2940":"18428",
2943
+ "2941":"18438",
2944
+ "2942":"18438",
2945
+ "2943":"18456",
2946
+ "2944":"18457",
2947
+ "2945":"18463",
2948
+ "2946":"18464",
2949
+ "2947":"18467",
2950
+ "2948":"18487",
2951
+ "2949":"18489",
2952
+ "2950":"18489",
2953
+ "2951":"18501",
2954
+ "2952":"18533",
2955
+ "2953":"18536",
2956
+ "2954":"18560",
2957
+ "2955":"18560",
2958
+ "2956":"18584",
2959
+ "2957":"18613",
2960
+ "2958":"18613",
2961
+ "2959":"18620",
2962
+ "2960":"18630",
2963
+ "2961":"18697",
2964
+ "2962":"18701",
2965
+ "2963":"18701",
2966
+ "2964":"18710",
2967
+ "2965":"18722",
2968
+ "2966":"18727",
2969
+ "2967":"18774",
2970
+ "2968":"18788",
2971
+ "2969":"18788",
2972
+ "2970":"18790",
2973
+ "2971":"18839",
2974
+ "2972":"18840",
2975
+ "2973":"18845",
2976
+ "2974":"18858",
2977
+ "2975":"18868",
2978
+ "2976":"18913",
2979
+ "2977":"18913",
2980
+ "2978":"18923",
2981
+ "2979":"18960",
2982
+ "2980":"18960",
2983
+ "2981":"18961",
2984
+ "2982":"18961",
2985
+ "2983":"18966",
2986
+ "2984":"18966",
2987
+ "2985":"18974",
2988
+ "2986":"18981",
2989
+ "2987":"18995",
2990
+ "2988":"18995",
2991
+ "2989":"18997",
2992
+ "2990":"19011",
2993
+ "2991":"19011",
2994
+ "2992":"19022",
2995
+ "2993":"19030",
2996
+ "2994":"19045",
2997
+ "2995":"19045",
2998
+ "2996":"19056",
2999
+ "2997":"19070",
3000
+ "2998":"19070",
3001
+ "2999":"19096",
3002
+ "3000":"19096",
3003
+ "3001":"19131",
3004
+ "3002":"19140",
3005
+ "3003":"19195",
3006
+ "3004":"19204",
3007
+ "3005":"19222",
3008
+ "3006":"19227",
3009
+ "3007":"19227",
3010
+ "3008":"19265",
3011
+ "3009":"19265",
3012
+ "3010":"19278",
3013
+ "3011":"17401",
3014
+ "3012":"17401",
3015
+ "3013":"19307",
3016
+ "3014":"19320",
3017
+ "3015":"19328",
3018
+ "3016":"19328",
3019
+ "3017":"19337",
3020
+ "3018":"19337",
3021
+ "3019":"19351",
3022
+ "3020":"19365",
3023
+ "3021":"19368",
3024
+ "3022":"19420",
3025
+ "3023":"19420",
3026
+ "3024":"19426",
3027
+ "3025":"19426",
3028
+ "3026":"19440",
3029
+ "3027":"19451",
3030
+ "3028":"19451",
3031
+ "3029":"19487",
3032
+ "3030":"19487",
3033
+ "3031":"19504",
3034
+ "3032":"19504",
3035
+ "3033":"19507",
3036
+ "3034":"19516",
3037
+ "3035":"17374",
3038
+ "3036":"17387",
3039
+ "3037":"17391",
3040
+ "3038":"17391",
3041
+ "3039":"17403",
3042
+ "3040":"17403",
3043
+ "3041":"17414",
3044
+ "3042":"17432",
3045
+ "3043":"17432",
3046
+ "3044":"17476",
3047
+ "3045":"17476",
3048
+ "3046":"17477",
3049
+ "3047":"17496",
3050
+ "3048":"18819",
3051
+ "3049":"18819",
3052
+ "3050":"19188",
3053
+ "3051":"19439",
3054
+ "3052":"17538",
3055
+ "3053":"17538",
3056
+ "3054":"18599",
3057
+ "3055":"18599",
3058
+ "3056":"19437",
3059
+ "3057":"17747",
3060
+ "3058":"17747",
3061
+ "3059":"19435",
3062
+ "3060":"19434",
3063
+ "3061":"19434",
3064
+ "3062":"19433",
3065
+ "3063":"19432",
3066
+ "3064":"19432",
3067
+ "3065":"19431",
3068
+ "3066":"19430",
3069
+ "3067":"19429",
3070
+ "3068":"19429",
3071
+ "3069":"19428",
3072
+ "3070":"19428",
3073
+ "3071":"19137",
3074
+ "3072":"19137",
3075
+ "3073":"18168",
3076
+ "3074":"19427",
3077
+ "3075":"19427",
3078
+ "3076":"17746",
3079
+ "3077":"19425",
3080
+ "3078":"19424",
3081
+ "3079":"19423",
3082
+ "3080":"19421",
3083
+ "3081":"19421",
3084
+ "3082":"19793",
3085
+ "3083":"19793",
3086
+ "3084":"19418",
3087
+ "3085":"19417",
3088
+ "3086":"17533",
3089
+ "3087":"19275",
3090
+ "3088":"19350",
3091
+ "3089":"19415",
3092
+ "3090":"18873",
3093
+ "3091":"18873",
3094
+ "3092":"19414",
3095
+ "3093":"19413",
3096
+ "3094":"19412",
3097
+ "3095":"19033",
3098
+ "3096":"19033",
3099
+ "3097":"19411",
3100
+ "3098":"19410",
3101
+ "3099":"18875",
3102
+ "3100":"19408",
3103
+ "3101":"19407",
3104
+ "3102":"19406",
3105
+ "3103":"19406",
3106
+ "3104":"19405",
3107
+ "3105":"17643",
3108
+ "3106":"19496",
3109
+ "3107":"19404",
3110
+ "3108":"19403",
3111
+ "3109":"17836",
3112
+ "3110":"17836",
3113
+ "3111":"18077",
3114
+ "3112":"18077",
3115
+ "3113":"18556",
3116
+ "3114":"18556",
3117
+ "3115":"19526",
3118
+ "3116":"19526",
3119
+ "3117":"17802",
3120
+ "3118":"17802",
3121
+ "3119":"19616",
3122
+ "3120":"19616",
3123
+ "3121":"17365",
3124
+ "3122":"19082",
3125
+ "3123":"19443",
3126
+ "3124":"18058",
3127
+ "3125":"18058",
3128
  "3126":"18058",
3129
+ "3127":"19778",
3130
+ "3128":"17427",
3131
+ "3129":"17427",
3132
+ "3130":"19250",
3133
+ "3131":"17629",
3134
+ "3132":"18691",
3135
+ "3133":"18144",
3136
+ "3134":"18144",
3137
+ "3135":"18327",
3138
+ "3136":"17700",
3139
+ "3137":"17700",
3140
+ "3138":"19449",
3141
+ "3139":"19559",
3142
+ "3140":"18643",
3143
+ "3141":"19546",
3144
+ "3142":"19470",
3145
+ "3143":"18917",
3146
+ "3144":"19773",
3147
+ "3145":"19773",
3148
+ "3146":"18921",
3149
+ "3147":"18921"
 
 
3150
  }
plan.json CHANGED
@@ -37,7 +37,7 @@
37
  "checkpoint": "colbert-ir\/colbertv2.0",
38
  "triples": "\/future\/u\/okhattab\/root\/unit\/experiments\/2021.10\/downstream.distillation.round2.2_score\/round2.nway6.cosine.ib\/examples.64.json",
39
  "collection": [
40
- "list with 3150 elements starting with...",
41
  [
42
  "Image restoration poses a garners substantial interest due to the exponential surge in demands for recovering high-quality images from diverse mobile camera devices, adverse lighting conditions, suboptimal shooting environments, and frequent image compression for efficient transmission purposes. Yet this problem gathers significant challenges as people are blind to the type of restoration the images suffer, which, is usually the case in real-day scenarios and is most urgent to solve for this field. Current research, however, heavily relies on prior knowledge of the restoration type, either explicitly through rules or implicitly through the availability of degraded-clean image pairs to define the restoration process, and consumes considerable effort to collect image pairs of vast degradation types. This paper introduces DreamClean, a training-free method that needs no degradation prior knowledge but yields high-fidelity and generality towards various types of image degradation. DreamClean embeds the degraded image back to the latent of pre-trained diffusion models and re-sample it through a carefully designed diffusion process that mimics those generating clean images. Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors.",
43
  "Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors. DreamClean relies on elegant theoretical supports to assure its convergence to clean image when VPS has appropriate parameters, and also enjoys superior experimental performance over various challenging tasks that could be overwhelming for previous methods when degradation prior is unavailable.",
@@ -50,7 +50,7 @@
50
  "root": ".ragatouille\/",
51
  "experiment": "colbert",
52
  "index_root": null,
53
- "name": "2024-04\/04\/04.50.55",
54
  "rank": 0,
55
  "nranks": 1,
56
  "amp": true,
@@ -59,6 +59,6 @@
59
  },
60
  "num_chunks": 1,
61
  "num_partitions": 8192,
62
- "num_embeddings_est": 547569.9851989746,
63
- "avg_doclen_est": 173.8317413330078
64
  }
 
37
  "checkpoint": "colbert-ir\/colbertv2.0",
38
  "triples": "\/future\/u\/okhattab\/root\/unit\/experiments\/2021.10\/downstream.distillation.round2.2_score\/round2.nway6.cosine.ib\/examples.64.json",
39
  "collection": [
40
+ "list with 3148 elements starting with...",
41
  [
42
  "Image restoration poses a garners substantial interest due to the exponential surge in demands for recovering high-quality images from diverse mobile camera devices, adverse lighting conditions, suboptimal shooting environments, and frequent image compression for efficient transmission purposes. Yet this problem gathers significant challenges as people are blind to the type of restoration the images suffer, which, is usually the case in real-day scenarios and is most urgent to solve for this field. Current research, however, heavily relies on prior knowledge of the restoration type, either explicitly through rules or implicitly through the availability of degraded-clean image pairs to define the restoration process, and consumes considerable effort to collect image pairs of vast degradation types. This paper introduces DreamClean, a training-free method that needs no degradation prior knowledge but yields high-fidelity and generality towards various types of image degradation. DreamClean embeds the degraded image back to the latent of pre-trained diffusion models and re-sample it through a carefully designed diffusion process that mimics those generating clean images. Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors.",
43
  "Thanks to the rich image prior in diffusion models and our novel Variance Preservation Sampling (VPS) technique, DreamClean manages to handle various different degradation types at one time and reaches far more satisfied final quality than previous competitors. DreamClean relies on elegant theoretical supports to assure its convergence to clean image when VPS has appropriate parameters, and also enjoys superior experimental performance over various challenging tasks that could be overwhelming for previous methods when degradation prior is unavailable.",
 
50
  "root": ".ragatouille\/",
51
  "experiment": "colbert",
52
  "index_root": null,
53
+ "name": "2024-05\/04\/00.30.44",
54
  "rank": 0,
55
  "nranks": 1,
56
  "amp": true,
 
59
  },
60
  "num_chunks": 1,
61
  "num_partitions": 8192,
62
+ "num_embeddings_est": 547258.0114746094,
63
+ "avg_doclen_est": 173.84307861328125
64
  }
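
For anyone cross-checking this commit locally, here is a minimal sketch (not part of the commit itself) that verifies the updated figures against the files shown above. It assumes the index files have been downloaded into a local directory, here called `index/`; the path is illustrative, but the file names (`plan.json`, `pid_docid_map.json`) and the keys read below appear in the diffs on this page.

```python
import json
from pathlib import Path

# Assumed local copy of the index directory from this commit (path is illustrative).
index_dir = Path("index")

# plan.json holds the indexing plan, including num_partitions and the
# estimated embedding count / average passage length.
plan = json.loads((index_dir / "plan.json").read_text())

# pid_docid_map.json is a flat mapping from passage id (string) to document id (string).
pid_docid_map = json.loads((index_dir / "pid_docid_map.json").read_text())

num_passages = len(pid_docid_map)            # expected: 3148 after this commit
num_docids = len(set(pid_docid_map.values()))

print(f"passages: {num_passages}, distinct documents: {num_docids}")
print(f"avg_doclen_est:     {plan['avg_doclen_est']:.2f}")
print(f"num_embeddings_est: {plan['num_embeddings_est']:.0f}")

# Rough consistency check: estimated embeddings ~= passages * average passage length.
# With this commit's figures, 3148 * 173.843... ~= 547258, matching 0.metadata.json.
approx = num_passages * plan["avg_doclen_est"]
assert abs(approx - plan["num_embeddings_est"]) < 1.0, "embedding estimate mismatch"
```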