Upload folder using huggingface_hub

Files changed:
- 0.codes.pt +2 -2
- 0.metadata.json +2 -2
- 0.residuals.pt +2 -2
- buckets.pt +1 -1
- centroids.pt +1 -1
- collection.json +4 -1
- doclens.0.json +1 -1
- ivf.pid.pt +2 -2
- metadata.json +4 -4
- pid_docid_map.json +4 -1
- plan.json +4 -4
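A commit like this can be produced with the folder-upload API in huggingface_hub. A minimal sketch, assuming a placeholder repo id and local index path (neither appears in this commit):

    from huggingface_hub import HfApi

    api = HfApi()
    # Uploads every file in the folder as a single commit; large binaries
    # (the .pt files below) are stored via Git LFS automatically.
    api.upload_folder(
        repo_id="user/colbert-plaid-index",                # placeholder
        folder_path=".ragatouille/colbert/indexes/index",  # placeholder
        repo_type="model",
        commit_message="Upload folder using huggingface_hub",
    )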
0.codes.pt CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:af2a90e4343caa81ef612086922c4e5327a6a7e7538cff0b52a4f801905b1652
+size 2705500
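The .pt files are Git LFS pointers: the repository itself stores only the version/oid/size triplet, while the binary lives in LFS storage. A downloaded copy can be checked against its pointer; a minimal sketch using only the standard library, with the oid and size taken from the new 0.codes.pt pointer above:

    import hashlib
    from pathlib import Path

    def matches_pointer(path: str, oid: str, size: int) -> bool:
        # An LFS pointer records the SHA-256 and byte size of the real file.
        data = Path(path).read_bytes()
        return len(data) == size and hashlib.sha256(data).hexdigest() == oid

    print(matches_pointer(
        "0.codes.pt",
        "af2a90e4343caa81ef612086922c4e5327a6a7e7538cff0b52a4f801905b1652",
        2705500,
    ))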
0.metadata.json CHANGED
@@ -1,6 +1,6 @@
 {
 "passage_offset": 0,
-"num_passages":
-"num_embeddings":
+"num_passages": 3936,
+"num_embeddings": 676085,
 "embedding_offset": 0
 }
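These two values are consistent with the avg_doclen recorded in metadata.json further down: 676085 embeddings spread over 3936 passages is about 171.77 token embeddings per passage. A one-line check:

    # 676085 / 3936 ~= 171.7695630081, the avg_doclen stored in metadata.json
    print(676085 / 3936)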
0.residuals.pt CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:65442c87d1096b528d3efe2237e2f247e12763205b56cddf2ca05ca36775d596
+size 86540080
buckets.pt CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:f912cb06aeda038490a940ef593e3f21632a39fbd5a2310089f79b7b63e06048
 size 2904
centroids.pt CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:57686b8aba67af6bb8be62a3be72ecfd815ac0bea57bd7cb4ef28dcf33c75f83
 size 2098342
collection.json CHANGED
@@ -3931,5 +3931,8 @@
 "Methods of computational quantum chemistry provide accurate approximations of molecular properties crucial for computer-aided drug discovery and other areas of chemical science. However, high computational complexity limits the scalability of their applications. Neural network potentials (NNPs) are a promising alternative to quantum chemistry methods, but they require large and diverse datasets for training. This work presents a new dataset and benchmark called nabla^2DFT that is based on the nablaDFT. It contains twice as much molecular structures, three times more conformations, new data types and tasks, and state-of-the-art models. The dataset includes energies, forces, 17 molecular properties, Hamiltonian and overlap matrices, and a wavefunction object. All calculations were performed at the DFT level (omegaB97X-D/def2-SVP) for each conformation. Moreover, nabla^2DFT is the first dataset that contains relaxation trajectories for a substantial number of drug-like molecules. We also introduce a novel benchmark for evaluating NNPs in molecular property prediction, Hamiltonian prediction, and conformational optimization tasks. Finally, we propose an extendable framework for training NNPs and implement 10 models within it.",
 "Human priors play a crucial role in efficiently utilizing data in deep learning. However, with the development of large language models (LLMs), there is an increasing emphasis on scaling both model size and data volume, which often diminishes the importance of human priors in data construction. Influenced by these trends, existing Small Language Models (SLMs) mainly rely on web-scraped large-scale training data, neglecting the proper incorporation of human priors. This oversight limits the training efficiency of language models in resource-constrained settings. In this paper, we propose a principle to leverage human priors for data construction. This principle emphasizes achieving high-performance SLMs by training on a concise dataset that accommodates both semantic diversity and data quality consistency, while avoiding benchmark data leakage. Following this principle, we train an SLM named HARE-1.1B. Extensive experiments on large-scale benchmark datasets demonstrate that HARE-1.1B performs favorably against state-of-the-art SLMs, validating the effectiveness of the proposed principle. Additionally, this provides new insights into efficient language model training in resource-constrained environments from the view of human priors.",
 "Diffusion distillation represents a highly promising direction for achieving faithful text-to-image generation in a few sampling steps. However, despite recent successes, existing distilled models still do not provide the full spectrum of diffusion abilities, such as real image inversion, which enables many precise image manipulation methods. This work aims to enrich distilled text-to-image diffusion models with the ability to effectively encode real images into their latent space. To this end, we introduce invertible Consistency Distillation (iCD), a generalized consistency distillation framework that facilitates both high-quality image synthesis and accurate image encoding in only 3-4 inference steps. Though the inversion problem for text-to-image diffusion models gets exacerbated by high classifier-free guidance scales, we notice that dynamic guidance significantly reduces reconstruction errors without noticeable degradation in generation performance. As a result, we demonstrate that iCD equipped with dynamic guidance may serve as a highly effective tool for zero-shot text-guided image editing, competing with more expensive state-of-the-art alternatives.",
-"In this paper, we introduce a novel low-latency inference framework for large language models (LLMs) inference which enables LLMs to perform inferences with incomplete prompts. By reallocating computational processes to prompt input phase, we achieve a substantial reduction in latency, thereby significantly enhancing the interactive experience for users of LLMs. The framework adeptly manages the visibility of the streaming prompt to the model, allowing it to infer from incomplete prompts or await additional prompts. Compared with traditional inference methods that utilize complete prompts, our approach demonstrates an average reduction of 59% in response latency on the MMLU-Pro dataset, while maintaining comparable accuracy. Additionally, our framework facilitates collaborative inference and output across different models. By employing an LLM for inference and a small language model (SLM) for output, we achieve an average 68% reduction in response latency, alongside a 5.5% improvement in accuracy on the MMLU-Pro dataset compared with the SLM baseline. For long prompts exceeding 20 sentences, the response latency can be reduced by up to 93%."
+"In this paper, we introduce a novel low-latency inference framework for large language models (LLMs) inference which enables LLMs to perform inferences with incomplete prompts. By reallocating computational processes to prompt input phase, we achieve a substantial reduction in latency, thereby significantly enhancing the interactive experience for users of LLMs. The framework adeptly manages the visibility of the streaming prompt to the model, allowing it to infer from incomplete prompts or await additional prompts. Compared with traditional inference methods that utilize complete prompts, our approach demonstrates an average reduction of 59% in response latency on the MMLU-Pro dataset, while maintaining comparable accuracy. Additionally, our framework facilitates collaborative inference and output across different models. By employing an LLM for inference and a small language model (SLM) for output, we achieve an average 68% reduction in response latency, alongside a 5.5% improvement in accuracy on the MMLU-Pro dataset compared with the SLM baseline. For long prompts exceeding 20 sentences, the response latency can be reduced by up to 93%.",
+"One core capability of large language models (LLMs) is to follow natural language instructions. However, the issue of automatically constructing high-quality training data to enhance the complex instruction-following abilities of LLMs without manual annotation remains unresolved. In this paper, we introduce AutoIF, the first scalable and reliable method for automatically generating instruction-following training data. AutoIF transforms the validation of instruction-following data quality into code verification, requiring LLMs to generate instructions, the corresponding code to check the correctness of the instruction responses, and unit test samples to verify the code's correctness. Then, execution feedback-based rejection sampling can generate data for Supervised Fine-Tuning (SFT) and Reinforcement Learning from Human Feedback (RLHF) training. AutoIF achieves significant improvements across three training algorithms, SFT, Offline DPO, and Online DPO, when applied to the top open-source LLMs, Qwen2 and LLaMA3, in self-alignment and strong-to-weak distillation settings. Our code is publicly available at https://github.com/QwenLM/AutoIF.",
+"Commonsense reasoning is fundamentally based on multimodal knowledge. However, existing large language models (LLMs) are primarily trained using textual data only, limiting their ability to incorporate essential visual information. In contrast, Visual Language Models, which excel at visually-oriented tasks, often fail at non-visual tasks such as basic commonsense reasoning. This divergence highlights a critical challenge - the integration of robust visual understanding with foundational text-based language reasoning. To this end, we introduce a method aimed at enhancing LLMs' visual commonsense. Specifically, our method generates multiple images based on the input text prompt and integrates these into the model's decision-making process by mixing their prediction probabilities. To facilitate multimodal grounded language modeling, we employ a late-fusion layer that combines the projected visual features with the output of a pre-trained LLM conditioned on text only. This late-fusion layer enables predictions based on comprehensive image-text knowledge as well as text only when this is required. We evaluate our approach using several visual commonsense reasoning tasks together with traditional NLP tasks, including common sense reasoning and reading comprehension. Our experimental results demonstrate significant superiority over existing baselines.",
+"This late-fusion layer enables predictions based on comprehensive image-text knowledge as well as text only when this is required. We evaluate our approach using several visual commonsense reasoning tasks together with traditional NLP tasks, including common sense reasoning and reading comprehension. Our experimental results demonstrate significant superiority over existing baselines. When applied to recent state-of-the-art LLMs (e.g., Llama3), we observe improvements not only in visual common sense but also in traditional NLP benchmarks. Code and models are available under https://github.com/guyyariv/vLMIG."
 ]
doclens.0.json CHANGED
@@ -1 +1 @@
-[178,205,218,148,184,163,221,185,200,228,172,155,210,222,88,206,226,67,132,212,91,206,104,212,174,205,132,159,230,175,216,198,227,190,212,198,122,213,169,204,92,197,118,191,191,224,69,219,197,72,218,77,175,111,155,217,220,170,231,91,221,217,95,146,177,123,195,205,151,209,207,36,202,200,226,176,232,53,167,199,89,184,213,104,154,153,216,214,215,174,205,72,211,78,221,212,232,223,73,158,220,158,202,222,189,165,205,175,222,132,126,179,219,110,209,158,208,98,176,192,226,34,158,205,126,178,224,182,227,100,152,191,169,195,163,172,208,117,199,217,167,217,157,163,194,217,200,217,23,221,209,146,150,204,200,125,215,232,68,147,212,41,223,178,152,173,210,139,198,182,196,207,95,176,205,223,83,216,85,207,210,52,177,178,230,197,119,226,99,182,210,212,77,138,199,123,179,111,219,69,223,65,204,215,83,197,87,211,132,216,135,178,157,166,216,85,170,195,208,190,175,134,220,200,67,221,211,66,227,222,226,190,209,205,67,207,139,208,127,186,205,168,221,179,223,117,148,221,216,80,189,125,199,202,64,218,77,195,190,221,181,98,143,214,220,97,187,127,219,122,216,87,138,212,194,112,219,227,101,220,100,164,234,109,221,102,223,89,184,205,219,77,188,223,172,171,175,152,175,137,213,197,114,205,221,138,181,174,227,73,147,144,178,147,215,152,182,204,80,210,123,211,121,209,224,224,219,211,163,133,187,148,151,163,221,94,133,213,72,187,224,216,162,154,224,184,118,204,220,154,117,220,162,202,223,195,110,197,151,224,88,182,217,221,214,118,218,118,164,205,97,221,183,154,206,197,74,170,219,103,230,215,192,224,78,184,72,201,227,221,191,181,104,190,224,221,99,123,206,102,228,202,74,195,96,225,176,232,231,96,225,206,61,173,101,190,211,90,213,199,203,184,209,173,160,207,203,99,201,195,132,195,214,148,211,227,192,215,212,127,162,213,114,178,111,207,71,129,182,212,75,176,209,137,213,224,220,70,232,116,228,179,200,184,222,93,202,71,219,123,213,119,204,173,135,118,207,96,216,107,210,202,189,172,211,76,146,195,169,83,227,62,219,223,125,158,202,226,98,214,225,76,156,211,204,110,190,224,200,64,228,223,72,213,102,214,178,219,128,211,127,187,213,160,216,134,202,150,186,201,195,230,199,206,193,205,219,216,84,213,223,192,222,109,187,209,79,153,204,224,72,167,190,183,183,228,133,227,103,197,225,203,205,201,207,213,210,78,138,195,133,195,180,190,185,207,189,100,191,229,198,129,226,135,205,164,226,190,73,200,231,209,216,147,221,228,165,213,209,181,157,222,117,216,58,148,219,92,222,219,63,191,218,186,197,158,172,204,109,204,102,223,229,91,192,217,186,204,144,173,209,95,186,222,225,178,183,222,71,193,182,232,63,227,228,82,209,162,219,183,222,122,212,195,60,219,130,223,69,216,111,198,224,80,223,74,164,148,227,193,216,139,207,198,148,227,199,169,230,225,86,138,165,206,91,213,164,215,229,173,216,210,92,166,205,102,196,78,202,74,187,202,229,191,222,57,206,115,183,94,225,173,198,109,212,224,126,215,44,221,80,196,95,177,218,217,198,109,218,191,221,214,227,71,207,123,206,90,219,218,108,181,214,123,150,204,214,208,225,199,191,192,213,93,217,149,178,199,106,204,80,208,60,131,132,232,218,213,115,202,94,175,227,217,114,200,222,178,180,185,192,193,205,173,226,217,222,197,184,205,211,136,214,221,62,213,89,169,184,135,198,98,216,225,71,230,133,213,154,216,189,122,224,108,204,88,207,227,93,191,204,214,66,153,219,209,203,218,237,229,222,148,174,218,107,214,169,223,216,173,220,148,199,188,175,203,142,188,210,98,167,174,224,103,156,97,220,111,216,70,130,221,216,115,166,164,162,210,216,69,173,203,77,216,175,172,210,98,199,213,88,221,82,184,131,201,220,133,211,116,156,219,213,227,128,221,237,172,217,64,180,221,70,169,220,216,134,215,177,190,197,229,221,226,74,189,190,219,86,222,74,213,106,184,182,205,221,229,143,201,198,162,209,128,221,209,60,216,146,228,106,215,154,214,202,82,194,141,168,217,213,80,140,114,220,149,229,215,150,211,170,199,109,218,108,206,125,204,164,183,148,193,153,188,143,160,199,210,124,225,105,178,75,222,112,205,75,199,83,222,215,158,104,205,70,215,223,158,220,99,124,194,91,219,65,209,149,234,188,222,218,200,66,208,101,208,188,201,207,198,160,205,219,118,170,195,141,128,182,189,211,51,179,216,112,152,196,224,80,223,177,218,127,186,113,151,117,159,184,200,231,212,134,222,68,216,204,90,151,202,158,173,146,144,209,222,89,171,188,206,215,117,191,223,80,215,68,151,159,210,74,208,105,204,105,174,228,67,147,202,214,41,222,163,183,142,227,188,200,208,132,157,191,195,94,166,201,155,214,132,185,195,165,214,219,156,217,100,147,220,105,220,223,225,117,176,166,207,134,218,202,172,183,217,76,229,139,217,87,208,181,214,169,154,199,231,208,200,157,213,222,49,207,191,139,219,218,82,122,154,137,190,176,192,180,231,80,208,136,185,112,189,187,192,208,219,110,208,102,186,225,220,218,235,214,96,133,198,199,95,152,83,219,190,124,148,222,213,207,145,217,168,200,32,208,198,171,205,218,104,218,136,230,77,139,178,212,214,129,211,233,93,221,95,155,153,221,151,201,108,193,216,172,208,72,177,217,208,225,60,81,172,234,180,146,184,180,199,222,224,182,106,183,105,221,208,81,221,137,211,44,198,224,205,82,187,164,147,213,163,229,192,219,218,190,228,221,218,116,231,97,182,126,213,193,173,207,103,207,122,161,156,193,153,192,92,198,155,223,224,106,223,170,177,225,40,206,203,221,152,208,116,203,105,197,221,65,226,111,224,215,215,142,154,211,84,184,135,169,212,213,76,229,225,130,207,85,209,177,116,140,150,229,209,105,164,186,214,219,62,148,203,230,118,207,216,217,211,114,201,134,217,87,197,124,209,186,184,164,189,163,212,85,219,106,209,220,168,221,212,112,189,183,229,200,192,106,224,114,220,95,141,223,222,219,226,79,187,202,212,222,193,128,223,209,160,178,197,222,111,152,114,164,206,179,189,183,122,212,149,150,130,214,115,195,118,204,208,120,195,163,159,220,125,222,174,211,144,140,220,80,173,204,87,215,149,227,97,229,221,119,213,139,212,82,215,68,213,136,186,168,202,85,213,93,168,218,63,222,121,213,90,218,94,197,221,182,220,112,224,221,83,215,194,138,181,197,221,91,186,194,99,179,213,210,76,188,214,216,79,213,92,220,94,201,108,213,131,221,153,219,74,159,172,218,217,219,108,209,35,217,112,157,206,148,211,126,214,107,191,105,164,224,220,126,204,201,211,119,203,129,165,205,130,218,137,211,178,204,215,206,119,209,75,193,182,115,178,207,217,220,184,199,167,189,230,140,208,167,173,195,136,220,205,64,186,159,194,75,213,189,110,189,208,120,176,182,162,196,183,137,134,198,172,193,184,172,212,189,214,200,95,167,199,110,195,116,180,204,66,217,126,222,226,193,166,143,121,218,224,196,140,226,142,206,66,214,140,218,144,201,221,152,204,154,221,101,225,69,216,212,128,181,200,180,206,136,193,226,226,219,141,168,202,209,105,178,128,208,197,145,163,218,124,220,147,220,156,211,69,212,226,132,223,165,211,110,229,194,193,204,208,90,218,106,152,215,134,180,182,204,211,80,211,108,204,102,208,102,192,94,191,206,203,140,165,213,197,105,186,186,210,62,208,165,213,174,186,115,211,164,193,128,201,169,217,167,217,174,222,167,129,170,217,218,182,159,113,208,85,219,126,219,89,172,213,212,69,205,135,124,205,67,208,52,186,160,225,125,190,65,227,168,180,166,151,194,205,88,138,204,110,124,223,80,193,106,179,183,163,220,148,205,218,200,221,198,202,190,213,87,227,239,81,213,115,211,219,201,204,105,216,83,210,95,170,219,197,86,228,86,209,125,231,83,198,194,171,144,183,222,216,216,226,204,109,216,115,180,227,116,207,223,84,190,215,117,212,79,214,230,218,134,208,92,209,99,181,220,61,212,85,199,185,174,171,220,221,57,146,224,214,81,183,218,152,163,193,72,203,111,130,222,204,217,229,209,201,189,220,213,72,192,214,221,157,197,124,209,100,223,162,223,73,85,221,68,210,46,205,76,118,160,180,200,85,218,34,196,170,204,88,168,224,71,188,207,166,181,210,101,148,215,76,219,217,104,199,180,210,208,69,201,202,216,224,89,206,88,128,125,196,203,106,183,182,84,228,200,120,171,166,153,215,210,82,210,65,163,131,213,83,180,205,92,152,228,99,212,212,209,94,215,218,95,206,179,236,219,204,102,196,110,160,218,140,228,102,218,177,172,193,95,209,215,203,220,78,213,191,189,168,204,218,192,89,213,223,208,230,168,202,169,222,56,155,203,73,197,156,208,222,127,212,223,160,217,201,223,205,217,100,221,208,143,218,22,208,224,86,209,90,220,144,182,139,219,79,194,183,225,216,209,190,102,221,220,203,152,215,113,230,106,212,221,106,197,111,185,211,83,162,200,208,219,124,210,92,161,216,221,79,208,166,211,142,179,222,197,79,185,226,182,193,195,170,202,113,192,231,228,197,180,152,223,155,215,97,213,85,205,145,200,89,209,201,139,217,163,133,206,174,112,205,213,217,123,201,136,132,221,206,105,200,139,183,160,215,134,221,208,181,208,101,210,225,116,183,217,80,186,120,162,194,201,97,188,209,174,122,229,214,205,120,218,228,228,83,134,201,119,171,210,55,227,107,225,64,174,203,130,213,226,95,206,157,139,154,171,139,224,219,154,190,206,218,217,221,86,135,217,213,183,118,187,186,128,212,79,184,186,156,188,134,216,172,155,195,210,176,223,125,128,217,213,214,217,94,189,192,133,167,85,187,217,60,158,221,122,217,117,224,95,206,89,199,74,205,210,70,209,83,196,77,169,188,228,114,188,105,229,182,174,223,59,181,205,127,225,222,128,190,219,112,197,115,222,205,115,184,214,210,238,196,212,85,207,86,183,215,178,187,112,194,176,212,90,217,80,164,179,187,138,221,214,203,96,227,198,82,209,207,96,208,76,228,224,205,207,186,178,221,224,189,143,229,182,229,132,213,201,186,218,67,222,221,83,216,44,220,68,214,103,137,210,48,210,211,88,222,169,225,155,181,200,207,62,219,152,179,130,215,86,183,90,202,164,179,205,128,219,224,148,190,117,213,221,184,175,146,223,216,82,230,92,201,231,218,172,207,209,189,150,216,142,226,46,197,219,92,228,168,216,111,221,204,216,190,180,189,206,108,204,101,224,74,223,65,181,142,211,203,122,220,214,66,179,185,222,116,120,209,222,101,172,209,232,56,223,177,212,107,191,168,193,193,105,212,105,163,209,191,117,194,170,211,190,121,221,90,161,207,67,226,90,216,60,171,219,89,220,184,214,194,144,182,214,208,196,133,226,214,77,190,215,63,115,208,92,227,204,68,197,77,178,188,204,216,202,194,152,175,181,208,131,196,223,128,177,116,207,102,220,121,216,199,202,207,74,211,196,217,177,214,218,221,93,219,192,215,143,139,183,226,83,102,219,211,100,194,47,119,204,160,143,180,149,183,226,96,172,212,186,198,207,182,207,221,164,146,224,61,200,140,147,196,214,60,226,215,111,216,186,180,161,234,196,124,208,214,221,189,178,112,168,208,136,169,181,202,124,201,74,211,143,192,175,119,163,202,67,191,172,148,179,223,122,217,117,180,190,186,209,72,193,182,218,80,156,138,201,202,218,208,149,190,109,224,64,192,114,219,199,210,208,97,232,205,197,188,189,87,213,152,196,164,131,225,94,219,205,163,191,172,196,189,206,110,201,73,191,122,210,173,208,73,221,136,222,185,224,74,213,86,185,221,170,210,69,165,156,210,102,211,210,221,197,209,154,127,212,180,208,139,231,207,50,166,96,150,158,206,228,214,202,95,162,220,191,136,217,54,155,201,140,179,191,91,213,220,74,145,216,232,45,208,217,209,182,160,182,100,221,155,219,227,160,180,209,147,174,212,67,209,82,213,148,208,51,176,88,210,71,117,174,218,82,211,188,170,210,186,136,176,220,157,189,167,190,212,160,212,135,201,219,49,162,165,209,117,175,213,152,176,220,221,124,150,204,220,73,228,194,218,57,195,173,159,173,175,206,176,229,79,164,201,203,152,214,116,137,219,66,222,214,80,175,202,90,225,113,219,206,78,190,89,214,50,209,154,188,227,194,157,195,74,186,206,130,198,73,212,60,204,122,222,99,205,196,229,213,83,230,108,171,126,192,104,216,207,117,217,197,214,79,207,110,221,79,217,144,206,160,206,172,197,183,207,217,207,113,210,221,71,161,221,164,227,214,142,177,185,180,103,130,198,123,205,74,216,102,219,160,217,75,204,114,192,213,166,188,118,222,227,92,195,219,161,200,221,69,203,143,198,198,217,198,66,212,50,208,116,199,125,210,207,167,225,116,207,97,184,99,220,184,203,184,219,177,167,202,214,55,207,161,197,122,212,226,187,96,216,201,188,135,224,207,139,225,230,220,121,221,107,212,66,170,169,210,199,102,220,94,159,184,207,92,207,231,214,125,227,220,205,58,193,203,215,223,229,78,196,170,185,196,162,234,56,201,123,171,231,196,86,162,199,213,220,68,200,68,205,88,225,135,220,82,182,215,222,79,152,230,62,162,218,184,224,67,206,99,189,124,214,197,73,204,105,221,179,102,218,232,80,214,181,170,204,165,216,207,217,212,195,176,215,106,192,160,221,182,217,57,211,88,198,233,113,171,204,138,193,209,225,59,176,184,134,223,151,193,200,217,100,225,79,180,142,190,123,222,80,232,216,133,216,148,211,110,198,96,187,224,95,208,112,178,227,94,171,96,181,209,170,225,196,206,94,216,87,217,171,191,82,218,127,227,176,219,207,230,79,214,203,105,213,143,174,188,125,193,220,60,215,172,214,101,211,110,161,117,187,180,125,218,220,62,208,203,217,87,198,156,216,226,161,161,223,224,72,178,198,213,195,219,208,140,175,217,74,201,201,66,186,154,229,89,226,169,204,87,184,85,161,133,201,80,176,188,114,224,77,207,126,202,83,219,200,125,172,169,190,216,80,88,221,68,218,133,216,117,217,157,217,170,190,124,214,210,156,231,84,207,204,113,200,70,222,162,208,227,92,223,136,167,195,221,221,77,173,213,109,214,117,211,217,89,217,91,210,152,194,206,202,110,216,177,190,207,227,185,172,230,172,207,171,199,234,207,149,194,192,179,212,209,210,101,198,225,85,164,211,110,194,182,211,224,65,228,218,79,224,81,122,208,154,129,206,92,193,171,148,188,221,80,220,161,165,166,161,214,99,210,64,174,224,221,105,200,122,230,216,94,223,128,225,161,219,126,187,137,191,222,214,148,151,198,218,210,110,208,228,184,211,35,202,218,195,216,115,212,95,177,199,101,184,208,202,212,134,193,129,192,81,182,223,70,226,230,134,167,183,198,222,227,227,226,63,213,109,187,177,219,223,203,144,179,209,103,177,181,158,221,90,222,166,207,175,230,207,99,205,234,210,210,168,223,143,210,187,209,204,150,209,213,208,193,221,214,77,215,199,81,197,82,177,190,210,231,79,179,221,64,182,199,82,204,204,95,172,187,178,209,86,222,220,118,192,223,88,220,77,174,104,224,137,182,186,96,207,198,74,152,196,217,206,79,214,208,204,180,94,215,81,177,160,201,164,173,205,76,199,220,228,91,215,155,226,79,133,181,136,182,226,96,221,109,209,223,71,202,95,217,87,202,204,183,210,187,212,81,226,184,224,88,170,214,198,226,142,212,81,209,189,172,192,221,216,123,221,126,204,218,222,76,205,73,225,221,73,204,108,201,88,174,197,136,223,90,189,56,207,147,206,212,73,201,83,204,112,137,227,67,208,137,219,225,65,200,186,99,214,97,215,74,203,65,199,216,108,216,80,206,219,104,226,180,225,199,186,197,226,157,102,177,107,231,156,141,226,70,220,216,223,64,214,66,201,174,170,207,46,202,131,173,218,125,217,157,234,192,159,174,209,95,196,224,59,220,69,211,130,203,222,88,208,86,198,127,219,228,75,218,170,168,198,128,215,54,211,167,186,117,211,162,221,219,105,223,99,223,127,202,218,213,143,194,181,200,180,230,224,97,181,132,173,202,221,57,151,220,77,220,160,206,188,101,197,72,213,95,193,212,189,105,226,100,205,201,56,211,93,178,212,88,208,83,213,165,219,183,236,121,220,210,94,212,171,186,218,137,212,129,175,203,223,134,194,95,193,191,105,229,208,102,196,120,191,221,217,65,206,200,74,168,180,199,217,119,223,68,211,125,204,105,180,164,215,227,128,211,166,218,86,185,74,214,57,200,171,111,185,73,199,220,213,192,216,107,211,115,219,227,192,221,101,203,65,211,51,216,84,193,121,214,86,195,115,179,229,90,215,92,207,63,179,212,38,202,104,182,125,179,99,147,184,210,166,227,232,164,120,218,169,203,154,192,224,217,122,160,205,206,221,80,191,217,166,202,78,206,147,202,155,195,76,204,136,191,112,195,160,147,226,91,224,216,212,177,188,165,174,130,203,221,220,133,209,147,216,69,159,155,143,213,94,227,139,209,163,183,199,112,217,213,98,217,96,185,158,173,229,51,209,195,227,214,161,213,83,168,229,209,118,221,224,59,179,161,220,209,193,199]
+[178,205,218,148,184,163,221,185,200,228,172,155,210,222,88,206,226,67,132,212,91,206,104,212,174,205,132,159,230,175,216,198,227,190,212,198,122,213,169,204,92,197,118,191,191,224,69,219,197,72,218,77,175,111,155,217,220,170,231,91,221,217,95,146,177,123,195,205,151,209,207,36,202,200,226,176,232,53,167,199,89,184,213,104,154,153,216,214,215,174,205,72,211,78,221,212,232,223,73,158,220,158,202,222,189,165,205,175,222,132,126,179,219,110,209,158,208,98,176,192,226,34,158,205,126,178,224,182,227,100,152,191,169,195,163,172,208,117,199,217,167,217,157,163,194,217,200,217,23,221,209,146,150,204,200,125,215,232,68,147,212,41,223,178,152,173,210,139,198,182,196,207,95,176,205,223,83,216,85,207,210,52,177,178,230,197,119,226,99,182,210,212,77,138,199,123,179,111,219,69,223,65,204,215,83,197,87,211,132,216,135,178,157,166,216,85,170,195,208,190,175,134,220,200,67,221,211,66,227,222,226,190,209,205,67,207,139,208,127,186,205,168,221,179,223,117,148,221,216,80,189,125,199,202,64,218,77,195,190,221,181,98,143,214,220,97,187,127,219,122,216,87,138,212,194,112,219,227,101,220,100,164,234,109,221,102,223,89,184,205,219,77,188,223,172,171,175,152,175,137,213,197,114,205,221,138,181,174,227,73,147,144,178,147,215,152,182,204,80,210,123,211,121,209,224,224,219,211,163,133,187,148,151,163,221,94,133,213,72,187,224,216,162,154,224,184,118,204,220,154,117,220,162,202,223,195,110,197,151,224,88,182,217,221,214,118,218,118,164,205,97,221,183,154,206,197,74,170,219,103,230,215,192,224,78,184,72,201,227,221,191,181,104,190,224,221,99,123,206,102,228,202,74,195,96,225,176,232,231,96,225,206,61,173,101,190,211,90,213,199,203,184,209,173,160,207,203,99,201,195,132,195,214,148,211,227,192,215,212,127,162,213,114,178,111,207,71,129,182,212,75,176,209,137,213,224,220,70,232,116,228,179,200,184,222,93,202,71,219,123,213,119,204,173,135,118,207,96,216,107,210,202,189,172,211,76,146,195,169,83,227,62,219,223,125,158,202,226,98,214,225,76,156,211,204,110,190,224,200,64,228,223,72,213,102,214,178,219,128,211,127,187,213,160,216,134,202,150,186,201,195,230,199,206,193,205,219,216,84,213,223,192,222,109,187,209,79,153,204,224,72,167,190,183,183,228,133,227,103,197,225,203,205,201,207,213,210,78,138,195,133,195,180,190,185,207,189,100,191,229,198,129,226,135,205,164,226,190,73,200,231,209,216,147,221,228,165,213,209,181,157,222,117,216,58,148,219,92,222,219,63,191,218,186,197,158,172,204,109,204,102,223,229,91,192,217,186,204,144,173,209,95,186,222,225,178,183,222,71,193,182,232,63,227,228,82,209,162,219,183,222,122,212,195,60,219,130,223,69,216,111,198,224,80,223,74,164,148,227,193,216,139,207,198,148,227,199,169,230,225,86,138,165,206,91,213,164,215,229,173,216,210,92,166,205,102,196,78,202,74,187,202,229,191,222,57,206,115,183,94,225,173,198,109,212,224,126,215,44,221,80,196,95,177,218,217,198,109,218,191,221,214,227,71,207,123,206,90,219,218,108,181,214,123,150,204,214,208,225,199,191,192,213,93,217,149,178,199,106,204,80,208,60,131,132,232,218,213,115,202,94,175,227,217,114,200,222,178,180,185,192,193,205,173,226,217,222,197,184,205,211,136,214,221,62,213,89,169,184,135,198,98,216,225,71,230,133,213,154,216,189,122,224,108,204,88,207,227,93,191,204,214,66,153,219,209,203,218,237,229,222,148,174,218,107,214,169,223,216,173,220,148,199,188,175,203,142,188,210,98,167,174,224,103,156,97,220,111,216,70,130,221,216,115,166,164,162,210,216,69,173,203,77,216,175,172,210,98,199,213,88,221,82,184,131,201,220,133,211,116,156,219,213,227,128,221,237,172,217,64,180,221,70,169,220,216,134,215,177,190,197,229,221,226,74,189,190,219,86,222,74,213,106,184,182,205,221,229,143,201,198,162,209,128,221,209,60,216,146,228,106,215,154,214,202,82,194,141,168,217,213,80,140,114,220,149,229,215,150,211,170,199,109,218,108,206,125,204,164,183,148,193,153,188,143,160,199,210,124,225,105,178,75,222,112,205,75,199,83,222,215,158,104,205,70,215,223,158,220,99,124,194,91,219,65,209,149,234,188,222,218,200,66,208,101,208,188,201,207,198,160,205,219,118,170,195,141,128,182,189,211,51,179,216,112,152,196,224,80,223,177,218,127,186,113,151,117,159,184,200,231,212,134,222,68,216,204,90,151,202,158,173,146,144,209,222,89,171,188,206,215,117,191,223,80,215,68,151,159,210,74,208,105,204,105,174,228,67,147,202,214,41,222,163,183,142,227,188,200,208,132,157,191,195,94,166,201,155,214,132,185,195,165,214,219,156,217,100,147,220,105,220,223,225,117,176,166,207,134,218,202,172,183,217,76,229,139,217,87,208,181,214,169,154,199,231,208,200,157,213,222,49,207,191,139,219,218,82,122,154,137,190,176,192,180,231,80,208,136,185,112,189,187,192,208,219,110,208,102,186,225,220,218,235,214,96,133,198,199,95,152,83,219,190,124,148,222,213,207,145,217,168,200,32,208,198,171,205,218,104,218,136,230,77,139,178,212,214,129,211,233,93,221,95,155,153,221,151,201,108,193,216,172,208,72,177,217,208,225,60,81,172,234,180,146,184,180,199,222,224,182,106,183,105,221,208,81,221,137,211,44,198,224,205,82,187,164,147,213,163,229,192,219,218,190,228,221,218,116,231,97,182,126,213,193,173,207,103,207,122,161,156,193,153,192,92,198,155,223,224,106,223,170,177,225,40,206,203,221,152,208,116,203,105,197,221,65,226,111,224,215,215,142,154,211,84,184,135,169,212,213,76,229,225,130,207,85,209,177,116,140,150,229,209,105,164,186,214,219,62,148,203,230,118,207,216,217,211,114,201,134,217,87,197,124,209,186,184,164,189,163,212,85,219,106,209,220,168,221,212,112,189,183,229,200,192,106,224,114,220,95,141,223,222,219,226,79,187,202,212,222,193,128,223,209,160,178,197,222,111,152,114,164,206,179,189,183,122,212,149,150,130,214,115,195,118,204,208,120,195,163,159,220,125,222,174,211,144,140,220,80,173,204,87,215,149,227,97,229,221,119,213,139,212,82,215,68,213,136,186,168,202,85,213,93,168,218,63,222,121,213,90,218,94,197,221,182,220,112,224,221,83,215,194,138,181,197,221,91,186,194,99,179,213,210,76,188,214,216,79,213,92,220,94,201,108,213,131,221,153,219,74,159,172,218,217,219,108,209,35,217,112,157,206,148,211,126,214,107,191,105,164,224,220,126,204,201,211,119,203,129,165,205,130,218,137,211,178,204,215,206,119,209,75,193,182,115,178,207,217,220,184,199,167,189,230,140,208,167,173,195,136,220,205,64,186,159,194,75,213,189,110,189,208,120,176,182,162,196,183,137,134,198,172,193,184,172,212,189,214,200,95,167,199,110,195,116,180,204,66,217,126,222,226,193,166,143,121,218,224,196,140,226,142,206,66,214,140,218,144,201,221,152,204,154,221,101,225,69,216,212,128,181,200,180,206,136,193,226,226,219,141,168,202,209,105,178,128,208,197,145,163,218,124,220,147,220,156,211,69,212,226,132,223,165,211,110,229,194,193,204,208,90,218,106,152,215,134,180,182,204,211,80,211,108,204,102,208,102,192,94,191,206,203,140,165,213,197,105,186,186,210,62,208,165,213,174,186,115,211,164,193,128,201,169,217,167,217,174,222,167,129,170,217,218,182,159,113,208,85,219,126,219,89,172,213,212,69,205,135,124,205,67,208,52,186,160,225,125,190,65,227,168,180,166,151,194,205,88,138,204,110,124,223,80,193,106,179,183,163,220,148,205,218,200,221,198,202,190,213,87,227,239,81,213,115,211,219,201,204,105,216,83,210,95,170,219,197,86,228,86,209,125,231,83,198,194,171,144,183,222,216,216,226,204,109,216,115,180,227,116,207,223,84,190,215,117,212,79,214,230,218,134,208,92,209,99,181,220,61,212,85,199,185,174,171,220,221,57,146,224,214,81,183,218,152,163,193,72,203,111,130,222,204,217,229,209,201,189,220,213,72,192,214,221,157,197,124,209,100,223,162,223,73,85,221,68,210,46,205,76,118,160,180,200,85,218,34,196,170,204,88,168,224,71,188,207,166,181,210,101,148,215,76,219,217,104,199,180,210,208,69,201,202,216,224,89,206,88,128,125,196,203,106,183,182,84,228,200,120,171,166,153,215,210,82,210,65,163,131,213,83,180,205,92,152,228,99,212,212,209,94,215,218,95,206,179,236,219,204,102,196,110,160,218,140,228,102,218,177,172,193,95,209,215,203,220,78,213,191,189,168,204,218,192,89,213,223,208,230,168,202,169,222,56,155,203,73,197,156,208,222,127,212,223,160,217,201,223,205,217,100,221,208,143,218,22,208,224,86,209,90,220,144,182,139,219,79,194,183,225,216,209,190,102,221,220,203,152,215,113,230,106,212,221,106,197,111,185,211,83,162,200,208,219,124,210,92,161,216,221,79,208,166,211,142,179,222,197,79,185,226,182,193,195,170,202,113,192,231,228,197,180,152,223,155,215,97,213,85,205,145,200,89,209,201,139,217,163,133,206,174,112,205,213,217,123,201,136,132,221,206,105,200,139,183,160,215,134,221,208,181,208,101,210,225,116,183,217,80,186,120,162,194,201,97,188,209,174,122,229,214,205,120,218,228,228,83,134,201,119,171,210,55,227,107,225,64,174,203,130,213,226,95,206,157,139,154,171,139,224,219,154,190,206,218,217,221,86,135,217,213,183,118,187,186,128,212,79,184,186,156,188,134,216,172,155,195,210,176,223,125,128,217,213,214,217,94,189,192,133,167,85,187,217,60,158,221,122,217,117,224,95,206,89,199,74,205,210,70,209,83,196,77,169,188,228,114,188,105,229,182,174,223,59,181,205,127,225,222,128,190,219,112,197,115,222,205,115,184,214,210,238,196,212,85,207,86,183,215,178,187,112,194,176,212,90,217,80,164,179,187,138,221,214,203,96,227,198,82,209,207,96,208,76,228,224,205,207,186,178,221,224,189,143,229,182,229,132,213,201,186,218,67,222,221,83,216,44,220,68,214,103,137,210,48,210,211,88,222,169,225,155,181,200,207,62,219,152,179,130,215,86,183,90,202,164,179,205,128,219,224,148,190,117,213,221,184,175,146,223,216,82,230,92,201,231,218,172,207,209,189,150,216,142,226,46,197,219,92,228,168,216,111,221,204,216,190,180,189,206,108,204,101,224,74,223,65,181,142,211,203,122,220,214,66,179,185,222,116,120,209,222,101,172,209,232,56,223,177,212,107,191,168,193,193,105,212,105,163,209,191,117,194,170,211,190,121,221,90,161,207,67,226,90,216,60,171,219,89,220,184,214,194,144,182,214,208,196,133,226,214,77,190,215,63,115,208,92,227,204,68,197,77,178,188,204,216,202,194,152,175,181,208,131,196,223,128,177,116,207,102,220,121,216,199,202,207,74,211,196,217,177,214,218,221,93,219,192,215,143,139,183,226,83,102,219,211,100,194,47,119,204,160,143,180,149,183,226,96,172,212,186,198,207,182,207,221,164,146,224,61,200,140,147,196,214,60,226,215,111,216,186,180,161,234,196,124,208,214,221,189,178,112,168,208,136,169,181,202,124,201,74,211,143,192,175,119,163,202,67,191,172,148,179,223,122,217,117,180,190,186,209,72,193,182,218,80,156,138,201,202,218,208,149,190,109,224,64,192,114,219,199,210,208,97,232,205,197,188,189,87,213,152,196,164,131,225,94,219,205,163,191,172,196,189,206,110,201,73,191,122,210,173,208,73,221,136,222,185,224,74,213,86,185,221,170,210,69,165,156,210,102,211,210,221,197,209,154,127,212,180,208,139,231,207,50,166,96,150,158,206,228,214,202,95,162,220,191,136,217,54,155,201,140,179,191,91,213,220,74,145,216,232,45,208,217,209,182,160,182,100,221,155,219,227,160,180,209,147,174,212,67,209,82,213,148,208,51,176,88,210,71,117,174,218,82,211,188,170,210,186,136,176,220,157,189,167,190,212,160,212,135,201,219,49,162,165,209,117,175,213,152,176,220,221,124,150,204,220,73,228,194,218,57,195,173,159,173,175,206,176,229,79,164,201,203,152,214,116,137,219,66,222,214,80,175,202,90,225,113,219,206,78,190,89,214,50,209,154,188,227,194,157,195,74,186,206,130,198,73,212,60,204,122,222,99,205,196,229,213,83,230,108,171,126,192,104,216,207,117,217,197,214,79,207,110,221,79,217,144,206,160,206,172,197,183,207,217,207,113,210,221,71,161,221,164,227,214,142,177,185,180,103,130,198,123,205,74,216,102,219,160,217,75,204,114,192,213,166,188,118,222,227,92,195,219,161,200,221,69,203,143,198,198,217,198,66,212,50,208,116,199,125,210,207,167,225,116,207,97,184,99,220,184,203,184,219,177,167,202,214,55,207,161,197,122,212,226,187,96,216,201,188,135,224,207,139,225,230,220,121,221,107,212,66,170,169,210,199,102,220,94,159,184,207,92,207,231,214,125,227,220,205,58,193,203,215,223,229,78,196,170,185,196,162,234,56,201,123,171,231,196,86,162,199,213,220,68,200,68,205,88,225,135,220,82,182,215,222,79,152,230,62,162,218,184,224,67,206,99,189,124,214,197,73,204,105,221,179,102,218,232,80,214,181,170,204,165,216,207,217,212,195,176,215,106,192,160,221,182,217,57,211,88,198,233,113,171,204,138,193,209,225,59,176,184,134,223,151,193,200,217,100,225,79,180,142,190,123,222,80,232,216,133,216,148,211,110,198,96,187,224,95,208,112,178,227,94,171,96,181,209,170,225,196,206,94,216,87,217,171,191,82,218,127,227,176,219,207,230,79,214,203,105,213,143,174,188,125,193,220,60,215,172,214,101,211,110,161,117,187,180,125,218,220,62,208,203,217,87,198,156,216,226,161,161,223,224,72,178,198,213,195,219,208,140,175,217,74,201,201,66,186,154,229,89,226,169,204,87,184,85,161,133,201,80,176,188,114,224,77,207,126,202,83,219,200,125,172,169,190,216,80,88,221,68,218,133,216,117,217,157,217,170,190,124,214,210,156,231,84,207,204,113,200,70,222,162,208,227,92,223,136,167,195,221,221,77,173,213,109,214,117,211,217,89,217,91,210,152,194,206,202,110,216,177,190,207,227,185,172,230,172,207,171,199,234,207,149,194,192,179,212,209,210,101,198,225,85,164,211,110,194,182,211,224,65,228,218,79,224,81,122,208,154,129,206,92,193,171,148,188,221,80,220,161,165,166,161,214,99,210,64,174,224,221,105,200,122,230,216,94,223,128,225,161,219,126,187,137,191,222,214,148,151,198,218,210,110,208,228,184,211,35,202,218,195,216,115,212,95,177,199,101,184,208,202,212,134,193,129,192,81,182,223,70,226,230,134,167,183,198,222,227,227,226,63,213,109,187,177,219,223,203,144,179,209,103,177,181,158,221,90,222,166,207,175,230,207,99,205,234,210,210,168,223,143,210,187,209,204,150,209,213,208,193,221,214,77,215,199,81,197,82,177,190,210,231,79,179,221,64,182,199,82,204,204,95,172,187,178,209,86,222,220,118,192,223,88,220,77,174,104,224,137,182,186,96,207,198,74,152,196,217,206,79,214,208,204,180,94,215,81,177,160,201,164,173,205,76,199,220,228,91,215,155,226,79,133,181,136,182,226,96,221,109,209,223,71,202,95,217,87,202,204,183,210,187,212,81,226,184,224,88,170,214,198,226,142,212,81,209,189,172,192,221,216,123,221,126,204,218,222,76,205,73,225,221,73,204,108,201,88,174,197,136,223,90,189,56,207,147,206,212,73,201,83,204,112,137,227,67,208,137,219,225,65,200,186,99,214,97,215,74,203,65,199,216,108,216,80,206,219,104,226,180,225,199,186,197,226,157,102,177,107,231,156,141,226,70,220,216,223,64,214,66,201,174,170,207,46,202,131,173,218,125,217,157,234,192,159,174,209,95,196,224,59,220,69,211,130,203,222,88,208,86,198,127,219,228,75,218,170,168,198,128,215,54,211,167,186,117,211,162,221,219,105,223,99,223,127,202,218,213,143,194,181,200,180,230,224,97,181,132,173,202,221,57,151,220,77,220,160,206,188,101,197,72,213,95,193,212,189,105,226,100,205,201,56,211,93,178,212,88,208,83,213,165,219,183,236,121,220,210,94,212,171,186,218,137,212,129,175,203,223,134,194,95,193,191,105,229,208,102,196,120,191,221,217,65,206,200,74,168,180,199,217,119,223,68,211,125,204,105,180,164,215,227,128,211,166,218,86,185,74,214,57,200,171,111,185,73,199,220,213,192,216,107,211,115,219,227,192,221,101,203,65,211,51,216,84,193,121,214,86,195,115,179,229,90,215,92,207,63,179,212,38,202,104,182,125,179,99,147,184,210,166,227,232,164,120,218,169,203,154,192,224,217,122,160,205,206,221,80,191,217,166,202,78,206,147,202,155,195,76,204,136,191,112,195,160,147,226,91,224,216,212,177,188,165,174,130,203,221,220,133,209,147,216,69,159,155,143,213,94,227,139,209,163,183,199,112,217,213,98,217,96,185,158,173,229,51,209,195,227,214,161,213,83,168,229,209,118,221,224,59,179,161,220,209,193,199,199,212,107]
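doclens.0.json lists the number of token embeddings per passage for chunk 0, so its length should equal num_passages and its sum should equal num_embeddings from 0.metadata.json; the three appended values (199, 212, 107) are the lengths of the three newly indexed passages. A sanity-check sketch:

    import json

    with open("doclens.0.json") as f:
        doclens = json.load(f)

    print(len(doclens))  # expected 3936, num_passages in 0.metadata.json
    print(sum(doclens))  # expected 676085, num_embeddings in 0.metadata.json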
ivf.pid.pt CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:0d939a7e7d8bcb004af28daca01092ae0ccf65ae2414477449087fed664b2c3c
+size 1777816
metadata.json CHANGED
@@ -37,7 +37,7 @@
 "checkpoint":"colbert-ir/colbertv2.0",
 "triples":"/future/u/okhattab/root/unit/experiments/2021.10/downstream.distillation.round2.2_score/round2.nway6.cosine.ib/examples.64.json",
 "collection":[
-"list with
+"list with 3936 elements starting with...",
 [
 "Deep neural networks have demonstrated remarkable performance in supervised learning tasks but require large amounts of labeled data. Self-supervised learning offers an alternative paradigm, enabling the model to learn from data without explicit labels. Information theory has been instrumental in understanding and optimizing deep neural networks. Specifically, the information bottleneck principle has been applied to optimize the trade-off between compression and relevant information preservation in supervised settings. However, the optimal information objective in self-supervised learning remains unclear. In this paper, we review various approaches to self-supervised learning from an information-theoretic standpoint and present a unified framework that formalizes the self-supervised information-theoretic learning problem. We integrate existing research into a coherent framework, examine recent self-supervised methods, and identify research opportunities and challenges. Moreover, we discuss empirical measurement of information-theoretic quantities and their estimators. This paper offers a comprehensive review of the intersection between information theory, self-supervised learning, and deep neural networks.",
 "Pre-trained large language models (LLMs) capture procedural knowledge about the world. Recent work has leveraged LLM's ability to generate abstract plans to simplify challenging control tasks, either by action scoring, or action modeling (fine-tuning). However, the transformer architecture inherits several constraints that make it difficult for the LLM to directly serve as the agent: e.g. limited input lengths, fine-tuning inefficiency, bias from pre-training, and incompatibility with non-text environments. To maintain compatibility with a low-level trainable actor, we propose to instead use the knowledge in LLMs to simplify the control problem, rather than solving it. We propose the Plan, Eliminate, and Track (PET) framework. The Plan module translates a task description into a list of high-level sub-tasks. The Eliminate module masks out irrelevant objects and receptacles from the observation for the current sub-task. Finally, the Track module determines whether the agent has accomplished each sub-task. On the AlfWorld instruction following benchmark, the PET framework leads to a significant 15% improvement over SOTA for generalization to human goal specifications.",
@@ -50,7 +50,7 @@
 "root":".ragatouille/",
 "experiment":"colbert",
 "index_root":null,
-"name":"2024-06/21/
+"name":"2024-06/21/11.54.10",
 "rank":0,
 "nranks":1,
 "amp":true,
@@ -59,8 +59,8 @@
 },
 "num_chunks":1,
 "num_partitions":8192,
-"num_embeddings":
-"avg_doclen":171.
+"num_embeddings":676085,
+"avg_doclen":171.7695630081,
 "RAGatouille":{
 "index_config":{
 "index_type":"PLAID",
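The num_partitions value is derived from the collection size rather than set by hand. If I recall ColBERT's indexer correctly, it uses 2**floor(log2(16 * sqrt(num_embeddings))), which reproduces 8192 for this index; the formula is my recollection, not something stated in this diff:

    import math

    num_embeddings = 676085  # from metadata.json above
    num_partitions = 2 ** math.floor(math.log2(16 * math.sqrt(num_embeddings)))
    print(num_partitions)  # 8192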
pid_docid_map.json CHANGED
@@ -3931,5 +3931,8 @@
 "3929":"2406.14347",
 "3930":"2406.11410",
 "3931":"2406.14539",
-"3932":"2406.14319"
+"3932":"2406.14319",
+"3933":"2406.13542",
+"3934":"2406.13621",
+"3935":"2406.13621"
 }
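pid_docid_map.json maps each passage id back to its arXiv id. Pids 3934 and 3935 both point at 2406.13621, matching the two collection.json entries above in which that paper's abstract is split across two passages. Inverting the map groups passages per document; a minimal sketch:

    import json
    from collections import defaultdict

    with open("pid_docid_map.json") as f:
        pid_docid = json.load(f)

    docid_to_pids = defaultdict(list)
    for pid, docid in pid_docid.items():
        docid_to_pids[docid].append(int(pid))

    print(docid_to_pids["2406.13621"])  # [3934, 3935]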
plan.json CHANGED
@@ -37,7 +37,7 @@
 "checkpoint": "colbert-ir\/colbertv2.0",
 "triples": "\/future\/u\/okhattab\/root\/unit\/experiments\/2021.10\/downstream.distillation.round2.2_score\/round2.nway6.cosine.ib\/examples.64.json",
 "collection": [
-"list with
+"list with 3936 elements starting with...",
 [
 "Deep neural networks have demonstrated remarkable performance in supervised learning tasks but require large amounts of labeled data. Self-supervised learning offers an alternative paradigm, enabling the model to learn from data without explicit labels. Information theory has been instrumental in understanding and optimizing deep neural networks. Specifically, the information bottleneck principle has been applied to optimize the trade-off between compression and relevant information preservation in supervised settings. However, the optimal information objective in self-supervised learning remains unclear. In this paper, we review various approaches to self-supervised learning from an information-theoretic standpoint and present a unified framework that formalizes the self-supervised information-theoretic learning problem. We integrate existing research into a coherent framework, examine recent self-supervised methods, and identify research opportunities and challenges. Moreover, we discuss empirical measurement of information-theoretic quantities and their estimators. This paper offers a comprehensive review of the intersection between information theory, self-supervised learning, and deep neural networks.",
 "Pre-trained large language models (LLMs) capture procedural knowledge about the world. Recent work has leveraged LLM's ability to generate abstract plans to simplify challenging control tasks, either by action scoring, or action modeling (fine-tuning). However, the transformer architecture inherits several constraints that make it difficult for the LLM to directly serve as the agent: e.g. limited input lengths, fine-tuning inefficiency, bias from pre-training, and incompatibility with non-text environments. To maintain compatibility with a low-level trainable actor, we propose to instead use the knowledge in LLMs to simplify the control problem, rather than solving it. We propose the Plan, Eliminate, and Track (PET) framework. The Plan module translates a task description into a list of high-level sub-tasks. The Eliminate module masks out irrelevant objects and receptacles from the observation for the current sub-task. Finally, the Track module determines whether the agent has accomplished each sub-task. On the AlfWorld instruction following benchmark, the PET framework leads to a significant 15% improvement over SOTA for generalization to human goal specifications.",
@@ -50,7 +50,7 @@
 "root": ".ragatouille\/",
 "experiment": "colbert",
 "index_root": null,
-"name": "2024-06\/21\/
+"name": "2024-06\/21\/11.54.10",
 "rank": 0,
 "nranks": 1,
 "amp": true,
@@ -59,6 +59,6 @@
 },
 "num_chunks": 1,
 "num_partitions": 8192,
-"num_embeddings_est":
-"avg_doclen_est": 171.
+"num_embeddings_est": 676084.9951171875,
+"avg_doclen_est": 171.76956176757812
 }
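With these files in place, the index can be loaded for search without re-embedding anything. A sketch using RAGatouille, which produced the .ragatouille/ layout recorded above; the index path is a placeholder, and the exact result keys may differ by version:

    from ragatouille import RAGPretrainedModel

    # Load the on-disk PLAID index (placeholder path).
    RAG = RAGPretrainedModel.from_index(".ragatouille/colbert/indexes/index")
    results = RAG.search("low-latency LLM inference with incomplete prompts", k=3)
    for r in results:
        print(r["score"], r["content"][:80])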