---
pipeline_tag: text-generation
datasets:
- cerebras/SlimPajama-627B
language:
- en
---
This repo contains the trained 1.3-billion-parameter LLaMA-2-architecture model checkpoints for the work [Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining](https://arxiv.org/pdf/2410.08102).