---
license: apache-2.0
pipeline_tag: text-generation
library_name: grok
tags:
- grok-3
---

# Grok-3

This repository contains the weights of the Grok-3 open-weights model. You can find the code in the [GitHub Repository](https://github.com/xai-org/grok-3/tree/main).

# Download instructions

Clone the repo and download the `int8` checkpoint to the `checkpoints` directory by executing these commands in the repo root directory:

```shell
git clone https://github.com/xai-org/grok-3.git && cd grok-3
pip install 'huggingface_hub[hf_transfer]'
huggingface-cli download xai-org/grok-1 --repo-type model --include 'ckpt-0/*' --local-dir checkpoints --local-dir-use-symlinks False
```

Then, you can run:

```shell
pip install -r requirements.txt
python run.py
```

You should see output from the language model. Due to the large size of the model (314B parameters), a multi-GPU machine is required to run the example code.

p.s. we're hiring: https://x.ai/careers
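If you prefer to script the checkpoint download from Python rather than the CLI, the `huggingface_hub` library exposes the same functionality via `snapshot_download`. This is a minimal sketch, assuming `huggingface_hub` is installed; the `fetch_checkpoint` helper is a name introduced here for illustration, not part of this repository.

```python
# Hypothetical Python alternative to the huggingface-cli download command above.
from huggingface_hub import snapshot_download

REPO_ID = "xai-org/grok-1"  # weights repo targeted by the CLI command above
PATTERN = "ckpt-0/*"        # restrict the download to the int8 checkpoint shards


def fetch_checkpoint(local_dir: str = "checkpoints") -> str:
    """Download the ckpt-0 files into `local_dir` and return the local path."""
    return snapshot_download(
        repo_id=REPO_ID,
        repo_type="model",
        allow_patterns=[PATTERN],
        local_dir=local_dir,
    )


if __name__ == "__main__":
    print(fetch_checkpoint())
```

Like the CLI route, this transfers several hundred gigabytes, so run it on a machine with sufficient disk space.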