---
license: mit
language:
  - en
tags:
  - neuroscience
  - MEG
  - large-brainwave-models
  - time-series
---

# MEG-GPT

MEG-GPT: A transformer-based foundation model for magnetoencephalography data

MEG-GPT is a transformer-based foundation model for human magnetoencephalography (MEG) data. As a Large Brainwave Model (LBM), it is designed to capture the spatiotemporal structure of large-scale brain dynamics. MEG-GPT is built on a decoder-only transformer trained in an autoregressive manner.
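
As a loose illustration of what training "in an autoregressive manner" involves, the sketch below (plain NumPy, not the released implementation) splits a tokenized MEG sequence into next-token prediction pairs and builds the causal mask that stops each position from attending to future time steps.

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    """Lower-triangular mask: position t may attend only to positions <= t."""
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

# Hypothetical tokenized MEG segment (integer token IDs per time step).
tokens = np.array([12, 7, 33, 33, 5, 18])

# Autoregressive targets: predict token t+1 from tokens 0..t.
inputs, targets = tokens[:-1], tokens[1:]
mask = causal_mask(len(inputs))

print(inputs)            # [12  7 33 33  5]
print(targets)           # [ 7 33 33  5 18]
print(mask.astype(int))  # lower-triangular 5x5 matrix of ones
```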

GitHub | Model Download | arXiv Paper

## 📝 Usage

The easiest way to load the model and run inference is via the osl-foundation Python package. Installation and setup instructions are available in our GitHub repository. Once the package is installed, you can follow the steps below.

### Step 1: Clone the repository

From your command line:

```bash
git clone https://huggingface.co/OHBA-analysis/MEG-GPT models
cd models
git lfs install --local
git lfs pull
```

### Step 2: Load the models

In a Python script:

```python
from osl_foundation import load_model

tokenizer = load_model("tokenizer")
meg_gpt = load_model("meg-gpt", checkpoint="latest")
```
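
Once both objects are loaded, a typical next step is to tokenize a preprocessed MEG recording and pass the token sequence to the model. The snippet below is only a sketch: the method names (`tokenizer.encode`, `meg_gpt.generate`) and the input shape are assumptions for illustration, not the documented osl-foundation API, so check the GitHub repository for the actual interface.

```python
import numpy as np
from osl_foundation import load_model

tokenizer = load_model("tokenizer")
meg_gpt = load_model("meg-gpt", checkpoint="latest")

# Hypothetical usage: the method names and the (batch, channels, samples)
# shape below are illustrative assumptions, not the documented API.
meg_segment = np.random.randn(1, 64, 1200)    # placeholder MEG data

tokens = tokenizer.encode(meg_segment)        # assumed tokenization call
continuation = meg_gpt.generate(tokens)       # assumed autoregressive roll-out
```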

## ⚙️ System Requirements

### Hardware Requirements

- **GPU**: NVIDIA GPU with CUDA support

  Note: For model training and experiments, we used two NVIDIA A100 or V100 GPUs.

### Software Requirements

- **Python**: Python 3.10
- **CUDA**: CUDA 11.7.0 (with TensorFlow 2.11.0)
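
A quick way to confirm that an environment matches these requirements is a check along the following lines (assuming TensorFlow 2.11.0 is installed; the versions printed will depend on your set-up):

```python
import sys

import tensorflow as tf

# Sanity-check the Python/TensorFlow/CUDA stack used for MEG-GPT.
print("Python:", sys.version.split()[0])                 # expect 3.10.x
print("TensorFlow:", tf.__version__)                     # expect 2.11.0
print("GPUs:", tf.config.list_physical_devices("GPU"))   # expect at least one CUDA device
```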

## 📖 Citation

```bibtex
@article{huang2025,
  title={MEG-GPT: A transformer-based foundation model for magnetoencephalography data},
  author={Rukuang Huang and SungJun Cho and Chetan Gohil and Oiwi Parker Jones and Mark Woolrich},
  year={2025},
  url={https://arxiv.org/abs/2510.18080},
}
```