---
license: apache-2.0
language:
  - en
pipeline_tag: text-classification
---

# Monarch Mixer-BERT

The 80M checkpoint for M2-BERT-base from the paper [Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture](https://arxiv.org/abs/2310.12109). This model has been pretrained with a sequence length of 8192.

This model was trained by Dan Fu, Jon Saad-Falcon, and Simran Arora.

Check out [our GitHub](https://github.com/HazyResearch/m2) for instructions on how to download and fine-tune it!
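
For reference, here is a minimal loading sketch in Python. It is not taken from this card: the Hugging Face repo id, the masked-LM head, and the tokenizer choice are all assumptions, so treat the GitHub repo above as the authoritative source for download and fine-tuning.

```python
# Minimal loading sketch. Assumptions (not stated in this card): the repo id
# "danfu09/m2-bert-80M-8k", the masked-LM head, and the BERT tokenizer.
from transformers import AutoModelForMaskedLM, AutoTokenizer

# M2-BERT ships custom modeling code, so trust_remote_code is required.
model = AutoModelForMaskedLM.from_pretrained(
    "danfu09/m2-bert-80M-8k",  # assumed repo id
    trust_remote_code=True,
)

# Assumption: the checkpoint reuses the standard BERT uncased tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

inputs = tokenizer("Monarch Mixer scales to [MASK] sequences.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, seq_len, vocab_size)
```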