---
base_model: EleutherAI/pythia-160m-deduped
language: en
library_name: mlsae
license: mit
tags:
  - arxiv:2409.04185
  - model_hub_mixin
  - pytorch_model_hub_mixin
---

# Model Card for tim-lawson/mlsae-pythia-160m-deduped-x8-k32-tfm

A Multi-Layer Sparse Autoencoder (MLSAE) trained on residual-stream activations from EleutherAI/pythia-160m-deduped with expansion factor R = 8 and sparsity k = 32, over 1 billion tokens from monology/pile-uncopyrighted.

This checkpoint is a PyTorch Lightning `MLSAETransformer` module, which includes the underlying transformer.
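The hyperparameters above fix the autoencoder width: Pythia-160m has a hidden size of 768, so R = 8 gives 6144 latents, of which k = 32 are active per token. A minimal loading sketch follows; the import path `mlsae.model.MLSAETransformer` is an assumption, while `from_pretrained` comes from the PyTorch model hub mixin listed in the tags.

```python
import os

# Hyperparameters from this card; pythia-160m has hidden size 768.
d_model = 768
expansion_factor = 8   # R
k = 32                 # active latents per token
n_latents = expansion_factor * d_model  # 8 * 768 = 6144 SAE latents

repo_id = "tim-lawson/mlsae-pythia-160m-deduped-x8-k32-tfm"

# Guarded so the weight download only runs when explicitly requested;
# the import path below is an assumption, not confirmed by this card.
if os.environ.get("LOAD_MLSAE"):
    from mlsae.model import MLSAETransformer  # hypothetical import path
    model = MLSAETransformer.from_pretrained(repo_id)
```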

## Model Sources

- Paper: https://arxiv.org/abs/2409.04185

## Citation

BibTeX:

@misc{lawson_residual_2024,
  title         = {Residual {{Stream Analysis}} with {{Multi-Layer SAEs}}},
  author        = {Lawson, Tim and Farnik, Lucy and Houghton, Conor and Aitchison, Laurence},
  year          = {2024},
  month         = oct,
  number        = {arXiv:2409.04185},
  eprint        = {2409.04185},
  primaryclass  = {cs},
  publisher     = {arXiv},
  doi           = {10.48550/arXiv.2409.04185},
  urldate       = {2024-10-08},
  archiveprefix = {arXiv}
}