---
language: en
library_name: mlsae
license: mit
tags:
  - arxiv:2409.04185
  - model_hub_mixin
  - pytorch_model_hub_mixin
---

# Model Card for tim-lawson/mlsae-Llama-3.2-3B-x64-k32

A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activation vectors from meta-llama/Llama-3.2-3B with an expansion factor of R = 64 and sparsity k = 32, over 1 billion tokens from monology/pile-uncopyrighted.

This repository contains only the PyTorch TopKSAE module; it does not include the underlying transformer, which must be loaded separately.
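For illustration, the architecture described above (an encoder with expansion factor R, a top-k sparsity constraint, and a linear decoder) can be sketched in plain PyTorch. This is a minimal sketch of a generic top-k SAE, not the mlsae library's implementation; the class name `TopKSAESketch` and the toy dimensions are made up here, and the released model's actual dimensions (d_model = 3072 for Llama-3.2-3B, R = 64, k = 32) are noted in a comment.

```python
import torch
import torch.nn as nn


class TopKSAESketch(nn.Module):
    """Minimal top-k sparse autoencoder sketch (not the mlsae implementation)."""

    def __init__(self, d_model: int, expansion: int, k: int):
        super().__init__()
        self.k = k
        d_latent = d_model * expansion  # expansion factor R widens the latent space
        self.encoder = nn.Linear(d_model, d_latent)
        self.decoder = nn.Linear(d_latent, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Encode, then keep only the k largest latent pre-activations per token.
        z = self.encoder(x)
        topk = torch.topk(z, self.k, dim=-1)
        sparse = torch.zeros_like(z).scatter_(-1, topk.indices, topk.values)
        # Reconstruct the residual-stream activation from the k active latents.
        return self.decoder(sparse)


# Toy dimensions for illustration; the released model uses
# d_model = 3072 (Llama-3.2-3B), expansion R = 64, sparsity k = 32.
sae = TopKSAESketch(d_model=16, expansion=4, k=4)
x = torch.randn(2, 16)
recon = sae(x)
assert recon.shape == x.shape
```

In practice you would load the released weights via the `pytorch_model_hub_mixin` tag's `from_pretrained` mechanism rather than initialising a module from scratch.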

## Model Sources

## Citation

BibTeX:

@misc{lawson_residual_2024,
  title         = {Residual {{Stream Analysis}} with {{Multi-Layer SAEs}}},
  author        = {Lawson, Tim and Farnik, Lucy and Houghton, Conor and Aitchison, Laurence},
  year          = {2024},
  month         = oct,
  number        = {arXiv:2409.04185},
  eprint        = {2409.04185},
  primaryclass  = {cs},
  publisher     = {arXiv},
  doi           = {10.48550/arXiv.2409.04185},
  urldate       = {2024-10-08},
  archiveprefix = {arXiv}
}