tim-lawson committed
Commit 16a53be
1 Parent(s): 2996474

Push model using huggingface_hub.

Files changed (1)
  1. README.md +41 -3
README.md CHANGED
@@ -1,9 +1,47 @@
 ---
+language: en
+library_name: mlsae
+license: mit
 tags:
+- arxiv:2409.04185
 - model_hub_mixin
 - pytorch_model_hub_mixin
 ---
 
-This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
-- Library: [More Information Needed]
-- Docs: [More Information Needed]
+# Model Card for tim-lawson/sae-pythia-70m-deduped-x64-k32-lens-layers-3
+
+A Multi-Layer Sparse Autoencoder (MLSAE) trained on the residual stream activation
+vectors from [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) with an
+expansion factor of R = 64 and sparsity k = 32, over 1 billion
+tokens from [monology/pile-uncopyrighted](https://huggingface.co/datasets/monology/pile-uncopyrighted).
+
+
+This model is a PyTorch TopKSAE module, which does not include the underlying
+transformer.
+
+
+### Model Sources
+
+- **Repository:** <https://github.com/tim-lawson/mlsae>
+- **Paper:** <https://arxiv.org/abs/2409.04185>
+- **Weights & Biases:** <https://wandb.ai/timlawson-/mlsae>
+
+## Citation
+
+**BibTeX:**
+
+```bibtex
+@misc{lawson_residual_2024,
+title = {Residual {{Stream Analysis}} with {{Multi-Layer SAEs}}},
+author = {Lawson, Tim and Farnik, Lucy and Houghton, Conor and Aitchison, Laurence},
+year = {2024},
+month = oct,
+number = {arXiv:2409.04185},
+eprint = {2409.04185},
+primaryclass = {cs},
+publisher = {arXiv},
+doi = {10.48550/arXiv.2409.04185},
+urldate = {2024-10-08},
+archiveprefix = {arXiv}
+}
+```
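For reference, a minimal sketch of loading this checkpoint via the PyTorchModelHubMixin integration mentioned in the diff. The `from_pretrained` classmethod is provided by `huggingface_hub`; the `TopKSAE` class name comes from the model card, but the import path shown here is an assumption about how the `mlsae` package is laid out.

```python
# Sketch only: assumes the mlsae package exposes a TopKSAE class that inherits
# from huggingface_hub.PyTorchModelHubMixin; the import path is hypothetical.
from mlsae.model import TopKSAE  # hypothetical import path

# from_pretrained is provided by PyTorchModelHubMixin and downloads the weights
# from the Hub repository named in the model card.
sae = TopKSAE.from_pretrained(
    "tim-lawson/sae-pythia-70m-deduped-x64-k32-lens-layers-3"
)
sae.eval()
```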
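To make the hyperparameters in the card concrete: with an expansion factor of R = 64 over the residual stream of pythia-70m-deduped (width 512, to my knowledge), the autoencoder has 64 × 512 = 32768 latents, of which only the k = 32 largest pre-activations are kept per token. The snippet below is a generic top-k activation in plain PyTorch, not the repository's implementation.

```python
import torch

d_model = 512            # residual stream width of pythia-70m-deduped (assumed)
R, k = 64, 32            # expansion factor and sparsity from the model card
n_latents = R * d_model  # 32,768 latent dimensions

# One common TopK-SAE formulation: keep the k largest pre-activations per
# token, zero out the rest, and apply ReLU to the surviving values.
pre_acts = torch.randn(4, n_latents)                # (tokens, latents)
values, indices = torch.topk(pre_acts, k, dim=-1)
latents = torch.zeros_like(pre_acts).scatter_(-1, indices, values.relu())
```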