
These sparse autoencoders (SAEs) were trained on the output of each MLP in EleutherAI/pythia-160m, using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
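As a minimal sketch of where these SAEs plug in, the snippet below loads pythia-160m with the standard `transformers` API and captures the per-layer MLP outputs with a PyTorch forward hook; these are the activations the SAEs were trained to reconstruct. The example prompt is arbitrary, and loading the SAE weights themselves (via EleutherAI's SAE tooling) is not shown here.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-160m")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")

mlp_outputs = {}

def make_hook(layer_idx):
    # Record the MLP output tensor for this layer on each forward pass.
    def hook(module, inputs, output):
        mlp_outputs[layer_idx] = output.detach()
    return hook

# Pythia is a GPT-NeoX model, so MLPs live at gpt_neox.layers[i].mlp.
for i, layer in enumerate(model.gpt_neox.layers):
    layer.mlp.register_forward_hook(make_hook(i))

inputs = tokenizer("The quick brown fox", return_tensors="pt")
with torch.no_grad():
    model(**inputs)

# mlp_outputs[i] now holds the layer-i MLP output that the layer-i SAE
# encodes into 32,768 latents; shape (batch, seq_len, hidden_size=768).
```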

