
These SAEs were trained on the output of each MLP layer in EleutherAI/pythia-160m, using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
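
As a rough illustration of the shapes involved, the sketch below assumes the standard pythia-160m hidden size of 768, so each SAE maps a 768-dimensional MLP output to 32,768 latents and back. The plain ReLU autoencoder shown here is an assumption for illustration only; the released checkpoints may use a different activation function and parameterization.

```python
# Minimal sketch of an SAE with the dimensions described above.
# Illustrative only -- the actual architecture and checkpoint layout may differ.
import torch
import torch.nn as nn

D_MODEL = 768       # hidden size of EleutherAI/pythia-160m
N_LATENTS = 32_768  # number of SAE latents reported above


class SparseAutoencoder(nn.Module):
    def __init__(self, d_model: int = D_MODEL, n_latents: int = N_LATENTS):
        super().__init__()
        self.encoder = nn.Linear(d_model, n_latents)
        self.decoder = nn.Linear(n_latents, d_model)

    def forward(self, mlp_out: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # Encode MLP activations into a sparse latent code, then reconstruct.
        latents = torch.relu(self.encoder(mlp_out))
        recon = self.decoder(latents)
        return latents, recon


# Example: run a batch of dummy MLP outputs through the SAE.
sae = SparseAutoencoder()
x = torch.randn(4, D_MODEL)
latents, recon = sae(x)
print(latents.shape, recon.shape)  # torch.Size([4, 32768]) torch.Size([4, 768])
```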
