Model Card for Extended-Mind-MPT-7b

Extended Mind MPT-7b, as described in Supersizing Transformers.

Model Description

This model implements active externalism for MosaicML's MPT-7b model. The model weights have not been edited; original architecture and code are by MosaicML.

For more details on active externalism, check out our blog!
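
Because the repo ships custom modeling code, it must be loaded with `trust_remote_code=True`. Below is a minimal usage sketch; the repo id and the `external_memories` keyword are assumptions for illustration and may not match the actual interface.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id, shown for illustration only.
repo_id = "normalcomputing/extended-mind-mpt-7b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Text the model should treat as external memory, tokenized ahead of time.
memory_text = "Alice's favorite color is ultramarine."
memory_ids = tokenizer(memory_text, return_tensors="pt").input_ids

# `trust_remote_code=True` is required for the custom architecture code.
# Passing memories via `external_memories` is an assumed interface.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    external_memories=memory_ids,
    trust_remote_code=True,
)

prompt = "What is Alice's favorite color?"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```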

Limitations

This model is part of ongoing research at Normal Computing.
