This model is a simple extension of [mosaicml/mpt-30b](https://huggingface.co/mosaicml/mpt-30b). The ALiBi positional bias is first manually interpolated by 2x, then extrapolated by a further 2x, giving 4x the original context length in total.
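
For readers unfamiliar with ALiBi, the sketch below is an illustration of what "2x interpolation" can mean numerically, not the exact code used to build this repo: halving each head's slope compresses the positional penalty so a 2x-longer sequence spans the same bias range the model saw during training, while the remaining 2x relies on ALiBi's native ability to extrapolate to unseen lengths. The helper name `build_alibi_bias` and the slope formula follow the standard ALiBi recipe; the head count and sequence lengths are illustrative assumptions.

```python
import torch

def build_alibi_bias(n_heads: int, seq_len: int, interpolation_factor: float = 1.0) -> torch.Tensor:
    """Return an additive attention bias of shape (n_heads, 1, seq_len).

    interpolation_factor=2.0 halves every head's slope, which compresses the
    positional penalty so a 2x-longer context covers the same bias range the
    model saw during training.
    """
    # Standard ALiBi slopes: a geometric sequence, one slope per attention head.
    slopes = torch.tensor([2.0 ** (-8.0 * (i + 1) / n_heads) for i in range(n_heads)])
    slopes = slopes / interpolation_factor  # 2x interpolation == halve the slopes
    # Relative distances 0, 1, ..., seq_len-1 from the most recent query position.
    distances = torch.arange(seq_len, dtype=torch.float32).view(1, 1, seq_len)
    return -distances * slopes.view(n_heads, 1, 1)

# Example (illustrative numbers): starting from an 8192-token training context,
# 2x interpolation plus 2x extrapolation yields a 32768-token bias.
bias = build_alibi_bias(n_heads=64, seq_len=4 * 8192, interpolation_factor=2.0)
print(bias.shape)  # torch.Size([64, 1, 32768])
```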
