how can I run this model on a GPU with only 10GB memory?


The original mpt-7b model needs around 32 GB of memory (or a little less) to load in full precision, so it will not fit on a 10 GB GPU as-is.

You should use a low-precision version; some developers have quantized or further fine-tuned the model for low-compute scenarios (see the sketch below).
You may also look for variants such as mpt-7b-128k, mpt-7b-32k, etc.
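For scale: 7B parameters take roughly 28 GB in fp32, ~14 GB in fp16/bf16, ~7 GB in 8-bit, and ~3.5 GB in 4-bit, so 8-bit or 4-bit quantization is what brings the weights under 10 GB. Here is a minimal sketch of loading mpt-7b in 4-bit with Hugging Face transformers and bitsandbytes; the package setup, generation settings, and prompt are my assumptions, not something confirmed in this thread.

```python
# Minimal sketch: load mpt-7b with 4-bit quantization so the weights fit on a ~10 GB GPU.
# Assumes transformers, accelerate, and bitsandbytes are installed and a CUDA GPU is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "mosaicml/mpt-7b"  # or a lower-precision / fine-tuned variant

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4-bit at load time
    bnb_4bit_compute_dtype=torch.bfloat16,  # do the matmuls in bf16
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=quant_config,
    device_map="auto",       # let accelerate place the layers on the GPU
    trust_remote_code=True,  # mpt-7b ships custom modeling code on the Hub
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With 4-bit weights plus the activation and KV-cache overhead, a 10 GB card should still have headroom for short to moderate context lengths; for longer contexts, 8-bit or bf16 will likely not fit.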
