Update modeling_mpt.py

#11
No description provided.

Added inputs_embeds parameter.

daking (Mosaic ML, Inc. org)

Hi, `inputs_embeds` is not implemented inside the modeling code; it is not sufficient to just add the parameter. Could you please explain why you need this parameter?
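
For reference, a complete implementation would have to route the supplied embeddings through the model body, not just accept the argument. Below is a minimal, runnable sketch of the dispatch pattern; `TinyModel` is a hypothetical stand-in for `MPTModel`, assuming the usual HF convention of a `wte` token-embedding module:

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """Hypothetical stand-in for MPTModel, reduced to the embedding step."""

    def __init__(self, vocab_size=16, d_model=8):
        super().__init__()
        self.wte = nn.Embedding(vocab_size, d_model)  # token embeddings

    def forward(self, input_ids=None, inputs_embeds=None):
        # Exactly one of input_ids / inputs_embeds must be given.
        # Merely accepting the parameter is not enough; the body
        # has to use it in place of the embedding lookup.
        if (input_ids is None) == (inputs_embeds is None):
            raise ValueError('Specify exactly one of input_ids or inputs_embeds.')
        tok_emb = self.wte(input_ids) if inputs_embeds is None else inputs_embeds
        return tok_emb  # the transformer blocks would consume tok_emb from here

model = TinyModel()
print(model(input_ids=torch.tensor([[1, 2, 3]])).shape)  # torch.Size([1, 3, 8])
print(model(inputs_embeds=torch.randn(1, 3, 8)).shape)   # torch.Size([1, 3, 8])
```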

@daking So that the model can be trained with LoRA. PEFT is not able to run without it.

Also, I have a question that is not related to this issue: is mosaicml/mpt-7b the same as togethercomputer/RedPajama-INCITE-Base-7B-v0.1, or what are the differences between the two? Edit: mpt-7b is better, but I'm not sure how to train it with LoRA since that isn't supported. It would be good to add support; the setup I'm attempting is sketched below.
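
A rough sketch of the intended LoRA setup, for context. The `target_modules` value is an assumption: MPT fuses the query/key/value projections into a single `Wqkv` module, so PEFT's default per-architecture target names do not apply and the modules have to be listed explicitly:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b',
    trust_remote_code=True,   # MPT ships custom modeling code
    torch_dtype=torch.bfloat16,
)

# target_modules=['Wqkv'] is an assumption based on MPT's fused
# attention projection; adjust if the module names differ.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=['Wqkv'],
    task_type='CAUSAL_LM',
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

As noted above, at the time of this thread this still fails because the MPT modeling code does not implement `inputs_embeds`.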

daking (Mosaic ML, Inc. org)

They are completely different models: mosaicml/mpt-7b is trained and released by MosaicML, and togethercomputer/RedPajama-INCITE-Base-7B-v0.1 is trained and released by togethercomputer. Could you please move the PEFT/LoRA discussion, and any issues you have, to https://github.com/mosaicml/llm-foundry/issues/64?

daking changed pull request status to closed