
Error in Triton implementation

#9
by narenzen - opened

Using the configuration below, I loaded the instruct model:

import torch
import transformers

config = transformers.AutoConfig.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  trust_remote_code=True
)
# switch from the default attention implementation to the Triton kernels
config.attn_config['attn_impl'] = 'triton'

model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  config=config,
  torch_dtype=torch.bfloat16,
  trust_remote_code=True
)
model.to(device='cuda:0')

But I got this error:
TypeError: dot() got an unexpected keyword argument 'trans_b'

Mosaic ML, Inc. org

You likely have an incompatible version of one of the GPU dependencies. Please try the versions pinned here: https://github.com/mosaicml/llm-foundry/blob/5fe01bcceb146d2a64d3b595c243d55fa7af9c70/setup.py#L74-L77
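
As a quick sanity check, here is a small sketch (not from the thread itself) that prints the installed versions of the relevant packages so they can be compared against those pins:

# Sketch: print installed versions of the packages pinned in llm-foundry's
# setup.py; compare the output against the versions at the link above.
from importlib.metadata import version, PackageNotFoundError

for pkg in ('torch', 'transformers', 'triton', 'flash-attn'):
    try:
        print(f'{pkg}=={version(pkg)}')
    except PackageNotFoundError:
        print(f'{pkg}: not installed')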

Mosaic ML, Inc. org
edited Jun 3, 2023

Closing as stale.

We've added a requirements.txt file as of this PR: https://huggingface.co/mosaicml/mpt-7b-instruct/discussions/41
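
If matching the pinned GPU dependencies is not an option, one possible workaround (my suggestion, not something proposed in this thread) is to fall back to the default 'torch' attention implementation, which does not use the Triton kernels at all, at some cost in speed:

import torch
import transformers

config = transformers.AutoConfig.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  trust_remote_code=True
)
# 'torch' is the default attn_impl; it does not depend on Triton, so the
# version mismatch behind this error cannot occur (slower than 'triton').
config.attn_config['attn_impl'] = 'torch'

model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b-instruct',
  config=config,
  torch_dtype=torch.bfloat16,
  trust_remote_code=True
)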

abhi-mosaic changed discussion status to closed
