Error/Bug attn_output = F.scaled_dot_product_attention( AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'

#12
by WajihUllahBaig - opened

I am on a DGX-A100 machine with pytorch=1.12.1, and I get the following output before the exception:

The model 'RWForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', 'BertGenerationDecoder', 'BigBirdForCausalLM', 'BigBirdPegasusForCausalLM', 'BioGptForCausalLM', 'BlenderbotForCausalLM', 'BlenderbotSmallForCausalLM', 'BloomForCausalLM', 'CamembertForCausalLM', 'CodeGenForCausalLM', 'CTRLLMHeadModel', 'Data2VecTextForCausalLM', 'ElectraForCausalLM', 'ErnieForCausalLM', 'GitForCausalLM', 'GPT2LMHeadModel', 'GPT2LMHeadModel', 'GPTNeoForCausalLM', 'GPTNeoXForCausalLM', 'GPTNeoXJapaneseForCausalLM', 'GPTJForCausalLM', 'MarianForCausalLM', 'MBartForCausalLM', 'MegatronBertForCausalLM', 'MvpForCausalLM', 'OpenAIGPTLMHeadModel', 'OPTForCausalLM', 'PegasusForCausalLM', 'PLBartForCausalLM', 'ProphetNetForCausalLM', 'QDQBertLMHeadModel', 'ReformerModelWithLMHead', 'RemBertForCausalLM', 'RobertaForCausalLM', 'RobertaPreLayerNormForCausalLM', 'RoCBertForCausalLM', 'RoFormerForCausalLM', 'Speech2Text2ForCausalLM', 'TransfoXLLMHeadModel', 'TrOCRForCausalLM', 'XGLMForCausalLM', 'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM'].
Setting pad_token_id to eos_token_id:11 for open-end generation.

attn_output = F.scaled_dot_product_attention(
AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'
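The root cause is that `scaled_dot_product_attention` was only added to `torch.nn.functional` in PyTorch 2.0, so it does not exist on 1.12.1. You can confirm this on your own install with a quick check:

```python
import torch
import torch.nn.functional as F

# scaled_dot_product_attention first shipped in PyTorch 2.0;
# on 1.12.x this hasattr check returns False.
print(torch.__version__)
print(hasattr(F, "scaled_dot_product_attention"))
```
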


I encountered a similar problem, and the solution is to upgrade to torch v2.0.1.

Correct; I just upgraded to 2.0.0 and it worked.
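For anyone who cannot upgrade right away, a minimal compatibility shim can be sketched as follows. This is an assumption-laden sketch, not the model's actual code: the helper names `sdpa` and `manual_sdpa` are hypothetical, and the fallback handles only the plain case (no attention mask, no dropout, no causal masking).

```python
import math
import torch
import torch.nn.functional as F

def manual_sdpa(q, k, v):
    # Plain scaled dot-product attention: softmax(QK^T / sqrt(d)) V.
    # Hypothetical fallback -- no mask, no dropout.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    return torch.softmax(scores, dim=-1) @ v

def sdpa(q, k, v):
    # Use the fused kernel when available (PyTorch >= 2.0),
    # otherwise fall back to the manual implementation above.
    if hasattr(F, "scaled_dot_product_attention"):
        return F.scaled_dot_product_attention(q, k, v)
    return manual_sdpa(q, k, v)
```

On PyTorch 2.x the fallback and the built-in kernel agree numerically for the unmasked, dropout-free case, since the built-in defaults to the same `1/sqrt(d)` scaling.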

FalconLLM changed discussion status to closed
