Error at the moment of training

#3
by AuraM - opened

I started training this model a month ago and it was running perfectly, but in the last two weeks I have been getting an error in the LoRA fine-tuning example. I am a complete novice, I have no idea how to fix it, and there is not much information online. If anyone has run into the same error and could tell me how to correct it, I would appreciate it. This is the error:

RuntimeError: The output 0 of DequantizeAndLinearBackward is a view and is being modified inplace. This view was created inside a Custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the Custom Function, resulting in incorrect gradients. This behavior is prohibited. You can fix this by cloning the output of the Custom Function.

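For what it's worth, this error is not specific to this model: PyTorch forbids in-place modification of a view returned from a custom autograd Function, because the in-place edit would bypass the Function's custom backward. A minimal sketch (stock PyTorch only; `PassThrough` is a made-up stand-in for `DequantizeAndLinear`, not the model's real code) reproduces the error and shows why cloning the output helps:

```python
import torch

# Illustrative custom Function that returns its input as-is. Autograd then
# treats the output as a view of the input, which is the situation the
# error message describes.
class PassThrough(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return x  # input returned as-is -> output is a view

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

x = torch.ones(3, requires_grad=True)

caught = None
try:
    out = PassThrough.apply(x)
    out += 1  # in-place edit of the view raises a RuntimeError like the one above
except RuntimeError as e:
    caught = e

# Cloning the custom Function's output breaks the view relationship,
# which is exactly the fix the error message suggests:
out = PassThrough.apply(x).clone()
out += 1          # now safe to modify in place
out.sum().backward()
```

The same idea applies to `FrozenBNBLinear.forward`: clone the result of `DequantizeAndLinear.apply` before the in-place `+=` with the adapter output.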

I'm getting this error as well. Not entirely sure how to fix it. Is there anyone in the community who can chime in on this?

Getting the same here!

I'm afraid the real cause is hidden in the collapsed "8 frames" section of the traceback.

On the bright side, I believe this code was superseded by YounesBelkada and TimDettmers' 8-bit integration, which works with all models.
Please check the code: https://github.com/huggingface/transformers/pull/17901
and the usage example: https://colab.research.google.com/drive/1qOjXfQIAULfKvZqwCen8-MoWKGdSatZ4#scrollTo=W8tQtyjp75O

The solution to this problem is: put the output produced by DequantizeAndLinear into a new variable (by cloning it) and return that variable instead of the original output. That fixes the problem.

@MukeshSharma Sorry I'm still a bit lost, would you mind showing an example of how you did it?

This seems to work for me:

class FrozenBNBLinear(nn.Module):
...
    def forward(self, input):
        output = DequantizeAndLinear.apply(input, self.weight, self.absmax, self.code, self.bias)
        output = output.clone()  # add this line
        if self.adapter:
            output += self.adapter(input)
        return output
...

I may be speaking too soon, but training has started at least.

update: worked fine!
