fcakyon committed
Commit 2a2d45e
1 Parent(s): ee1f1f1

Add GenerationMixin as parent class


Related to [florence2-large discussion](https://huggingface.co/microsoft/Florence-2-large/discussions/80).

Florence-2 currently triggers the following deprecation warning in the transformers library:
```
Florence2LanguageForConditionalGeneration has generative capabilities, as `prepare_inputs_for_generation` is explicitly overwritten. However, it doesn't directly inherit from `GenerationMixin`. From 👉v4.50👈 onwards, `PreTrainedModel` will NOT inherit from `GenerationMixin`, and this model will lose the ability to call `generate` and other related functions.
- If you're using `trust_remote_code=True`, you can get rid of this warning by loading the model with an auto class. See https://huggingface.co/docs/transformers/en/model_doc/auto#auto-classes
- If you are the owner of the model architecture code, please modify your model class such that it inherits from `GenerationMixin` (after `PreTrainedModel`, otherwise you'll get an exception).
- If you are not the owner of the model architecture class, please contact the model code owner to update it.
```

This PR follows the advice of the warning and adds `GenerationMixin` as a parent class of `Florence2LanguageForConditionalGeneration`.
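The change is the standard mixin pattern the warning asks for: list `GenerationMixin` after the base class, so the model keeps `PreTrainedModel`'s behavior while regaining the mixin's `generate`-related methods. A minimal sketch with stand-in classes (these stubs are illustrative only, not the real transformers internals):

```python
# Illustrative stubs only -- not the real transformers classes.

class PreTrainedModelStub:
    """Stands in for transformers' PreTrainedModel."""
    def save_pretrained(self):
        return "saved"


class GenerationMixinStub:
    """Stands in for transformers' GenerationMixin."""
    def generate(self):
        return "generated"


# The mixin is listed AFTER the base class, matching the order the
# deprecation warning requires.
class ModelForConditionalGeneration(PreTrainedModelStub, GenerationMixinStub):
    base_model_prefix = "model"


model = ModelForConditionalGeneration()
print(model.save_pretrained())  # base-class behavior preserved
print(model.generate())         # mixin method now available
```

Putting the mixin after the base class keeps the base class first in the method resolution order, which is why the warning insists on that ordering.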

Files changed (1):
  1. modeling_florence2.py (+2 -1)

```diff
--- a/modeling_florence2.py
+++ b/modeling_florence2.py
@@ -29,6 +29,7 @@ from einops import rearrange
 from timm.models.layers import DropPath, trunc_normal_
 
 from transformers.modeling_utils import PreTrainedModel
+from transformers.generation.utils import GenerationMixin
 from transformers.utils import (
     ModelOutput,
     add_start_docstrings,
@@ -2059,7 +2060,7 @@ class Florence2LanguageModel(Florence2LanguagePreTrainedModel):
     )
 
 
-class Florence2LanguageForConditionalGeneration(Florence2LanguagePreTrainedModel):
+class Florence2LanguageForConditionalGeneration(Florence2LanguagePreTrainedModel, GenerationMixin):
     base_model_prefix = "model"
     _tied_weights_keys = ["encoder.embed_tokens.weight", "decoder.embed_tokens.weight", "lm_head.weight"]
     _keys_to_ignore_on_load_missing = ["final_logits_bias"]
```