[transformers] Compatibility with transformers>=5.0

#1
by guibru - opened

While preparing for the transformers 5.0 switch, I had to tweak the loading parameters to keep the behaviour consistent with transformers<5.0.

Using transformers==5.0.0rc0, I had to specify the following parameters:

from transformers import AutoModelForZeroShotObjectDetection

model_id = "iSEE-Laboratory/llmdet_large"
model = AutoModelForZeroShotObjectDetection.from_pretrained(
    model_id, tie_encoder_decoder=False, tie_word_embeddings=False
)

However, with these parameters I got a warning that some weights were not initialized. It looks like the finetuning deliberately ignored part of the backbone (which passed silently in transformers<5.0). I would appreciate feedback on this issue, and on whether my understanding is correct.

MMGroundingDinoForObjectDetection LOAD REPORT from: iSEE-Laboratory/llmdet_large
Key                                                | Status  | 
---------------------------------------------------+---------+-
bbox_embed.{1, 2, 3, 4, 5}.layers.{0, 1, 2}.weight | MISSING | 
bbox_embed.{1, 2, 3, 4, 5}.layers.{0, 1, 2}.bias   | MISSING | 
model.decoder.class_embed.0.bias                   | MISSING | 
model.decoder.bbox_embed.0.layers.{0, 1, 2}.bias   | MISSING | 
model.decoder.bbox_embed.0.layers.{0, 1, 2}.weight | MISSING | 
class_embed.{1, 2, 3, 4, 5}.bias                   | MISSING | 

Notes:
- MISSING: those params were newly initialized because they are missing from the checkpoint. Consider training on your downstream task.
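For context, the MISSING status above boils down to a set difference between the parameter names the model expects and the keys actually present in the checkpoint file. A minimal sketch of that comparison (plain Python, illustrative key names taken from the report above, not the actual transformers implementation):

```python
def load_report(model_keys, checkpoint_keys):
    """Return (missing, unexpected) key sets, mirroring the
    MISSING status in the transformers load report."""
    model_keys, checkpoint_keys = set(model_keys), set(checkpoint_keys)
    missing = model_keys - checkpoint_keys      # newly initialized at load time
    unexpected = checkpoint_keys - model_keys   # silently ignored at load time
    return missing, unexpected

# Illustrative subset of keys from the report above.
model_keys = {
    "model.decoder.class_embed.0.bias",
    "class_embed.1.bias",
    "bbox_embed.1.layers.0.weight",
}
# Suppose only one of them is actually stored in the checkpoint.
checkpoint_keys = {"class_embed.1.bias"}

missing, unexpected = load_report(model_keys, checkpoint_keys)
print(sorted(missing))
# → ['bbox_embed.1.layers.0.weight', 'model.decoder.class_embed.0.bias']
```

If the checkpoint genuinely lacks these tensors, then in transformers<5.0 they were presumably being filled by the weight-tying that `tie_encoder_decoder=False` / `tie_word_embeddings=False` now disables, which would explain the warning.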

Thanks for your insights.
