Error with transformers >=4.50 and use_fast=True

#18
by tfederico - opened

This is how I load the processor and the model (imports and device shown for completeness):

import torch
from transformers import CLIPModel, CLIPProcessor

device = "cuda" if torch.cuda.is_available() else "cpu"
model = CLIPModel.from_pretrained("openai/clip-vit-large-patch14-336").to(device)
processor = CLIPProcessor.from_pretrained("openai/clip-vit-large-patch14-336", use_fast=True)

However, if I set use_fast=True and try to use the processor, I get the following error:

  File "/home/name/miniconda3/envs/test/lib/python3.10/site-packages/transformers/models/clip/processing_clip.py", line 97, in __call__
    tokenizer_kwargs = {k: v for k, v in kwargs.items() if k not in self.image_processor._valid_processor_keys}
  File "/home/name/miniconda3/envs/test/lib/python3.10/site-packages/transformers/models/clip/processing_clip.py", line 97, in <dictcomp>
    tokenizer_kwargs = {k: v for k, v in kwargs.items() if k not in self.image_processor._valid_processor_keys}
AttributeError: 'CLIPImageProcessorFast' object has no attribute '_valid_processor_keys'
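The traceback points at the kwarg-splitting step in processing_clip.py: it reads self.image_processor._valid_processor_keys, which the slow image processor defines but the fast one apparently does not. A minimal stand-alone mock of that logic (class names here are stand-ins, not the real transformers classes) reproduces the failure mode:

```python
# Mock of the kwarg filtering done in CLIPProcessor.__call__.
# SlowImageProcessor mimics the slow class, which carries
# `_valid_processor_keys`; FastImageProcessor mimics the fast
# class, which lacks the attribute and therefore raises.

class SlowImageProcessor:
    _valid_processor_keys = ["images", "do_resize", "size", "return_tensors"]

class FastImageProcessor:
    # no _valid_processor_keys attribute, as in the traceback above
    pass

def split_kwargs(image_processor, **kwargs):
    # Same pattern as processing_clip.py line 97
    valid = image_processor._valid_processor_keys
    tokenizer_kwargs = {k: v for k, v in kwargs.items() if k not in valid}
    image_processor_kwargs = {k: v for k, v in kwargs.items() if k in valid}
    return tokenizer_kwargs, image_processor_kwargs

split_kwargs(SlowImageProcessor(), padding="max_length")  # works

try:
    split_kwargs(FastImageProcessor(), padding="max_length")
except AttributeError as e:
    print(e)  # 'FastImageProcessor' object has no attribute '_valid_processor_keys'
```

Note that the error only fires when extra kwargs (like padding="max_length") reach the processor, since the filtering branch is guarded by `if kwargs:`.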

Same here!

I am also getting the same error from the line self.processor = CLIPProcessor.from_pretrained(model_name, use_fast=True). Has this error been resolved?

     95 tokenizer_kwargs, image_processor_kwargs = {}, {}
     96 if kwargs:
---> 97     tokenizer_kwargs = {k: v for k, v in kwargs.items() if k not in self.image_processor._valid_processor_keys}
     98     image_processor_kwargs = {
     99         k: v for k, v in kwargs.items() if k in self.image_processor._valid_processor_keys
    100     }
    102 if text is None and images is None:
    102 if text is None and images is None:

AttributeError: 'CLIPImageProcessorFast' object has no attribute '_valid_processor_keys'

Same thing happens for me on

inputs = self.processor(text=texts, return_tensors="pt", padding="max_length")

where texts is a list[str].

And initialization is done without errors:

self.model_checkpoint = "openai/clip-vit-base-patch32"
self.processor = CLIPProcessor.from_pretrained(self.model_checkpoint, use_fast=True)

Without use_fast=True it works fine.
It also works with use_fast=True when all texts are the same length and padding="max_length" is not passed.
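Those observations fit the filtering logic from the traceback: the branch only runs when extra kwargs are passed. A defensive sketch of that same logic, falling back to an empty key list when `_valid_processor_keys` is missing (class name is a stand-in; the real fix belongs in transformers itself, so this is illustrative only):

```python
# Defensive variant of the kwarg splitting from processing_clip.py:
# getattr with a default avoids the AttributeError when the fast
# image processor does not define `_valid_processor_keys`.

def split_kwargs_safe(image_processor, **kwargs):
    valid = getattr(image_processor, "_valid_processor_keys", [])
    tokenizer_kwargs = {k: v for k, v in kwargs.items() if k not in valid}
    image_processor_kwargs = {k: v for k, v in kwargs.items() if k in valid}
    return tokenizer_kwargs, image_processor_kwargs

class FastImageProcessor:  # stand-in: no _valid_processor_keys
    pass

tok_kwargs, img_kwargs = split_kwargs_safe(FastImageProcessor(), padding="max_length")
print(tok_kwargs)  # {'padding': 'max_length'}
print(img_kwargs)  # {}
```

With an empty fallback, every extra kwarg is routed to the tokenizer, which is the behavior padding="max_length" needs here; in the meantime, passing use_fast=False (as noted above) sidesteps the problem entirely.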
