Quantization script not working

#2
by EyRaG - opened

I copied your quantization creation script and ran it, and I keep hitting the issue below.
Can you specify which versions of PyTorch / llm-compressor you used to run the script?

  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\llmcompressor\entrypoints\utils.py", line 233, in initialize_processor_from_path
    processor = AutoProcessor.from_pretrained(
        processor_src,
    ...<4 lines>...
        trust_remote_code=model_args.trust_remote_code_model,
    )
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\models\auto\processing_auto.py", line 385, in from_pretrained
    return processor_class.from_pretrained(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        pretrained_model_name_or_path, trust_remote_code=trust_remote_code, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\processing_utils.py", line 1312, in from_pretrained     
    args = cls._get_arguments_from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\processing_utils.py", line 1371, in _get_arguments_from_pretrained
    args.append(attribute_class.from_pretrained(pretrained_model_name_or_path, **kwargs))
                ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\tokenization_mistral_common.py", line 1762, in from_pretrained
    raise ValueError(
        f"Kwargs {list(kwargs.keys())} are not supported by `MistralCommonTokenizer.from_pretrained`."
    )
ValueError: Kwargs ['trust_remote_code', 'use_fast', '_from_auto'] are not supported by `MistralCommonTokenizer.from_pretrained`.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\civiliste\Documents\Jarod\quantize.py", line 20, in <module>
    oneshot(
    ~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
        recipe=recipe,
        ^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\llmcompressor\entrypoints\oneshot.py", line 318, in oneshot
    one_shot = Oneshot(**local_args, **kwargs)
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\llmcompressor\entrypoints\oneshot.py", line 128, in __init__
    pre_process(model_args)
    ~~~~~~~~~~~^^^^^^^^^^^^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\llmcompressor\entrypoints\utils.py", line 59, in pre_process
    model_args.processor = initialize_processor_from_path(
                           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model_args, model_args.model
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\llmcompressor\entrypoints\utils.py", line 252, in initialize_processor_from_path
    processor = AutoProcessor.from_pretrained(
        processor_src,
    ...<4 lines>...
        trust_remote_code=model_args.trust_remote_code_model,
    )
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\models\auto\processing_auto.py", line 385, in from_pretrained
    return processor_class.from_pretrained(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        pretrained_model_name_or_path, trust_remote_code=trust_remote_code, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\processing_utils.py", line 1312, in from_pretrained     
    args = cls._get_arguments_from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\processing_utils.py", line 1371, in _get_arguments_from_pretrained
    args.append(attribute_class.from_pretrained(pretrained_model_name_or_path, **kwargs))
                ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\civiliste\Documents\Jarod\.venv\Lib\site-packages\transformers\tokenization_mistral_common.py", line 1762, in from_pretrained
    raise ValueError(
        f"Kwargs {list(kwargs.keys())} are not supported by `MistralCommonTokenizer.from_pretrained`."
    )
ValueError: Kwargs ['trust_remote_code', 'use_fast', '_from_auto'] are not supported by `MistralCommonTokenizer.from_pretrained`.
Red Hat AI org

Thank you for reporting this bug. The script was not passing the processor to the oneshot command. Please check the updated version.

Also, please make sure to use the latest versions of transformers and compressed-tensors.
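For reference, here is a minimal sketch of what the corrected call might look like. The model id, the recipe, and the helper function are placeholders (not the exact updated script); the key change is loading the processor explicitly and passing it to oneshot, so llm-compressor does not try to re-initialize it via AutoProcessor with kwargs that MistralCommonTokenizer.from_pretrained rejects:

```python
# Sketch of the fix: collect the oneshot arguments, including the
# previously missing processor. build_oneshot_kwargs is a hypothetical
# helper used here only to make the shape of the call explicit.

def build_oneshot_kwargs(model, processor, recipe):
    """Arguments for llmcompressor.oneshot, with the processor included."""
    return {"model": model, "processor": processor, "recipe": recipe}

# Usage (requires transformers and llm-compressor installed):
#
# from transformers import AutoModelForCausalLM, AutoProcessor
# from llmcompressor import oneshot
#
# model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
# processor = AutoProcessor.from_pretrained(MODEL_ID)
# oneshot(**build_oneshot_kwargs(model, processor, recipe))
```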

alexmarques changed discussion status to closed
