Inference API not working

#4
by dlaird-skl - opened

I'm looking to evaluate this model on a dataset I've created. I started by using the HF inference API, but it's currently not working. I am getting this error:

```
Failed to perform inference: Could not load model /repository with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSequenceClassification'>,). See the original errors:

while loading with AutoModelForSequenceClassification, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/pipelines/base.py", line 291, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 4399, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 4716, in _load_pretrained_model
    raise ValueError(
ValueError: The state dictionary of the model you are trying to load is corrupted. Are you sure it was properly saved?
```
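One way to narrow this down (a minimal sketch, not the exact serving code): the traceback shows the inference backend is essentially calling `AutoModelForSequenceClassification.from_pretrained` on the repository, so the same load can be tried locally to see whether the checkpoint itself is the problem. The snippet below round-trips a tiny stand-in model through `save_pretrained`/`from_pretrained` to show what a properly saved checkpoint looks like; pointing `from_pretrained` at the actual model id (not named in this thread) instead of the temp directory would test the real repo.

```python
import tempfile

from transformers import (
    AutoModelForSequenceClassification,
    BertConfig,
    BertForSequenceClassification,
)

# Build a tiny randomly-initialized stand-in model from a config
# (no download needed; the real repo's model id would go below instead).
config = BertConfig(
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
    vocab_size=100,
    num_labels=2,
)
model = BertForSequenceClassification(config)

with tempfile.TemporaryDirectory() as tmp:
    # save_pretrained writes config.json plus the weights file; a checkpoint
    # saved this way should reload cleanly without the "state dictionary ...
    # corrupted" ValueError from the traceback above.
    model.save_pretrained(tmp)
    reloaded = AutoModelForSequenceClassification.from_pretrained(tmp)
    print(type(reloaded).__name__)  # prints BertForSequenceClassification
```

If the same `from_pretrained` call fails locally on the hub repo with the corrupted-state-dict error, the weights file in the repo is the issue rather than the Inference API itself.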

I have the same error. I'm guessing not many people have used this model and it has become unsupported, since the last update was around 7-8 months ago.

I have the same problem.
