Model loading with transformers fails

#2
by NicolasBFR - opened

On Google Colab, with the current transformers (4.57.1), while loading:

# Load model directly
from transformers import AutoModelForTokenClassification
model = AutoModelForTokenClassification.from_pretrained("segment-any-text/sat-12l-sm", dtype="auto")

I get the following error:

The checkpoint you are trying to load has model type `xlm-token` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Is this a new error?

Segment any Text org

Hi,

Please use our library wtpsplit instead. It makes this much easier. You just need to install it:

pip install wtpsplit

And then run the following in Python:

from wtpsplit import SaT

sat = SaT("sat-12l-sm")

sat.split("This is a test This is another test.")
markus583 changed discussion status to closed
