How to use ibm-research/patchtst-etth1-pretrain with Transformers:
```python
# Load the model directly. PatchTST is a time-series model, so there is no
# tokenizer; the pretraining head class in transformers is PatchTSTForPretraining.
from transformers import PatchTSTForPretraining

model = PatchTSTForPretraining.from_pretrained("ibm-research/patchtst-etth1-pretrain")
```