Instructions for using ml233/no_position_embedding_prompt_tuning_linear_forecasting_long_ETTm1 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use ml233/no_position_embedding_prompt_tuning_linear_forecasting_long_ETTm1 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "ml233/no_position_embedding_prompt_tuning_linear_forecasting_long_ETTm1",
    dtype="auto",
)
```

A hedged inference sketch follows at the end of this section.

- Notebooks
  - Google Colab
  - Kaggle
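
Once the checkpoint is loaded, the actual call signature depends on this repository's custom modeling code, which is not documented here. The sketch below is a hypothetical example only: the `trust_remote_code=True` flag, the context length of 512, the 7 ETTm1 variates, and the `(batch, context_length, num_variates)` input layout are all assumptions, not confirmed details of this model.

```python
# Minimal inference sketch under stated assumptions; not the repository's documented API.
import torch
from transformers import AutoConfig, AutoModel

repo = "ml233/no_position_embedding_prompt_tuning_linear_forecasting_long_ETTm1"

# Inspect the config first to check the real context/prediction lengths and input size.
config = AutoConfig.from_pretrained(repo)
print(config)

# trust_remote_code=True is only needed if the repo ships custom modeling code (assumption).
model = AutoModel.from_pretrained(repo, dtype="auto", trust_remote_code=True)
model.eval()

# Assumption: the model consumes a (batch, context_length, num_variates) tensor of past
# values and returns forecasts over its prediction horizon. ETTm1 has 7 variates; the
# context length of 512 is a placeholder, not a value confirmed by this model card.
past_values = torch.randn(1, 512, 7)
with torch.no_grad():
    forecast = model(past_values)
print(forecast)
```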