Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts (arXiv:2409.16040, Sep 24, 2024)
Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts (arXiv:2410.10469, Oct 14, 2024)
Unified Training of Universal Time Series Forecasting Transformers (arXiv:2402.02592, Feb 4, 2024)
Open-FinLLMs: Open Multimodal Large Language Models for Financial Applications (arXiv:2408.11878, Aug 20, 2024)
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting (arXiv:2310.08278, Oct 12, 2023)
Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series (arXiv:2401.03955, Jan 8, 2024)
ChronoGAN: Supervised and Embedded Generative Adversarial Networks for Time Series Generation (arXiv:2409.14013, Sep 21, 2024)
Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement (arXiv:2301.03028, Jan 8, 2023)