Timer-S1: A Billion-Scale Time Series Foundation Model with Serial Scaling



arXiv:2603.04791v1 Announce Type: new

Abstract: We introduce Timer-S1, a strong Mixture-of-Experts (MoE) time series foundation model with 8.3B total parameters, 0.75B activated parameters per token, and a context length of 11.5K. To overcome the scalability bottleneck of existing pre-trained time series foundation models, we perform Serial Scaling along three dimensions: model architecture, dataset, and training pipeline. Timer-S1 integrates sparse TimeMoE blocks and generic TimeSTP blocks f
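The abstract's headline numbers (8.3B total parameters, 0.75B activated per token) follow from the sparse-routing idea behind Mixture-of-Experts layers: each token is sent to only a few experts, so compute per token scales with the experts chosen, not the full parameter count. A minimal sketch of that routing, with purely hypothetical sizes and names (this is not Timer-S1's actual implementation):

```python
import numpy as np

# Illustrative sparse MoE forward pass: total parameters grow with
# NUM_EXPERTS, but each token only touches TOP_K expert matrices.
# All dimensions here are made up for the sketch.

rng = np.random.default_rng(0)

D = 16           # token embedding dimension (hypothetical)
NUM_EXPERTS = 8  # total experts; parameter count scales with this
TOP_K = 2        # experts activated per token; compute scales with this

# Each expert is a simple feed-forward weight matrix.
experts = [rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(NUM_EXPERTS)]
router_w = rng.standard_normal((D, NUM_EXPERTS)) / np.sqrt(D)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(token):
    """Route one token through its top-k experts only."""
    logits = token @ router_w           # router score for each expert
    top = np.argsort(logits)[-TOP_K:]   # indices of the k highest-scoring experts
    weights = softmax(logits[top])      # renormalize over the chosen experts
    # Only TOP_K of the NUM_EXPERTS expert matrices run for this token.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D)
out = moe_forward(token)
print(out.shape)  # (16,)
```

With 8 experts and top-2 routing, roughly a quarter of the expert parameters are exercised per token, which is the same ratio game Timer-S1 plays at billion scale.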
