Google Research recently released TimesFM-2.5, a time series foundation model with 200 million parameters and a decoder-only architecture. Compared with the previous version, TimesFM-2.5 cuts the parameter count from 500M to 200M while extending the maximum context length to 16,384 data points. The new model also supports native probabilistic forecasting, is now available on the Hugging Face platform, and ranks at the top of the GIFT-Eval accuracy leaderboard.
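For reference, here is a minimal zero-shot usage sketch based on the pattern published in the open-source `timesfm` repository's README; the class and parameter names (`TimesFM_2p5_200M_torch`, `ForecastConfig`, and its fields) should be verified against the installed release:

```python
import numpy as np
import timesfm  # pip install timesfm (assumed package name)

# Load the 200M PyTorch checkpoint from Hugging Face. The class name below
# follows the timesfm repository README for the 2.5 release; verify it
# against the installed version.
model = timesfm.TimesFM_2p5_200M_torch.from_pretrained(
    "google/timesfm-2.5-200m-pytorch"
)

# Compile a forecasting configuration before calling forecast().
model.compile(
    timesfm.ForecastConfig(
        max_context=1024,
        max_horizon=256,
        normalize_inputs=True,
        use_continuous_quantile_head=True,  # enable probabilistic output
        fix_quantile_crossing=True,
    )
)

# Zero-shot forecast: each input is a plain 1-D array, no fine-tuning needed.
point_forecast, quantile_forecast = model.forecast(
    horizon=64,
    inputs=[np.sin(np.linspace(0, 20, 512))],  # one toy series of length 512
)
print(point_forecast.shape)  # (1, 64): one series, 64 forecast steps
```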
Time series forecasting analyzes data points ordered in time to identify patterns and predict future values. It plays a critical role across industries, including retail demand forecasting, weather and precipitation trend monitoring, and the optimization of large-scale systems such as supply chains and energy networks. By capturing temporal dependencies and seasonal variation, time series forecasting supports data-driven decision-making in dynamic environments.
Compared with the previous version, TimesFM-2.0, the improvements in TimesFM-2.5 center on three areas: first, the parameter count is substantially reduced; second, the larger maximum context length lets the model capture multi-seasonal structure and low-frequency components directly, reducing preprocessing complexity; finally, the new model adds optional quantile forecasting, supporting horizons of up to 1,000 forecast points.
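When the quantile head is enabled as in the earlier sketch, the second return value carries the distributional forecast. A hedged illustration of reading it, assuming the output is laid out as the point forecast followed by the 0.1 through 0.9 quantiles (verify the actual layout against the release documentation):

```python
# Assumed layout: quantile_forecast[i, t, 0] is the point forecast, and
# indices 1..9 are the 0.1 ... 0.9 quantiles for series i at step t.
lower = quantile_forecast[0, :, 1]   # assumed 10th percentile
median = quantile_forecast[0, :, 5]  # assumed 50th percentile
upper = quantile_forecast[0, :, 9]   # assumed 90th percentile

# An 80% prediction interval at selected forecast steps.
for t in (0, 31, 63):
    print(f"step {t}: {median[t]:.2f} in [{lower[t]:.2f}, {upper[t]:.2f}]")
```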
Why does the longer context matter? Support for 16,384 historical data points lets the model capture long-term trends in a single forward pass. This is particularly valuable in domains where long histories carry signal, such as energy load and retail demand forecasting.
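In practice, using the longer window is just a configuration change: compile with a larger context limit and pass the full history. A sketch under the same API assumptions as above:

```python
import numpy as np

# Re-compile with the full 16,384-point context window (parameter names
# assumed, as in the earlier sketch).
model.compile(
    timesfm.ForecastConfig(
        max_context=16384,
        max_horizon=256,
        normalize_inputs=True,
    )
)

# About 22 months of hourly readings fit in a single forward pass, so
# daily, weekly, and even annual structure is visible to the model
# without manual aggregation.
hours = np.arange(16384)
history = (
    10.0
    + 2.0 * np.sin(2 * np.pi * hours / 24)        # daily cycle
    + 1.0 * np.sin(2 * np.pi * hours / (24 * 7))  # weekly cycle
)
point_forecast, _ = model.forecast(horizon=168, inputs=[history])  # next week
```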
The core idea behind TimesFM is to forecast with a single decoder-only foundation model, an approach first presented in a paper at ICML 2024. GIFT-Eval, an initiative led by Salesforce, aims to standardize evaluation across domains and hosts a public leaderboard on Hugging Face for comparing model performance.
TimesFM-2.5 marks the transition of time series forecasting foundation models from concept to practical tool, improving both accuracy and efficiency while keeping the parameter count small. The model is available on Hugging Face now, with further integration into BigQuery and Model Garden to follow, promoting the widespread adoption of zero-shot time series forecasting in real applications.
Hugging Face: https://huggingface.co/google/timesfm-2.5-200m-pytorch
Key Points:
🌟 **Smaller and faster**: TimesFM-2.5 cuts the parameter count to 200M while improving accuracy.
📈 **Longer context**: Supports input lengths of 16,384 points, enabling forecasts grounded in deeper history.
🏆 **Benchmark leader**: TimesFM-2.5 ranks first on GIFT-Eval in both point and probabilistic forecasting.