✨A decoder-only foundation model for time-series forecasting
📝 Summary:
This paper introduces TimesFM, a decoder-only foundation model for time-series forecasting whose design is inspired by recent advances in large language models. Pretrained on a large time-series corpus, it delivers zero-shot forecasting accuracy close to fully supervised baselines across datasets spanning different domains, time scales, and granularities.
🔹 Publication Date: Published on Oct 14, 2023
🔹 Paper Links:
• arXiv Page: https://arxiv.org/abs/2310.10688
• PDF: https://arxiv.org/pdf/2310.10688
• GitHub: https://github.com/google-research/timesfm
🔹 Models citing this paper:
• https://huggingface.co/google/timesfm-1.0-200m
• https://huggingface.co/google/timesfm-2.0-500m-pytorch
• https://huggingface.co/google/timesfm-2.5-200m-pytorch
✨ Spaces citing this paper:
• https://huggingface.co/spaces/autogluon/fev-leaderboard
• https://huggingface.co/spaces/JayLacoma/Trader_Technical_Indicators
• https://huggingface.co/spaces/pavel321/huggingface-cli-completion
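💡 Quick-start sketch: the snippet below shows how one might load a TimesFM checkpoint from the Hugging Face models listed above and produce a zero-shot forecast with the timesfm package from the GitHub repo. The constructor arguments and method names follow my reading of the README for the 1.0 release and may differ in newer package versions, so treat this as an illustrative assumption rather than the definitive API.

```python
# Minimal zero-shot forecasting sketch with TimesFM.
# Assumes the 1.0-era API of the `timesfm` package
# (github.com/google-research/timesfm); names and values may differ by version.
import numpy as np
import timesfm

# Hyperparameters below are those published for the 200M checkpoint,
# as recalled from the repo README -- treat them as assumptions.
tfm = timesfm.TimesFm(
    context_len=512,       # maximum history length fed to the decoder
    horizon_len=128,       # forecast horizon
    input_patch_len=32,    # input patch size
    output_patch_len=128,  # output patch size
    num_layers=20,
    model_dims=1280,
    backend="cpu",
)
tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")

# Zero-shot forecast on a toy sine wave.
# freq=0 marks high-frequency (e.g. hourly/daily) data in the repo's convention.
history = [np.sin(np.linspace(0, 20, 100))]
point_forecast, quantile_forecast = tfm.forecast(history, freq=[0])
print(point_forecast.shape)  # expected: (1, horizon_len)
```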
==================================
For more data science resources:
✓ https://t.iss.one/DataScienceT
#TimeSeriesForecasting #FoundationModels #MachineLearning #DeepLearning #AI