---
license: apache-2.0
tags:
- time series
- forecasting
- pretrained models
- foundation models
- time series foundation models
- time-series
---
# Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
![lag-llama-architecture](images/lagllama.webp)
Lag-Llama is the first open-source foundation model for time series forecasting!
[[Tweet Thread](https://twitter.com/arjunashok37/status/1755261111233114165)] [[Model Weights](https://huggingface.co/time-series-foundation-models/Lag-Llama)] [[Colab Demo on Zero-Shot Forecasting](https://colab.research.google.com/drive/13HHKYL_HflHBKxDWycXgIUAHSeHRR5eo?usp=sharing)] [[GitHub](https://github.com/time-series-foundation-models/lag-llama)] [[Paper](https://arxiv.org/abs/2310.08278)]
____
This HuggingFace repository hosts the pretrained checkpoint of Lag-Llama.
____
* **Coming Next**: Fine-tuning scripts with examples on real-world datasets and best practices for using Lag-Llama!🚀
Updates:
* **17-Feb-2024**: We have released a new updated [Colab Demo](https://colab.research.google.com/drive/1XxrLW9VGPlZDw3efTvUi0hQimgJOwQG6?usp=sharing) for zero-shot forecasting that shows how to load time series in different formats.
* **7-Feb-2024**: We released Lag-Llama, with open-source model checkpoints and a Colab Demo for zero-shot forecasting.
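The demo accepts time series supplied in different formats. One common target layout for forecasting pipelines in the GluonTS ecosystem (which Lag-Llama builds on) is the "long" format, with one row per (series, timestamp) observation. The sketch below is illustrative only, assuming pandas; the column names are placeholders, not the demo's actual API:

```python
import pandas as pd

# Hypothetical wide-format data: one timestamp column plus one column
# per series. Many CSV exports arrive in this shape.
wide = pd.DataFrame(
    {
        "timestamp": pd.date_range("2024-01-01", periods=4, freq="D"),
        "sensor_a": [1.0, 2.0, 3.0, 4.0],
        "sensor_b": [10.0, 20.0, 30.0, 40.0],
    }
)

# Melt into long format: one row per (series, timestamp) observation,
# with the series name in "item_id" and the value in "target".
long_df = wide.melt(
    id_vars="timestamp", var_name="item_id", value_name="target"
).sort_values(["item_id", "timestamp"], ignore_index=True)

print(long_df.head())
```

From a long-format table like this, each `item_id` group is one univariate series that can be handed to a forecasting model; see the Colab Demo for the formats it actually supports.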
____
Current Features:
💫 Zero-shot forecasting on a dataset of any frequency for any prediction length, using the Colab Demo.
____
Coming Soon:
⭐ An online Gradio demo where you can upload time series, get zero-shot predictions, and perform fine-tuning.
⭐ Features for fine-tuning the foundation model.
⭐ Features for pretraining Lag-Llama on your own large-scale data.
⭐ Scripts to reproduce all results in the paper.
____
Stay Tuned!🦙