Full-text search
1,000+ results
time-series-foundation-models / Lag-Llama
README.md
model
40 matches
tags:
safetensors, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2310.08278, license:apache-2.0, region:us
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
![lag-llama-architecture](images/lagllama.webp)
Lag-Llama is the <b>first open-source foundation model for time series forecasting</b>!
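The card's quick-start is not shown in this excerpt. As a hedged sketch only (not the repository's own snippet): Lag-Llama exposes zero-shot forecasting through GluonTS via a `LagLlamaEstimator`, and the argument names below follow the project's demo notebook, so they may differ between versions; the checkpoint filename and the horizon/context values are assumptions.

```python
import torch
from huggingface_hub import hf_hub_download
from lag_llama.gluon.estimator import LagLlamaEstimator  # provided by the lag-llama codebase

# Fetch the pretrained checkpoint from the Hub (filename assumed from the repo listing).
ckpt_path = hf_hub_download(
    repo_id="time-series-foundation-models/Lag-Llama", filename="lag-llama.ckpt"
)

# The model hyperparameters are read back from the checkpoint itself.
ckpt = torch.load(ckpt_path, map_location="cpu")
model_kwargs = ckpt["hyper_parameters"]["model_kwargs"]

estimator = LagLlamaEstimator(
    ckpt_path=ckpt_path,
    prediction_length=24,   # forecast horizon (illustrative)
    context_length=32,      # history fed to the model (illustrative)
    input_size=model_kwargs["input_size"],
    n_layer=model_kwargs["n_layer"],
    n_embd_per_head=model_kwargs["n_embd_per_head"],
    n_head=model_kwargs["n_head"],
    scaling=model_kwargs["scaling"],
    time_feat=model_kwargs["time_feat"],
)

# A GluonTS predictor that can forecast any GluonTS dataset zero-shot.
predictor = estimator.create_predictor(
    estimator.create_transformation(), estimator.create_lightning_module()
)
```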
FacultyAI / moirai-small
README.md
model
19 matches
tags:
safetensors, model_hub_mixin, pytorch_model_hub_mixin, time series foundation models, pretrained models, time series, time-series-forecasting, arxiv:2402.02592, region:us
Moirai is a pre-trained Time Series Model based on the Masked Encoder architecture. It is a universal time series forecasting model capable of addressing diverse forecasting tasks across multiple domains, frequencies, and variables in a zero-shot manner.
This is a version of [Moirai small](https://huggingface.co/Salesforce/moirai-1.1-R-small) trained by Faculty AI. It was pre-trained on the [LOTSA data](https://huggingface.co/datasets/Salesforce/lotsa_data) using the [codebase](https://github.com/SalesforceAIResearch/uni2ts/tree/main/cli/conf/pretrain) provided by Woo et al. (2024). Both the dataset and codebase are licensed under the [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0). For more details on the model architecture, training, and results, please refer to the [paper](https://arxiv.org/abs/2402.02592).
### Usage
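The usage section is truncated in this excerpt. A minimal sketch, not the card's own snippet, assuming the checkpoint loads through the `uni2ts` library's `MoiraiModule`/`MoiraiForecast` classes (the model is tagged with `pytorch_model_hub_mixin`, so `from_pretrained` should accept the repo id); the horizon, context length, and patch size below are illustrative:

```python
from uni2ts.model.moirai import MoiraiForecast, MoiraiModule  # pip install uni2ts

# Wrap the pretrained weights in a zero-shot probabilistic forecaster.
model = MoiraiForecast(
    module=MoiraiModule.from_pretrained("FacultyAI/moirai-small"),
    prediction_length=24,        # forecast horizon (illustrative)
    context_length=200,          # history length (illustrative)
    patch_size="auto",
    num_samples=100,             # sample paths per series
    target_dim=1,                # univariate target
    feat_dynamic_real_dim=0,
    past_feat_dynamic_real_dim=0,
)

# Build a GluonTS predictor; calling predictor.predict(...) on a GluonTS test
# split then yields sample-based forecasts for each series.
predictor = model.create_predictor(batch_size=32)
```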
xiyuanz / UniMTS
README.md
model
45 matches
tags:
time series classification, motion time series, human activity recognition, pretrained models, foundation models, time series foundation models, time-series, arxiv:2410.19818, license:apache-2.0, region:us
UniMTS: Unified Pre-training for Motion Time Series
🚀 This is the official implementation of the NeurIPS 2024 paper "UniMTS: Unified Pre-training for Motion Time Series".
ibm / ttm-research-r2
README.md
model
67 matches
tags:
safetensors, tinytimemixer, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2401.03955, license:cc-by-nc-sa-4.0, region:us
Tiny Time Mixer (TTM) Research-Use Model Card
<p align="center" width="100%">
<img src="ttm_image.webp" width="600">
</p>
amazon / chronos-t5-large
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
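The quick-start itself is not part of this excerpt; below is a minimal sketch of the workflow described above, assuming the `chronos-forecasting` package is installed and using a synthetic context series as a stand-in for real data:

```python
import numpy as np
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a pretrained checkpoint (any chronos-t5-* size follows the same pattern).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-large",
    device_map="cpu",            # "cuda" if a GPU is available
    torch_dtype=torch.bfloat16,
)

# Historical context: a 1-D tensor, or a list of tensors for a batch of series.
context = torch.tensor(np.sin(np.linspace(0.0, 20.0, 200)), dtype=torch.float32)

# Sample future trajectories; result shape is [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=24)

# Reduce the sampled trajectories to quantile forecasts.
low, median, high = np.quantile(forecast[0].numpy(), [0.1, 0.5, 0.9], axis=0)
print(median)
```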
amazon / chronos-t5-tiny
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
amazon / chronos-t5-small
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
autogluon / chronos-t5-large
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
amazon / chronos-t5-base
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
amazon / chronos-t5-mini
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
shchuro / chronos-t5-tiny-deploy
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, other, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
autogluon / chronos-t5-mini
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
autogluon / chronos-t5-tiny
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
autogluon / chronos-t5-small
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
autogluon / chronos-t5-base
README.md
model
37 matches
tags:
transformers, safetensors, t5, text2text-generation, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, autotrain_compatible, text-generation-inference, endpoints_compatible, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
anchovy / autogluon-chronos-t5-tiny
README.md
model
37 matches
tags:
safetensors, t5, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
anchovy / autogluon-chronos-t5-base
README.md
model
37 matches
tags:
safetensors, t5, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
anchovy / autogluon-chronos-t5-mini
README.md
model
37 matches
tags:
safetensors, t5, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
anchovy / autogluon-chronos-t5-small
README.md
model
37 matches
tags:
safetensors, t5, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
anchovy / autogluon-chronos-t5-large
README.md
model
37 matches
tags:
safetensors, t5, time series, forecasting, pretrained models, foundation models, time series foundation models, time-series, time-series-forecasting, arxiv:2403.07815, arxiv:1910.10683, license:apache-2.0, region:us
Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).