This repository contains the weights of the TimeMoE-200M model from the paper Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts.
For details on how to use this model, please visit our GitHub page.
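As a quick orientation before visiting the GitHub page, the sketch below shows a typical zero-shot forecasting call pattern for this family of models: load the checkpoint through the Hugging Face `transformers` auto classes with `trust_remote_code`, normalize each input series, autoregressively generate the forecast horizon, and de-normalize. The checkpoint id `Maple728/TimeMoE-200M`, the context length, and the horizon are assumptions for illustration; consult the GitHub page for the authoritative usage.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed Hub checkpoint id; verify the exact name on the GitHub page.
model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-200M",
    trust_remote_code=True,  # the model code ships with the repository
)

# Toy input: batch of 2 series with a context length of 12 (illustrative sizes).
seqs = torch.randn(2, 12)

# Normalize each series to zero mean / unit variance before forecasting.
mean = seqs.mean(dim=-1, keepdim=True)
std = seqs.std(dim=-1, keepdim=True)
normed = (seqs - mean) / std

# Autoregressively generate the forecast horizon, then de-normalize.
pred_len = 6
output = model.generate(normed, max_new_tokens=pred_len)
forecast = output[:, -pred_len:] * std + mean  # shape: [batch, pred_len]
```

The per-series normalization step matters: foundation models for time series are typically trained on normalized inputs, so skipping it degrades zero-shot accuracy.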