The race to build the top foundation forecasting model is on!
Salesforce's MOIRAI was one of the first foundation forecasting models; it achieved strong benchmark results and was open-sourced along with its pretraining dataset, LOTSA.
We took an in-depth look at how MOIRAI works here, and built a comprehensive project comparing MOIRAI with popular statistical models.
Salesforce has now released an improved version, MOIRAI-MOE, with significant upgrades, most notably the addition of Mixture of Experts (MOE). We briefly discussed MOE when another model, Time-MOE, also used multiple experts.
In this article, we will cover:
- How MOIRAI-MOE works and why it is a powerful model.
- Key differences between MOIRAI and MOIRAI-MOE.
- How MOIRAI-MOE's use of a combination of experts improves accuracy.
- How Mixture of Experts, in general, addresses the frequency-variation problem in time series foundation models.
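Before diving in, here is a minimal sketch of the core idea behind Mixture of Experts: a learned gate scores every expert for each input token, and only the top-k experts process that token, with their outputs mixed by the (renormalized) gate weights. All names, sizes, and the random "expert" matrices below are hypothetical illustrations, not MOIRAI-MOE's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

d_model, n_experts, top_k = 8, 4, 2

# Hypothetical parameters: one gating matrix, and a tiny linear map per expert.
W_gate = rng.normal(size=(d_model, n_experts))
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(tokens):
    """Route each token to its top-k experts and mix their outputs."""
    probs = softmax(tokens @ W_gate)               # (n_tokens, n_experts)
    top = np.argsort(-probs, axis=-1)[:, :top_k]   # top-k expert indices per token
    out = np.zeros_like(tokens)
    for t, token in enumerate(tokens):
        chosen = top[t]
        weights = probs[t, chosen] / probs[t, chosen].sum()  # renormalize over top-k
        for w, e in zip(weights, chosen):
            out[t] += w * (token @ experts[e])
    return out, top

tokens = rng.normal(size=(5, d_model))  # e.g. 5 time-series patch tokens
out, routing = moe_layer(tokens)
print(out.shape)  # (5, 8): same shape as the input tokens
print(routing)    # which experts each token was routed to
```

The key property is that different tokens can take different paths through the network: in a time series setting, tokens from, say, hourly and monthly series can be routed to different experts, which is the intuition behind MOE helping with frequency variation.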
Let's get started.
I have launched AI Horizon Forecast, a newsletter focused on time series and innovations…