Abstract
As the energy transition and technological advancement accelerate, multi-load forecasting has become a key component in optimizing the planning, operation, and management of smart grids. Nevertheless, conventional load prediction techniques often suffer from limited accuracy and high volatility. To overcome these limitations, this study presents a load prediction approach based on Transformer multi-model fusion, employing a Stacking ensemble learning framework to integrate multiple models, including Transformer, XGBoost, and GBDT (gradient-boosted decision trees). The method also accounts for key influencing factors such as meteorological conditions and holidays. Specifically, base models are trained and make predictions from the original features, after which the Transformer's self-attention mechanism captures feature relationships and long-term dependencies, ultimately yielding more precise load predictions. Empirical results show that the proposed approach achieves superior prediction accuracy and stability across multiple datasets, with significant improvements over single-model approaches and conventional techniques.
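The Stacking idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: the Transformer and XGBoost base learners are replaced here with stand-in scikit-learn regressors, and the weather/holiday features and load values are synthetic.

```python
# Illustrative Stacking ensemble for load forecasting (hypothetical stand-in,
# NOT the paper's model): base learners' predictions are combined by a
# meta-learner, mirroring the fusion framework described in the abstract.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Hypothetical features: [temperature, humidity, hour_of_day, is_holiday]
X = rng.random((500, 4))
# Synthetic load: depends on temperature and the holiday flag, plus noise
y = 100 + 30 * X[:, 0] - 10 * X[:, 3] + rng.normal(0, 2, 500)

base_learners = [
    ("gbdt", GradientBoostingRegressor(n_estimators=100, random_state=0)),
    # MLP used only as a placeholder for the Transformer base model
    ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)),
]
# Meta-learner fuses the base predictions (cross-validated internally)
stack = StackingRegressor(estimators=base_learners, final_estimator=Ridge())
stack.fit(X[:400], y[:400])
preds = stack.predict(X[400:])
mae = float(np.mean(np.abs(preds - y[400:])))
print(f"hold-out MAE: {mae:.2f}")
```

In a Stacking framework like this, the meta-learner weighs each base model's strengths, which is one way such ensembles can reduce the volatility of any single predictor.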