An ADMM-LSTM framework for short-term load forecasting
Journal Publication (ResearchOnline@JCU)

Abstract
Accurate short-term load forecasting (STLF) is crucial for maintaining reliable and efficient power system operations. With the continuous growth in the volume and variety of energy data from renewables, electric vehicles, and other sources, long short-term memory (LSTM) networks have emerged as an attractive approach for STLF owing to their strength in extracting dynamic temporal information. However, traditional LSTM training relies on stochastic gradient methods, which have several limitations. This paper presents an innovative LSTM optimization framework based on the alternating direction method of multipliers (ADMM) for STLF, dubbed ADMM-LSTM, in which the LSTM network is trained in a distributed fashion by the ADMM algorithm. Specifically, we introduce a novel backward-forward order for updating the parameters in the ADMM-LSTM framework, significantly reducing computational time. Additionally, each subproblem within the proposed framework is solved by either the proximal point algorithm or a local linear approximation, eliminating the need for supplementary numerical solvers. Because ADMM-LSTM is inherently gradient-free, it avoids the exploding- and vanishing-gradient issues associated with gradient-based training. Furthermore, we offer a comprehensive theoretical analysis that elucidates the convergence properties of the ADMM-LSTM framework. Finally, the efficacy of our method is substantiated through experiments on two publicly available datasets, where it demonstrates superior performance compared to existing methods.
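The abstract's key idea is replacing gradient-based training with ADMM's alternating subproblem updates and a dual step. The paper's actual LSTM subproblems are not reproduced in this record, so the following is only a minimal, hypothetical sketch of the generic ADMM pattern on a simple L1-regularized problem (the problem choice, function names, and parameters are assumptions for illustration, not the authors' method):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(a, lam=1.0, rho=1.0, iters=200):
    """ADMM for min_x 0.5*||x - a||^2 + lam*||x||_1 via the splitting x = z.

    Illustrative only: shows the alternating primal updates (x, z) and the
    dual update on u that characterize ADMM-style training schemes.
    """
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)  # scaled dual variable
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)  # x-subproblem: closed form
        z = soft_threshold(x + u, lam / rho)   # z-subproblem: proximal step
        u = u + x - z                          # dual ascent step
    return z

a = np.array([3.0, -0.5, 1.5])
z = admm_l1(a)  # converges to soft_threshold(a, 1) = [2.0, 0.0, 0.5]
```

In the paper, analogous alternating updates are applied to the network parameters in a backward-forward order, with each subproblem solved in closed form via a proximal or local-linear step; the sketch above shows only the generic alternating pattern, not those specific subproblems.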
Journal: Neural Networks
Publication Name: N/A
Volume: 173
ISBN/ISSN: 1879-2782
Edition: N/A
Issue: N/A
Pages Count: 12
Location: N/A
Publisher: Elsevier
Publisher Url: N/A
Publisher Location: N/A
Publish Date: N/A
Url: N/A
Date: N/A
EISSN: N/A
DOI: 10.1016/j.neunet.2024.106150