A Multi-Scale LSTM with Multi-Head Self-Attention Embedding Mechanism for Remaining Useful Life Prediction of Hot Strip Mill Rollers
Abstract:
Remaining useful life (RUL) prediction of intelligent equipment plays a crucial role in avoiding the major safety accidents and substantial economic losses caused by degradation failures. Recently, many studies have focused on deep learning-based data-driven methods, such as the long short-term memory (LSTM) neural network, which use multi-dimensional condition signals or features to predict the RUL. However, most existing methods are unable to acquire valid temporal information from long-term time series. Moreover, redundant information in the input data leads to imprecise RUL prediction results. To overcome these weaknesses, a multi-scale LSTM neural network with a multi-head self-attention embedding mechanism (MLSTM-MHA) is proposed in this article for RUL prediction. Firstly, the memory cell of the LSTM is divided into several parts according to temporal trend type: local, medium, and long-term trends. Fusing all types of memory cells captures trend information at different scales and improves the ability of the LSTM to learn from time series data. Secondly, multi-head self-attention is embedded in the forget-gate and input-gate structures of the LSTM. The attention weights are trained jointly with the other MLSTM-MHA network parameters and are recalculated adaptively, so redundant information receives lower values and is therefore assigned lower weights by the attention module. Finally, a commercial modular aero-propulsion system simulation (C-MAPSS) dataset and a hot strip mill rollers dataset are used to validate the superiority of the proposed method. Compared with existing typical data-driven RUL prediction methods, the proposed method achieves more accurate predictions. The main contributions of this work are: (1) the proposed novel memory cell structure can handle multi-scale temporal information; (2) the impact of redundant information is attenuated by the proposed embedded multi-head self-attention mechanism.
Tags: long short-term memory (LSTM), redundant information, remaining useful life prediction, self-attention mechanism
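To make the two mechanisms named in the contributions concrete, the PyTorch sketch below shows one plausible single-step cell under stated assumptions: the class name MLSTMMHACell, the split of the cell state into three equal parts, the fixed per-scale decay rates, and the choice to apply nn.MultiheadAttention to the stacked forget/input pre-activations are illustrative guesses, not the authors' exact formulation.

```python
import torch
import torch.nn as nn


class MLSTMMHACell(nn.Module):
    """One step of a multi-scale LSTM cell with multi-head self-attention
    applied to the forget- and input-gate pre-activations.

    The cell state is split into three equal parts (local, medium, and
    long-term trends); each part decays at a different base rate before
    the fused state enters the usual LSTM update.
    """

    def __init__(self, input_size: int, hidden_size: int, num_heads: int = 4):
        super().__init__()
        assert hidden_size % 3 == 0, "hidden_size must split into 3 scales"
        self.hidden_size = hidden_size
        # Standard LSTM affine maps producing the four gate pre-activations.
        self.x2h = nn.Linear(input_size, 4 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 4 * hidden_size)
        # Multi-head self-attention over the gate pre-activations; redundant
        # components receive lower attention weights and are attenuated.
        self.attn = nn.MultiheadAttention(hidden_size, num_heads,
                                          batch_first=True)
        # Fixed base decay per scale (fast / medium / slow). This is an
        # assumption; the paper may learn these rates instead.
        self.register_buffer("decay", torch.tensor([0.5, 0.9, 0.99]))

    def forward(self, x, state):
        h, c = state                                    # (B, H) each
        gates = self.x2h(x) + self.h2h(h)               # (B, 4H)
        i, f, g, o = gates.chunk(4, dim=-1)
        # Re-weight the forget/input pre-activations with self-attention.
        fi = torch.stack([f, i], dim=1)                 # (B, 2, H)
        fi, _ = self.attn(fi, fi, fi)
        f, i = fi.unbind(dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        # Split the cell state into local / medium / long-trend parts and
        # apply a scale-specific decay before the LSTM update fuses them.
        parts = c.chunk(3, dim=-1)
        c_scaled = torch.cat([d * p for d, p in zip(self.decay, parts)],
                             dim=-1)
        c_new = f * c_scaled + i * g                    # fuse all scales
        h_new = o * torch.tanh(c_new)
        return h_new, (h_new, c_new)


if __name__ == "__main__":
    # Hypothetical sizes: 14 condition signals, hidden size divisible by
    # both 3 (scales) and 4 (attention heads).
    cell = MLSTMMHACell(input_size=14, hidden_size=96)
    x_t = torch.randn(8, 14)                  # one time step, batch of 8
    h0 = c0 = torch.zeros(8, 96)
    h1, (h, c) = cell(x_t, (h0, c0))
    print(h1.shape)                           # torch.Size([8, 96])
```

In a full RUL model, a cell like this would be unrolled over the condition-monitoring sequence and followed by a regression head that maps the final hidden state to the predicted remaining life; those surrounding components are not specified by the abstract and are omitted here.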