Our group's paper on chaotic time series prediction has been published in Applied Soft Computing. The paper applies the Transformer multi-head self-attention mechanism to improve the broad learning system (BLS), constructing a new chaotic time series prediction method. Su Liyun, Xiong Lang, Yang Jialing, "Multi-Attn BLS: Multi-Head Attention Mechanism with Broad Learning System for Chaotic Time Series Prediction," Applied Soft Computing, 2023. https://authors.elsevier.com/sd/article/S1568-4946(22)00880-8.
Because of the complexity and nonlinearity of chaotic time series, the observational 1-D signals available are often insufficient for deep learning networks to fit the intrinsic attractor with high accuracy. Unlike deep models, a broad learning system (BLS) equipped with an attention mechanism exhibits a unique and preeminent pattern prediction ability, and BLS has accordingly seen practical adoption in many fields. However, the application of a multi-head-attention-fused broad learning architecture to chaotic time series prediction remains inadequately explored. Thus, a multi-head attentional BLS (Multi-Attn BLS) is proposed in this study to further improve the prediction accuracy of chaotic time series. Our model develops a novel framework that combines the high computational efficiency of broad learning with the multi-head attention mechanism. First, the received data are reconstructed into fixed-size tuples: multidimensional arrays parameterized by an embedding dimension and a time delay serve as the input to the broad learning network. Subsequently, a robust BLS with a spatiotemporal multi-head attention mechanism is developed to depict the internal dynamic evolution. The Multi-Attn BLS model can capture key spatiotemporal feature information, achieves high predictive performance, and generalizes well in practical nonlinear complex systems. Comparative experiments with the traditional long short-term memory (LSTM) network and the primitive BLS show improved computing speed and generalization ability. Furthermore, owing to the multi-head attention mechanism, the network is good at capturing the spatiotemporal features of the sequence.
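The reconstruction step described above is the standard delay-coordinate embedding: each fixed-size tuple collects samples of the 1-D signal spaced by the time delay, up to the embedding dimension. A minimal sketch (the function name `delay_embed` and the example parameters are our own, not from the paper):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a 1-D series into delay-coordinate vectors.

    Each row is (x[t], x[t+tau], ..., x[t+(dim-1)*tau]) -- the
    fixed-size tuples that form the network's multidimensional input.
    """
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau")
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Example: a 10-sample series with embedding dimension 3 and delay 2
x = np.arange(10.0)
X = delay_embed(x, dim=3, tau=2)
print(X.shape)  # (6, 3)
```

In practice, dim and tau for a chaotic series are typically chosen by standard criteria such as false nearest neighbors and the first minimum of mutual information.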
The experimental results show that our model outperforms BLS, ridge regression, and LSTM on the four main evaluation indicators (root mean square error, root mean square percentage error, mean absolute error, and mean absolute percentage error) in predicting classical systems (the Lorenz and Rössler systems). Moreover, the model also predicts well on the real-world chaotic system of sea clutter.
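For reference, the four indicators can be computed as below, using their common definitions; note this is a sketch with our own function name, and the paper may scale the percentage errors by 100:

```python
import numpy as np

def metrics(y_true, y_pred):
    """RMSE, RMSPE, MAE, MAPE between a target series and a prediction.

    Percentage errors are returned as fractions (multiply by 100 for %).
    Assumes y_true contains no zeros, as division by y_true is involved.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    return {
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "RMSPE": np.sqrt(np.mean((err / y_true) ** 2)),
        "MAE": np.mean(np.abs(err)),
        "MAPE": np.mean(np.abs(err / y_true)),
    }
```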