Robust Standard Gradient Descent Algorithm for ARX Models Using Aitken Acceleration Technique.
IEEE Transactions on Cybernetics ( IF 9.4 ) Pub Date : 2021-03-23 , DOI: 10.1109/tcyb.2021.3063113
Jing Chen , Min Gan , Quanmin Zhu , Pritesh Narayan , Yanjun Liu

A robust standard gradient descent (SGD) algorithm for ARX models using the Aitken acceleration method is developed. Considering that the SGD algorithm has slow convergence rates and is sensitive to the step size, a robust and accelerative SGD (RA-SGD) algorithm is derived. This algorithm is based on the Aitken acceleration method, and its convergence rate is improved from linear convergence to at least quadratic convergence in general. Furthermore, the RA-SGD algorithm is always convergent with no limitation of the step size. Both the convergence analysis and the simulation examples demonstrate that the presented algorithm is effective.
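The combination described above can be sketched in code. The snippet below is a hypothetical illustration, not the authors' exact RA-SGD algorithm: it identifies a first-order ARX model y(t) = a·y(t−1) + b·u(t−1) + e(t) by plain gradient descent on the mean-squared prediction error, and applies componentwise Aitken delta-squared acceleration to three successive iterates. All names, the model order, and the step size are assumptions made for the example.

```python
import numpy as np

# Simulate data from an assumed first-order ARX model
# y(t) = a_true * y(t-1) + b_true * u(t-1) + e(t)
rng = np.random.default_rng(0)
a_true, b_true = 0.5, 1.2
N = 200
u = rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1] + 0.01 * rng.standard_normal()

Phi = np.column_stack([y[:-1], u[:-1]])  # regressor rows [y(t-1), u(t-1)]
Y = y[1:]

def grad(theta):
    # Gradient of the mean-squared prediction error (1/N)*||Y - Phi @ theta||^2
    return -2.0 / len(Y) * Phi.T @ (Y - Phi @ theta)

def aitken(t0, t1, t2):
    # Componentwise Aitken delta-squared extrapolation:
    # t0 - (t1 - t0)^2 / (t2 - 2*t1 + t0), guarded against a zero denominator
    d2 = t2 - 2 * t1 + t0
    out = t2.copy()
    safe = np.abs(d2) > 1e-12
    out[safe] = t0[safe] - (t1[safe] - t0[safe]) ** 2 / d2[safe]
    return out

mu = 0.05  # assumed step size; plain gradient descent is sensitive to this choice
theta = np.zeros(2)
for k in range(50):
    t1 = theta - mu * grad(theta)       # first gradient step
    t2 = t1 - mu * grad(t1)             # second gradient step
    theta = aitken(theta, t1, t2)       # Aitken-accelerated update

print(theta)  # should approach [a_true, b_true]
```

Because gradient descent on this quadratic cost produces an (approximately) linearly convergent iterate sequence, the Aitken extrapolation sharply reduces the number of iterations needed, which is the effect the abstract describes for the RA-SGD algorithm.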

Updated: 2021-03-23