Towards Inversion-Free Sparse Bayesian Learning: A Universal Approach
IEEE Transactions on Signal Processing (IF 4.6) | Pub Date: 2024-10-23 | DOI: 10.1109/tsp.2024.3484908 | Yuhui Song, Zijun Gong, Yuanzhu Chen, Cheng Li
Sparse Bayesian Learning (SBL) has emerged as a powerful tool for sparse signal recovery due to its superior performance. However, practical implementations of SBL face significant computational complexity associated with matrix inversion. Despite numerous efforts to alleviate this issue, existing methods are often limited to specifically structured sparse signals. This paper aims to provide a universal inversion-free approach to accelerate existing SBL algorithms. We unify the optimization of SBL variants with different priors within the expectation-maximization (EM) framework, where a lower bound of the likelihood function is maximized. Because SBL is built on a linear Gaussian model, updating this lower bound requires maximizing a quadratic function, which involves matrix inversion. We therefore employ the minorization-maximization (MM) framework to derive two novel lower bounds that diagonalize the quadratic coefficient matrix, thereby eliminating the need for any matrix inversion. We further investigate their properties, including convergence guarantees under the MM framework and the slow convergence rate caused by reduced curvature. The proposed approach is applicable to various types of structured sparse signals, such as row-sparse, block-sparse, and burst-sparse signals. Simulations on synthetic and real data demonstrate remarkably shorter running times than state-of-the-art methods while achieving comparable recovery performance.
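For readers less familiar with where the inversion arises, the following is a minimal sketch of the standard EM update for the linear Gaussian SBL model and of a generic MM minorizer with diagonal curvature; the notation and the particular bound are assumptions used for illustration and are not the two surrogates derived in the paper. With $y = \Phi x + n$, $n \sim \mathcal{N}(0, \sigma^2 I)$, and prior $x \sim \mathcal{N}(0, \Gamma)$, $\Gamma = \operatorname{diag}(\gamma)$, the posterior moments in the E-step are

\[
\Sigma = \left(\sigma^{-2}\Phi^{\top}\Phi + \Gamma^{-1}\right)^{-1},
\qquad
\mu = \sigma^{-2}\,\Sigma\,\Phi^{\top} y,
\]

which costs $O(N^3)$ per iteration. The MM idea is to minorize the quadratic objective $f(x) = -\tfrac{1}{2}x^{\top}Ax + b^{\top}x$, with $A = \sigma^{-2}\Phi^{\top}\Phi + \Gamma^{-1}$ and $b = \sigma^{-2}\Phi^{\top}y$, by a surrogate whose curvature matrix is diagonal, for example using any $L \ge \lambda_{\max}(A)$:

\[
f(x) \;\ge\; f(x_t) + \nabla f(x_t)^{\top}(x - x_t) - \tfrac{L}{2}\,\lVert x - x_t\rVert_2^2,
\qquad
\nabla f(x_t) = b - A x_t,
\]

whose maximizer $x_{t+1} = x_t + \tfrac{1}{L}\,(b - A x_t)$ needs only matrix-vector products and no explicit inverse.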
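As a concrete, hedged illustration of the trade-off the abstract describes, the sketch below contrasts a plain EM-SBL step (explicit N x N inverse) with an inversion-free step built on the diagonal-curvature MM bound above. It is a generic toy implementation under the assumptions stated in the comments, not the authors' algorithm or code.

import numpy as np

def sbl_em_step(Phi, y, gamma, sigma2):
    # Standard EM update for a plain sparse prior: requires an N x N inverse.
    A = Phi.T @ Phi / sigma2 + np.diag(1.0 / gamma)
    Sigma = np.linalg.inv(A)                      # the O(N^3) step the paper removes
    mu = Sigma @ (Phi.T @ y) / sigma2
    gamma_new = mu**2 + np.diag(Sigma)            # gamma_i = mu_i^2 + Sigma_ii
    return mu, gamma_new

def sbl_mm_step(Phi, y, gamma, sigma2, mu, n_inner=20):
    # Inversion-free alternative (illustrative): maximize the same quadratic with a
    # diagonal-curvature MM bound, so only matrix-vector products are needed.
    L = np.linalg.norm(Phi, 2) ** 2 / sigma2 + 1.0 / gamma.min()  # L >= lambda_max(A)
    b = Phi.T @ y / sigma2
    for _ in range(n_inner):
        grad = b - Phi.T @ (Phi @ mu) / sigma2 - mu / gamma       # b - A mu
        mu = mu + grad / L                                        # MM ascent step
    var = 1.0 / (np.sum(Phi**2, axis=0) / sigma2 + 1.0 / gamma)   # crude diag(Sigma) surrogate
    gamma_new = mu**2 + var
    return mu, gamma_new

Per iteration, the MM variant replaces the O(N^3) inverse with O(MN) matrix-vector products, at the price of more (and cheaper) iterations, which matches the reduced-curvature slow-convergence caveat noted in the abstract.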