Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers
IEEE Transactions on Signal Processing (IF 4.6), Pub Date: 7-11-2024, DOI: 10.1109/tsp.2024.3424272
Zhi-Yong Wang, Hing Cheung So, Abdelhak M. Zoubir

To alleviate the bias generated by the $\ell_{1}$-norm in the low-rank tensor completion problem, nonconvex surrogates/regularizers have been suggested to replace the tensor nuclear norm, although both can achieve sparsity. However, the thresholding functions of these nonconvex regularizers may not have closed-form expressions, so iterations are needed to evaluate them, which entails a high computational load. To address this issue, we devise a framework to generate sparsity-inducing regularizers with closed-form thresholding functions. These regularizers are applied to low-tubal-rank tensor completion, and efficient algorithms based on the alternating direction method of multipliers (ADMM) are developed. Furthermore, the convergence of our methods is analyzed, and it is proved that the generated sequences are bounded and converge to a stationary point. Experimental results on synthetic and real-world datasets show that the proposed algorithms outperform state-of-the-art methods in terms of restoration performance.
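To illustrate what a closed-form thresholding function is and where the $\ell_{1}$ bias comes from, the minimal NumPy sketch below contrasts soft thresholding, the proximal operator of the $\ell_{1}$ penalty, which shrinks every entry by the threshold and therefore biases large coefficients, with firm thresholding, the closed-form proximal operator of the minimax concave penalty (MCP), a well-known nonconvex regularizer that leaves large entries untouched. The paper's proposed regularizers are not reproduced here; these are standard textbook examples used only for illustration.

import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam*|x| (l1 penalty): shrinks every entry by lam,
    which is the bias the paper seeks to avoid on large coefficients."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def firm_threshold(x, lam, gamma=3.0):
    """Closed-form proximal operator of the minimax concave penalty (MCP),
    a standard nonconvex regularizer (NOT the paper's proposed one).
    Small entries are shrunk; large entries (|x| > gamma*lam) pass unchanged,
    so the bias on large coefficients is removed."""
    return np.where(
        np.abs(x) <= gamma * lam,
        gamma / (gamma - 1.0) * soft_threshold(x, lam),
        x,
    )

# Toy comparison on a vector with one large and one small entry.
x = np.array([5.0, 0.8, -0.3, 0.0])
print(soft_threshold(x, 0.5))   # large entry biased: 4.5
print(firm_threshold(x, 0.5))   # large entry kept: 5.0

The key point is that both operators are evaluated in a single step, whereas a nonconvex regularizer without a closed-form proximal map would require an inner iterative solver at every thresholding step.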

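The abstract also mentions ADMM-based algorithms for low-tubal-rank tensor completion. A standard building block in that setting is tensor singular value thresholding under the t-SVD: the tensor is transformed by an FFT along the third mode, the singular values of each frontal slice are thresholded in the Fourier domain, and the transform is inverted. The sketch below shows this generic operation with a pluggable scalar thresholding function; the name tensor_svt, the toy data, and the chosen threshold are assumptions for illustration and do not reproduce the authors' update rules.

import numpy as np

def tensor_svt(X, thresh_fn):
    """Generic tensor singular value thresholding for a 3-way array X:
    FFT along the third mode, threshold the singular values of every
    frontal slice in the Fourier domain with `thresh_fn` (e.g. the
    soft_threshold or firm_threshold sketched above), then invert the FFT.
    A standard building block of low-tubal-rank methods, not the paper's
    exact update rule."""
    Xf = np.fft.fft(X, axis=2)
    Yf = np.empty_like(Xf)
    for k in range(X.shape[2]):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        Yf[:, :, k] = (U * thresh_fn(s)) @ Vh
    return np.real(np.fft.ifft(Yf, axis=2))

# Hypothetical usage for one low-rank update inside an ADMM iteration:
rng = np.random.default_rng(0)
T = rng.standard_normal((10, 10, 5))
L = tensor_svt(T, lambda s: np.maximum(s - 0.5, 0.0))  # soft threshold on singular values

In an ADMM scheme for tensor completion, such a thresholding step would typically alternate with re-imposing the observed entries and updating the dual variables.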
Updated: 2024-08-19