Entropy-based convergence rates of greedy algorithms
Mathematical Models and Methods in Applied Sciences (IF 3.6). Pub Date: 2024-02-16. DOI: 10.1142/s0218202524500143
Yuwen Li, Jonathan W. Siegel

We present convergence estimates of two types of greedy algorithms in terms of the entropy numbers of underlying compact sets. In the first part, we measure the error of a standard greedy reduced basis method for parametric PDEs by the entropy numbers of the solution manifold in Banach spaces. This contrasts with the classical analysis based on the Kolmogorov n-widths and enables us to obtain direct comparisons between the algorithm error and the entropy numbers, where the multiplicative constants are explicit and simple. The entropy-based convergence estimate is sharp and improves upon the classical width-based analysis of reduced basis methods for elliptic model problems. In the second part, we derive a novel and simple convergence analysis of the classical orthogonal greedy algorithm for nonlinear dictionary approximation using the entropy numbers of the symmetric convex hull of the dictionary. This also improves upon existing results by giving a direct comparison between the algorithm error and the entropy numbers.
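
For context, in the usual dyadic convention the n-th entropy number of a compact set K in a Banach space is the smallest radius ε such that K can be covered by 2^n balls of radius ε; the results above bound the greedy errors directly by these quantities. The listing below is not taken from the paper: it is a minimal NumPy sketch of the classical orthogonal greedy algorithm analyzed in the second part, in which each step selects the dictionary element most correlated with the current residual and then re-projects the target onto the span of all selected elements. The finite-dimensional dictionary, target, and step count are arbitrary illustrative choices.

import numpy as np

def orthogonal_greedy(f, dictionary, n_steps):
    # Approximate the vector f by a sparse combination of the columns of
    # `dictionary` (an m-by-N array with unit-norm columns) using the
    # orthogonal greedy algorithm (orthogonal matching pursuit).
    residual = f.copy()
    selected = []
    approx = np.zeros_like(f)
    for _ in range(n_steps):
        # Greedy step: pick the column most correlated with the residual.
        scores = np.abs(dictionary.T @ residual)
        scores[selected] = -np.inf  # never reselect a column
        selected.append(int(np.argmax(scores)))
        # Orthogonal projection of f onto the span of the selected columns.
        basis = dictionary[:, selected]
        coeffs, *_ = np.linalg.lstsq(basis, f, rcond=None)
        approx = basis @ coeffs
        residual = f - approx
    return selected, approx

# Toy usage: recover a 3-sparse target from a random normalized dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 200))
D /= np.linalg.norm(D, axis=0)
target = D[:, [3, 17, 42]] @ np.array([1.0, -2.0, 0.5])
indices, approx = orthogonal_greedy(target, D, n_steps=5)
print(indices, np.linalg.norm(target - approx))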




Updated: 2024-02-16