
Profile

Short bio
- Tenure-track Associate Professor, Institute of Natural Science and School of Mathematical Sciences, Shanghai Jiao Tong University.
- Lecturer in Mathematical Data Science (2020.11-2021.06), School of Mathematical Sciences, Queen Mary University of London.
- Postdoctoral research associate (2017.04-2020.08), Department of Applied Mathematics and Theoretical Physics, University of Cambridge.
- Ph.D. in Mathematics (2017.01), GREYC, ENSICAEN and University of Caen Normandy.

Preprints
- A Fast and Adaptive SVD-free Algorithm for General Weighted Low-rank Recovery. A. Dutta, J. Liang and X. Li.
- Geometry of First-order Methods and Adaptive Acceleration. C. Poon and J. Liang.

Conference proceedings
- Tuning-free Plug-and-Play Proximal Algorithm for Inverse Imaging Problems. K. Wei, A. Aviles-Rivero, J. Liang, Y. Fu, C. Schönlieb and H. Huang, ICML, 2020. (Outstanding paper award, code)
- Trajectory of Alternating Direction Method of Multipliers and Adaptive Acceleration. C. Poon and J. Liang, NeurIPS, 2019. (Oral, code)
- Faster FISTA. J. Liang and C. Schönlieb, EUSIPCO, 2018. (code)
- Local Convergence Properties of SAGA/Prox-SVRG and Acceleration. C. Poon, J. Liang and C. Schönlieb, ICML, 2018. (code)
- A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization. J. Liang, J. Fadili and G. Peyré, NeurIPS, 2016.
- Activity Identification and Local Linear Convergence of Douglas-Rachford/ADMM under Partial Smoothness. J. Liang, J. Fadili, G. Peyré and R. Luke, SSVM, 2015. (Oral)
- Local Linear Convergence of Forward-Backward under Partial Smoothness. J. Liang, J. Fadili and G. Peyré, NeurIPS, 2014.
- On the Convergence Rates of Proximal Splitting Algorithms. J. Liang, J. Fadili and G. Peyré, ICIP, 2014. (Top 10% Papers)

PhD thesis
Title: Convergence Rates of First-Order Splitting Methods.

Last updated 4/21/2023.
Description
This is a short course introducing first-order optimization methods, including proximal gradient descent, primal-dual splitting methods, the alternating direction method of multipliers (ADMM) and the Douglas-Rachford splitting method. Some well-known accelerated algorithms are also covered.

References
- R. T. Rockafellar. Convex Analysis. Princeton University Press, 2015.
- A. Beck. First-Order Methods in Optimization. Vol. 25. SIAM, 2017.
- H. H. Bauschke and P. L. Combettes. Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Vol. 408. New York: Springer, 2011.
- B. Polyak. Introduction to Optimization. Optimization Software, 1987.
- Y. Nesterov. Introductory Lectures on Convex Optimization: A Basic Course. Vol. 87. Springer Science & Business Media, 2013.
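To make the course topics concrete, the following is a minimal sketch of one of the methods listed above: the proximal gradient (forward-backward) method with Nesterov-type acceleration (FISTA), applied to the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. This is an illustrative toy implementation, not material from the course itself; the function names and problem sizes are chosen for the example.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista(A, b, lam, n_iter=1000):
    """FISTA for the lasso: min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                       # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)  # backward (proximal) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # inertial extrapolation
        x, t = x_new, t_new
    return x
```

Dropping the extrapolation step (setting y = x_new) recovers plain proximal gradient descent (ISTA); the accelerated variant improves the objective convergence rate from O(1/k) to O(1/k^2).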

Recent Papers


- Screening for Sparse Online Learning. J. Liang and C. Poon, Journal of Computational and Graphical Statistics, 2022.
- Improving "Fast Iterative Shrinkage-Thresholding Algorithm": Faster, Smarter and Greedier. J. Liang, T. Luo and C. Schönlieb, SIAM Journal on Scientific Computing, 44(3):A1069-A1091, 2022. (code)
- Partial Smoothness and Constant Rank. A. Lewis, J. Liang and T. Tian, SIAM Journal on Optimization, 32(1):276-291, 2022.
- TFPnP: Tuning-free Plug-and-Play Proximal Algorithms with Applications to Inverse Imaging Problems. K. Wei, A. Aviles-Rivero, J. Liang, Y. Fu, H. Huang and C. Schönlieb, Journal of Machine Learning Research, 23:1-48, 2022.
- On the Bias-Variance Tradeoff in Stochastic Gradient Methods. D. Driggs, J. Liang and C. Schönlieb, Journal of Machine Learning Research, 23:1-43, 2022.
- The Fun is Finite: Douglas-Rachford and Sudoku Puzzle – Finite Termination and Local Linear Convergence. R. Tovey and J. Liang, Journal of Applied and Numerical Optimization, 3(3):435-456, 2021.
- SPRING: A Fast Stochastic Proximal Alternating Method for Non-smooth Non-convex Optimization. D. Driggs, J. Tang, J. Liang, M. Davies and C. Schönlieb, SIAM Journal on Imaging Sciences, 14(4):1932-1970, 2021.
- A Stochastic Alternating Direction Method of Multipliers for Non-smooth and Non-convex Optimization. F. Bian, J. Liang and X. Zhang, Inverse Problems, 37(7), 2021. (DOI: https://doi.org/10.1088/1361-6420/ac0966)
- Best Pair Formulation & Accelerated Scheme for Non-convex Principal Component Pursuit. A. Dutta, F. Hanzely, J. Liang and P. Richtárik, IEEE Transactions on Signal Processing, 68:6128-6141, 2020.
- Convergence Rates of Forward-Douglas-Rachford Splitting Method. C. Molinari, J. Liang and J. Fadili, Journal of Optimization Theory and Applications, 182(2):606-639, 2019. (code)
- Local Linear Convergence Analysis of Primal-Dual Splitting Methods. J. Liang, J. Fadili and G. Peyré, Optimization, 67(6):821-853, 2018.
- Activity Identification and Local Linear Convergence of Forward-Backward-type Methods. J. Liang, J. Fadili and G. Peyré, SIAM Journal on Optimization, 27(1):408-437, 2017.
- Local Convergence Properties of Douglas-Rachford and Alternating Direction Method of Multipliers. J. Liang, J. Fadili and G. Peyré, Journal of Optimization Theory and Applications, 172(3):874-913, 2017. (More details on ADMM can be found on arXiv)
- Convergence Rates with Inexact Nonexpansive Operators. J. Liang, J. Fadili and G. Peyré, Mathematical Programming, 159(1):403-434, 2016.
- Retinex by Higher Order Total Variation L^1 Decomposition. J. Liang and X. Zhang, Journal of Mathematical Imaging and Vision, 52(3):345-355, 2015.
- Seismic Data Restoration via Data-driven Framelet. J. Liang, J. Ma and X. Zhang, Geophysics, 79(3):65-74, 2014.
- Wavelet Frame based Color Image Demosaicing. J. Liang, J. Li, Z. Shen and X. Zhang, Inverse Problems and Imaging, 7(3):777-794, 2013.
