Rujun Jiang (江如俊)

Biography

Associate Professor and doctoral supervisor. He joined Fudan University in 2017. He received his Ph.D. from The Chinese University of Hong Kong in 2016 and remained there for one year of postdoctoral research. His work has appeared in top international journals and conferences including Math. Program., SIAM J. Optim., Math. Oper. Res., INFORMS J. Comput., and ICML. He has been supported by the Shanghai Sailing Program and a national-level young talent program, and has served as principal investigator of a Young Scientists Fund project and a General Program project of the National Natural Science Foundation of China. He received an ICML 2022 Outstanding Paper Award.

Research Areas

Quadratic programming, nonconvex optimization theory and algorithms, and their applications in machine learning, operations research, financial engineering, and signal processing.
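Several of the publications below concern the (generalized) trust region subproblem. For orientation only, here is the standard textbook formulation of that problem; this is an illustrative sketch, not a statement taken from the profile itself:

\[
\min_{x \in \mathbb{R}^n} \ \tfrac{1}{2} x^{\top} A x + b^{\top} x
\quad \text{s.t.} \quad \|x\| \le \Delta,
\]

where \(A\) is symmetric and possibly indefinite and \(\Delta > 0\) is the trust region radius. The generalized trust region subproblem replaces the ball constraint by a general (possibly nonconvex) quadratic constraint \( \tfrac{1}{2} x^{\top} B x + c^{\top} x + d \le 0 \).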

Recent Publications


Journal Articles

Wutao Si, P.-A. Absil, Wen Huang, Rujun Jiang and Simon Vary. A Riemannian Proximal Newton Method. SIAM Journal on Optimization 34.1 (2024): 654-681.
Rujun Jiang and Duan Li. Exactness Conditions for Semidefinite Programming Relaxations of Generalization of the Extended Trust Region Subproblem. Mathematics of Operations Research 48.3 (2023): 1235-1253. https://doi.org/10.1287/moor.2022.1305
Rujun Jiang, Zhizhuo Zhou and Zirui Zhou. Cubic Regularization Methods with Second-Order Complexity Guarantee Based on a New Subproblem Reformulation. Journal of the Operations Research Society of China 10 (2022): 471-506. https://doi.org/10.1007/s40305-022-00398-5
Rujun Jiang and Xudong Li. Hölderian error bounds and Kurdyka-Łojasiewicz inequality for the trust region subproblem. Mathematics of Operations Research 47.4 (2022): 3025-3050. https://doi.org/10.1287/moor.2021.1243 (Note: in the first column of Table 1 in the published version, \perp should be \not\perp.)
Rujun Jiang, Man-Chung Yue and Zhishuo Zhou. An accelerated first-order method with complexity analysis for solving cubic regularization subproblems. Computational Optimization and Applications 79.2 (2021): 471-506. https://doi.org/10.1007/s10589-021-00274-7
Hezhi Luo, Xiaodong Ding, Jiming Peng, Rujun Jiang and Duan Li. Complexity Results and Effective Algorithms for Worst-case Linear Optimization under Uncertainties. INFORMS Journal on Computing 33.1 (2020): 180-197. https://doi.org/10.1287/ijoc.2019.0941 (appendix, data and code available)
Rujun Jiang and Duan Li. A Linear-Time Algorithm for Generalized Trust Region Subproblems. SIAM Journal on Optimization 30.1 (2020): 915-932.
Rujun Jiang and Duan Li. Second order cone constrained convex relaxations for nonconvex quadratically constrained quadratic programming. Journal of Global Optimization 75.2 (2019): 461-494. https://doi.org/10.1007/s10898-019-00793-y
Rujun Jiang and Duan Li. Novel reformulations and efficient algorithms for the generalized trust region subproblem. SIAM Journal on Optimization 29.2 (2019): 1603-1633.
Baiyi Wu, Duan Li and Rujun Jiang. Quadratic Convex Reformulation for Quadratic Programming with Linear On-Off Constraints. European Journal of Operational Research 274.3 (2019): 824-836.
Xueting Cui, Xiaoling Sun, Shushang Zhu, Rujun Jiang and Duan Li. Portfolio Optimization with Nonparametric Value at Risk: A Block Coordinate Descent Method. INFORMS Journal on Computing 30.3 (2018): 454-471. (appendix available)
Rujun Jiang, Duan Li and Baiyi Wu. SOCP reformulation for the generalized trust region subproblem via a canonical form of two symmetric matrices. Mathematical Programming 169.2 (2018): 531-563.
Rujun Jiang and Duan Li. Simultaneous diagonalization of matrices and its applications in quadratically constrained quadratic programming. SIAM Journal on Optimization 26.3 (2016): 1649-1668.

Conference Articles

Jiulin Wang, Xu Shi and Rujun Jiang. Near-Optimal Convex Simple Bilevel Optimization with a Bisection Method. To appear in Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024), 2024.
He Chen, Haochen Xu, Rujun Jiang and Anthony Man-Cho So. Lower-Level Duality Based Reformulation and Majorization Minimization Algorithm for Hyperparameter Optimization. To appear in Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024), 2024.
Rufeng Xiao, Yuze Ge, Rujun Jiang and Yifan Yan. Unified Framework for Rank-based Loss Minimization. Advances in Neural Information Processing Systems 36: Proceedings of the 2023 Conference (NeurIPS 2023), 2023.
Jiali Wang, Wen Huang, Rujun Jiang, Xudong Li and Alex L. Wang. Solving Stackelberg Prediction Game with Least Squares Loss via Spherically Constrained Least Squares Reformulation. International Conference on Machine Learning (ICML), 2022. (Outstanding Paper Award; code available)
Jiali Wang, He Chen, Rujun Jiang, Xudong Li and Zihao Li. Fast Algorithms for Stackelberg Prediction Game with Least Squares Loss. International Conference on Machine Learning (ICML), 2021. (code available)
Rujun Jiang, Huikang Liu and Anthony Man-Cho So. LPA-SD: An Efficient First-Order Method for Single-Group Multicast Beamforming. Proceedings of the 19th IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC 2018), 2018.
Rujun Jiang and Duan Li. On Conic Relaxations of Generalization of the Extended Trust Region Subproblem. In World Congress on Global Optimization, pp. 145-154. Springer, Cham, 2019.
Rujun Jiang and Duan Li. Semidefinite Programming Based Convex Relaxation for Nonconvex Quadratically Constrained Quadratic Programming. In World Congress on Global Optimization, pp. 213-220. Springer, Cham, 2019.
