Research overview

My research began with projection methods for solving convex feasibility problems, particularly in the context of CT imaging. This broadened into the study of operators in greater generality and of operator splitting methods (e.g., ADMM, forward-backward splitting), which cover a wide class of continuous optimization problems. The latest shift in focus is the subject of my thesis research: how to fuse the advantages of machine learning machinery with the theoretical guarantees afforded by operator-based methods. In the context of deep learning, there are many exciting and ongoing developments connecting these two fields. Several works are in progress and will be released this spring.
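As an illustrative sketch (not drawn from any particular paper above), forward-backward splitting applied to the LASSO problem min_x ½‖Ax − b‖² + λ‖x‖₁ alternates a gradient step on the smooth term with a proximal (soft-thresholding) step on the nonsmooth term. The matrix A, vector b, and parameter choices below are hypothetical, chosen only to show the iteration:

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (component-wise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def forward_backward(A, b, lam, iters=500):
    # Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                         # forward (gradient) step on smooth term
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step on ||.||_1
    return x

# Hypothetical small example: recover a sparse signal from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -2.0]
b = A @ x_true
x_hat = forward_backward(A, b, lam=0.1)
```

With a step size bounded by the reciprocal of the gradient's Lipschitz constant, the iterates converge to a minimizer; the recovered `x_hat` is sparse and close to `x_true` up to the shrinkage bias introduced by λ.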


Key words: Convex Optimization, Deep Learning, Plug-and-Play (PnP), Learning to Optimize (L2O), Fixed Point Network (FPN), Implicit Depth Learning, Deep Unrolling, Operator Splitting, Convex Feasibility Problems

Selected Publications

S. Wu Fung, H. Heaton, Q. Li, D. McKenzie, S. Osher, W. Yin. Fixed Point Networks. 

arXiv preprint: 2103.12803, 2021.

T. Chen, X. Chen, W. Chen, Z. Wang, H. Heaton, J. Liu, W. Yin. Learning to Optimize: A Primer and A Benchmark.

arXiv preprint: 2103.12828, 2021.

J. Shen, X. Chen, H. Heaton, T. Chen, J. Liu, W. Yin, Z. Wang. Learning A Minimax Optimizer: A Pilot Study.

ICLR, 2021.

H. Heaton, S. Wu Fung, A.T. Lin, S. Osher, W. Yin. Projecting to Manifolds via Unsupervised Learning.

arXiv preprint: 2008.02200, 2020.

H. Heaton, X. Chen, Z. Wang, W. Yin. Safeguarded Learned Convex Optimization.

arXiv preprint: 2003.01880, 2020.

H. Heaton, Y. Censor. Asynchronous sequential inertial iterations for common fixed points problems with an application to linear systems.

Journal of Global Optimization, 2019.

Y. Censor, H. Heaton, R. Schulte. Derivative-free superiorization with component-wise perturbations.

Numerical Algorithms, 2018.

©2021 by Howard Heaton