A number of interconnected discrete optimization problems, such as optimal point labeling and subset selection, lie at the foundations of machine learning. As learning systems grow more sophisticated, increasingly complex variants of these problems must be solved. We address this need with a study of the discrete optimization foundations of machine learning, with an emphasis on general-purpose approximation algorithms for high-dimensional discrete optimization.
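As a concrete illustration of the subset-selection problems mentioned above, here is a minimal sketch of the classic greedy heuristic for maximizing a monotone set function under a cardinality constraint; for submodular objectives it is known to give a (1 - 1/e)-approximation. All names and the toy coverage objective are illustrative, not code from any of the publications below.

```python
def greedy_max(f, ground_set, k):
    """Repeatedly add the element with the largest marginal gain
    until k elements are chosen. For monotone submodular f this
    yields a (1 - 1/e)-approximation to the optimal k-subset."""
    S = set()
    for _ in range(k):
        best = max((e for e in ground_set if e not in S),
                   key=lambda e: f(S | {e}) - f(S))
        S.add(best)
    return S

# Toy example: a coverage objective (monotone and submodular).
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"d"}}
f = lambda S: len(set().union(*[sets[i] for i in S]))
print(greedy_max(f, sets.keys(), 2))  # → {1, 2}, covering {"a", "b", "c"}
```

The same greedy template is the usual starting point for the non-submodular settings studied in the publications below, where the approximation guarantee degrades gracefully with a measure of how far the objective is from submodular.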
Objectives:
- Develop and refine the underlying optimization theory that drives learning systems, especially in discrete contexts.
Selected Publications:
- Minh N. Vu, Truc Nguyen, and My T. Thai. “NeuCEPT: Learn Neural Networks’ Mechanism via Critical Neurons with Precision Guarantee,” in Proceedings of ICDM, 2022.
- “An Approximately Optimal Bot for Non-Submodular Social Reconnaissance,” in Proceedings of HyperText, 2018.
- “Fast Maximization of Non-Submodular, Monotonic Functions on the Integer Lattice,” in Proceedings of ICML, 2018.