About Me

I am a machine learning researcher currently working as an AI Resident at Google. My research broadly focuses on ensuring that optimization methods for machine learning are efficient and robust.

I received my PhD in applied mathematics from the University of Wisconsin-Madison under the guidance of Nigel Boston. I was also a postdoctoral researcher at UW-Madison with Dimitris Papailiopoulos.

In my free time I often bake. You can find recipes I am fond of in my dissertation (no, really).
My Google Scholar page can be found here. Below is a list of my preprints and publications, organized by subject.

Machine Learning and Optimization

Adaptive Federated Optimization. Sashank Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konečný, Sanjiv Kumar, H. Brendan McMahan. [arXiv]

Advances and Open Problems in Federated Learning. Peter Kairouz, H. Brendan McMahan, et al. [arXiv]

Convergence and Margin of Adversarial Training on Separable Data. Zachary Charles, Shashank Rajput, Stephen Wright, Dimitris Papailiopoulos. [arXiv]

DETOX: A Redundancy-based Framework for Faster and More Robust Gradient Aggregation. Shashank Rajput, Hongyi Wang, Zachary Charles, Dimitris Papailiopoulos. [arXiv]

Does Data Augmentation Lead to Positive Margin? Shashank Rajput, Zhili Feng, Zachary Charles, Po-Ling Loh, Dimitris Papailiopoulos. ICML, 2019. [link] [arXiv]

A Geometric Perspective on the Transferability of Adversarial Directions. Zachary Charles, Harrison Rosenberg, Dimitris Papailiopoulos. AISTATS, 2019. [link] [arXiv]

ErasureHead: Distributed Gradient Descent without Delays Using Approximate Gradient Codes. Hongyi Wang, Zachary Charles, Dimitris Papailiopoulos. [arXiv]

ATOMO: Communication-efficient Learning via Atomic Sparsification. Hongyi Wang, Scott Sievert, Zachary Charles, Shengchao Liu, Stephen Wright, Dimitris Papailiopoulos. NeurIPS, 2018. [link] [arXiv]

Stability and Generalization of Learning Algorithms that Converge to Global Optima. Zachary Charles, Dimitris Papailiopoulos. ICML, 2018. [link] [arXiv] [slides]

Approximate Gradient Coding via Sparse Random Graphs. Zachary Charles, Dimitris Papailiopoulos, Jordan Ellenberg. [arXiv]

DRACO: Robust Distributed Training via Redundant Gradients. Lingjiao Chen, Hongyi Wang, Zachary Charles, Dimitris Papailiopoulos. ICML, 2018. [link] [arXiv]

Gradient Coding Using the Stochastic Block Model. Zachary Charles, Dimitris Papailiopoulos. ISIT, 2018. [link] [arXiv] [slides]

Subspace Clustering with Missing and Corrupted Data. Zachary Charles, Amin Jalali, Rebecca Willett. IEEE Data Science Workshop, 2018. [link] [arXiv] [slides]

Applied and Computational Mathematics

Exploiting Algebraic Structure in Global Optimization and the Belgian Chocolate Problem. Zachary Charles, Nigel Boston. Journal of Global Optimization, 2018. [link] [arXiv]

Generating Random Factored Ideals in Number Fields. Zachary Charles. Mathematics of Computation, 2018. [link] [arXiv]

Distributions of the Number of Solutions to the Network Power Flow Equations. Alisha Zachariah, Zachary Charles, Nigel Boston, Bernard Lesieutre. ISCAS, 2018. [link]

Efficiently Finding All Power Flow Solutions to Tree Networks. Alisha Zachariah, Zachary Charles. Allerton, 2017. [link]

Nonpositive Eigenvalues of Hollow, Symmetric, Nonnegative Matrices. Zachary Charles, Miriam Farber, Charles R Johnson, Lee Kennedy-Shaffer. SIAM Journal on Matrix Analysis and Applications, 2013. [link]

Nonpositive Eigenvalues of the Adjacency Matrix and Lower Bounds for Laplacian Eigenvalues. Zachary Charles, Miriam Farber, Charles R Johnson, Lee Kennedy-Shaffer. Discrete Mathematics, 2013. [link]

The Relation Between the Diagonal Entries and the Eigenvalues of a Symmetric Matrix, Based upon the Sign Pattern of its Off-Diagonal Entries. Zachary Charles, Miriam Farber, Charles R Johnson, Lee Kennedy-Shaffer. Linear Algebra and its Applications, 2013. [link]

Thesis

Algebraic and Geometric Structure in Machine Learning and Optimization Algorithms. Zachary Charles. University of Wisconsin-Madison PhD thesis, 2017. [link]