Zeroth-Order Regularized Optimization (ZORO): Approximately sparse gradients and adaptive sampling.

Preprint (under review), 2021

Joint work with HanQin Cai, Wotao Yin, and Zhenliang Zhang.

We study high-dimensional zeroth-order optimization. We propose a novel compressible-gradients assumption and an algorithm, ZORO, that exploits it. We prove that for functions with compressible gradients, ZORO circumvents known lower bounds on the query complexity of zeroth-order optimization. We also show that gradient compressibility is surprisingly common in real-world problems.
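To illustrate the core idea, here is a minimal sketch (not the paper's exact algorithm) of how a compressible gradient can be estimated from few function queries: take random Rademacher finite differences, then recover the gradient by sparse regression, here via iterative hard thresholding. The function names, the choice of `m`, and the use of IHT rather than the paper's solver are illustrative assumptions.

```python
import numpy as np

def sparse_grad_estimate(f, x, s, m, delta=1e-4, iters=30, seed=None):
    """Estimate an (approximately) s-sparse gradient of f at x
    from m finite-difference queries along random sign directions.
    This is an illustrative sketch, not ZORO itself."""
    rng = np.random.default_rng(seed)
    d = x.size
    # Rademacher sensing directions, scaled so Z^T Z ~ I on sparse vectors
    Z = rng.choice([-1.0, 1.0], size=(m, d)) / np.sqrt(m)
    fx = f(x)
    # Each measurement y_i ~ z_i^T grad f(x)
    y = np.array([(f(x + delta * z) - fx) / delta for z in Z])
    # Iterative hard thresholding: gradient step on ||y - Z g||^2,
    # then keep only the s largest-magnitude entries
    g = np.zeros(d)
    for _ in range(iters):
        g = g + Z.T @ (y - Z @ g)
        g[np.argsort(np.abs(g))[:-s]] = 0.0
    return g
```

The point of the sketch: with m on the order of s log d (far fewer than d) queries, the sparse gradient is recoverable, which is how compressibility lets a method beat the generic query-complexity lower bounds.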

arXiv version