Asynchronous stochastic gradient descent with unbounded delay on nonconvex problems

Event
Tuesday, February 13, 2018 - 3:30pm to 4:30pm

Speaker: Xin Zhang

Abstract: Parallel and distributed implementations of stochastic gradient descent, especially asynchronous schemes, have been widely applied to nonconvex optimization problems. Several works have analyzed the asymptotic convergence rate of asynchronous gradient descent methods, but these analyses are restricted to bounded delays. In our work, we focus on asynchronous stochastic gradient descent methods with unbounded delay on nonconvex optimization problems. We analyze regular Asynchronous SGD and Asynchronous SGD with incremental batch size, a variance-reduction scheme. We prove an asymptotic o(1/√k) convergence rate for regular Asynchronous SGD and o(1/k) for Asynchronous SGD with incremental batch size. We also give a sufficient condition on the unbounded delay under which these convergence rates are guaranteed.
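To make the setting concrete, below is a minimal single-process Python sketch that simulates the two updates described in the abstract: an SGD step applied to a stale iterate whose delay is drawn from an unbounded (geometric) distribution, and a variant whose batch size grows with the iteration count to reduce gradient variance. The toy double-well objective, the geometric delay distribution, and the step-size schedule are illustrative assumptions, not details taken from the talk.

    import numpy as np

    # Simulation of asynchronous SGD with (possibly unbounded) gradient delay.
    # Toy nonconvex objective: f(x) = x^4/4 - x^2/2 (a double well).
    # The geometric delay and the 1/sqrt(k) step size are illustrative choices.

    rng = np.random.default_rng(0)

    def stochastic_grad(x, batch_size):
        """Gradient of f(x) = x**4/4 - x**2/2 plus zero-mean noise.
        Averaging over batch_size samples shrinks the noise variance
        by a factor of 1/batch_size (the variance-reduction effect)."""
        noise = rng.normal(0.0, 1.0, size=batch_size).mean()
        return x**3 - x + noise

    def async_sgd(steps=5000, eta0=0.05, incremental_batch=False):
        x = 2.0
        history = [x]  # past iterates; a worker may have read a stale one
        for k in range(1, steps + 1):
            tau = rng.geometric(p=0.3) - 1        # unbounded delay, finite mean
            stale = history[max(0, len(history) - 1 - tau)]
            batch = k if incremental_batch else 1  # growing batch per iteration
            eta = eta0 / np.sqrt(k)                # diminishing step size
            x = x - eta * stochastic_grad(stale, batch)
            history.append(x)
        return x

    print("regular Asynchronous SGD:            x ~", async_sgd())
    print("Asynchronous SGD, incremental batch: x ~", async_sgd(incremental_batch=True))

The key point the sketch illustrates is that the gradient is evaluated at history[...-1-tau] rather than the current iterate, and that the delay tau has no fixed upper bound; the incremental-batch variant trades extra gradient evaluations per step for lower variance, matching the faster o(1/k) rate claimed in the abstract.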
