A flexible block-coordinate forward-backward algorithm for non-smooth and non-convex optimization
Abstract
Block coordinate descent (BCD) methods are prevalent in large-scale optimization due to their low memory and computational cost per iteration, their amenability to parallelization, and their ability to exploit problem structure. The theoretical and practical performance of BCD relies heavily on the rules governing the choice of the blocks to be updated at each iteration. We propose a new deterministic BCD framework that allows for very flexible updates while retaining state-of-the-art convergence guarantees for non-smooth and non-convex optimization problems. Beyond encompassing several update rules from the literature, the framework allows updates of particular blocks to be prioritized and block selections to be correlated across iterations, neither of which is permitted under the classical convergent stochastic framework. We leverage this flexibility in the context of multilevel optimization algorithms and, in particular, for multilevel image restoration problems, where the efficiency of the approach is illustrated.
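To make the setting concrete, the sketch below illustrates a generic block-coordinate forward-backward (proximal gradient) iteration for minimizing a smooth term plus a block-separable non-smooth term. It is a minimal sketch under stated assumptions, not the paper's exact construction: the names `grad_f_block`, `prox`, and `select_block` are illustrative placeholders, and `select_block` is where a flexible deterministic update rule of the kind studied here would plug in.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximity operator of tau * ||.||_1 (an example non-smooth term)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bcd_forward_backward(x0, blocks, grad_f_block, prox, steps,
                         select_block, n_iter=100):
    """
    x0           : initial point (1-D array)
    blocks       : list of index arrays partitioning the coordinates
    grad_f_block : grad_f_block(x, i) -> gradient of the smooth term w.r.t. block i
    prox         : prox(v, i, gamma) -> proximity operator of gamma * g_i at v
    steps        : per-block step sizes (e.g., inverse block Lipschitz constants)
    select_block : deterministic rule mapping (iteration, iterate) to a block index
    """
    x = x0.copy()
    for k in range(n_iter):
        i = select_block(k, x)                 # flexible deterministic block choice
        idx = blocks[i]
        g = grad_f_block(x, i)                 # forward (gradient) step on block i
        x[idx] = prox(x[idx] - steps[i] * g, i, steps[i])  # backward (prox) step
    return x

# Usage example: least squares + l1 penalty with cyclic block selection.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 40))
    b = rng.standard_normal(60)
    blocks = [np.arange(j, j + 10) for j in range(0, 40, 10)]
    L = [np.linalg.norm(A[:, idx], 2) ** 2 for idx in blocks]  # block Lipschitz constants
    lam = 0.1

    def grad_f_block(x, i):
        # Block gradient of f(x) = 0.5 * ||Ax - b||^2
        return A[:, blocks[i]].T @ (A @ x - b)

    def prox(v, i, gamma):
        # Prox of gamma * lam * ||.||_1 on block i
        return soft_threshold(v, gamma * lam)

    x = bcd_forward_backward(np.zeros(40), blocks, grad_f_block, prox,
                             steps=[1.0 / l for l in L],
                             select_block=lambda k, x: k % len(blocks))
```

Replacing the cyclic rule in `select_block` with a rule that revisits some blocks more often, or that depends on the current iterate, gives the kind of prioritized and correlated block selection the abstract refers to.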