On the Relationship Between Conjugate Gradient and Optimal First-Order Methods for Convex Optimization

Author: Sahar Karimi
Release: 2013
ISBN-10: OCLC:875920207

Book Synopsis of "On the Relationship Between Conjugate Gradient and Optimal First-Order Methods for Convex Optimization" by Sahar Karimi

Book excerpt: In a series of works initiated by Nemirovsky and Yudin, and later extended by Nesterov, first-order algorithms for unconstrained minimization with an optimal theoretical complexity bound have been proposed. On the other hand, conjugate gradient algorithms, among the most widely used first-order techniques, lack a finite complexity bound; in fact, their performance can be quite poor. This dissertation is partially devoted to tightening the gap between these two classes of algorithms: traditional conjugate gradient methods and optimal first-order techniques. We derive conditions under which conjugate gradient methods attain the same complexity bound as Nemirovsky-Yudin's and Nesterov's methods. Moreover, we propose a conjugate gradient-type algorithm, CGSO (Conjugate Gradient with Subspace Optimization), which achieves the optimal complexity bound at the cost of a little extra computation. We extend the theory of CGSO to convex problems with linear constraints. In particular, we focus on the $l_1$-regularized least-squares problem, often referred to as the Basis Pursuit Denoising (BPDN) problem in the optimization community. BPDN arises in many practical fields, including sparse signal recovery, machine learning, and statistics. Solving BPDN is fairly challenging because the signals involved can be quite large; first-order methods are therefore of particular interest for these problems. We propose a quasi-Newton proximal method for solving BPDN. Our numerical results suggest that our technique is computationally effective and competes favourably with other state-of-the-art solvers.
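The BPDN problem mentioned in the excerpt, $\min_x \tfrac{1}{2}\|Ax - b\|^2 + \lambda\|x\|_1$, is commonly attacked with proximal first-order methods. As a minimal illustration of that problem class (a plain proximal-gradient/ISTA sketch, not the quasi-Newton proximal method the thesis proposes; the problem sizes, step rule, and regularization value below are illustrative assumptions):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Fixed step 1/L, where L = largest eigenvalue of A^T A is the
    # Lipschitz constant of the smooth term's gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)               # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny sparse-recovery example: recover a 2-sparse signal from
# noiseless random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

ISTA only attains the slower O(1/k) rate; the accelerated variants (FISTA, Nesterov's methods) reach the optimal first-order complexity bound that the dissertation's analysis concerns.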


On the Relationship Between Conjugate Gradient and Optimal First-Order Methods for Convex Optimization Related Books

First-Order Methods in Optimization
Language: en
Pages: 487
Authors: Amir Beck
Categories: Mathematics
Type: BOOK - Published: 2017-10-02 - Publisher: SIAM


The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale…
Convex Optimization
Language: en
Pages: 142
Authors: Sébastien Bubeck
Categories: Convex domains
Type: BOOK - Published: 2015-11-12 - Publisher: Foundations and Trends® in Machine Learning


This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. It begins with the fundamental theory of black-box…
Lectures on Convex Optimization
Language: en
Pages: 589
Authors: Yurii Nesterov
Categories: Mathematics
Type: BOOK - Published: 2018-11-19 - Publisher: Springer


This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics…
Proximal Algorithms
Language: en
Pages: 130
Authors: Neal Parikh
Categories: Mathematics
Type: BOOK - Published: 2013-11 - Publisher: Now Pub


Proximal Algorithms discusses proximal operators and proximal algorithms, and illustrates their applicability to standard and distributed convex optimization in…