ORCID

https://orcid.org/0000-0003-4838-101X

Date of Award

Spring 5-15-2016

Author's Department

Electrical & Systems Engineering

Degree Name

Doctor of Philosophy (PhD)

Degree Type

Dissertation

Abstract

In a world where data rates are growing faster than computing power, algorithmic acceleration based on developments in mathematical optimization plays a crucial role in narrowing the gap between the two. As optimization problems in many fields grow in scale, we need faster optimization methods that work well not only in theory but also in practice, by exploiting underlying state-of-the-art computing technology.

In this document, we introduce a unified framework for large-scale convex optimization using Jensen surrogates, an iterative optimization approach that has been used in different fields since the 1970s. After this general treatment, we present a non-asymptotic convergence analysis of this family of methods and the motivation behind developing accelerated variants. Moreover, we discuss widely used acceleration techniques for convex optimization, investigate which of them can be applied within the Jensen surrogate framework, and propose several novel acceleration methods. Furthermore, we show that the proposed methods perform competitively with, or better than, state-of-the-art algorithms for several applications, including Sparse Linear Regression (Image Deblurring), Positron Emission Tomography, X-Ray Transmission Tomography, Logistic Regression, Sparse Logistic Regression, and Automatic Relevance Determination for X-Ray Transmission Tomography.
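For readers unfamiliar with the Jensen surrogate family, a minimal sketch may help. The classical ML-EM iteration for Poisson data (the standard baseline for Positron Emission Tomography, one of the applications listed above) is a well-known instance of a Jensen-surrogate (majorize-minimize) method: at each iteration the log-likelihood is replaced by a separable surrogate obtained from Jensen's inequality, whose minimizer gives a multiplicative update. The sketch below is illustrative only, with synthetic data; it shows the classical unaccelerated update, not the accelerated variants proposed in the dissertation.

```python
import numpy as np

# Illustrative sketch: classical ML-EM for a Poisson linear model,
# a standard instance of a Jensen-surrogate (majorize-minimize) method.
# All quantities below are synthetic and chosen only for demonstration.

rng = np.random.default_rng(0)

m, n = 200, 50                       # measurements, unknowns
A = rng.random((m, n))               # nonnegative system matrix
x_true = rng.random(n) + 0.1         # nonnegative ground truth
y = rng.poisson(A @ x_true)          # Poisson-distributed measurements

x = np.ones(n)                       # positive initial estimate
col_sums = A.sum(axis=0)             # sensitivity terms, sum_i a_ij

for k in range(500):
    y_hat = A @ x                              # current mean estimate A x
    ratio = y / np.maximum(y_hat, 1e-12)       # data / model ratio
    # Jensen-surrogate (EM) multiplicative update:
    #   x_j <- (x_j / sum_i a_ij) * sum_i a_ij * y_i / (A x)_i
    x = x * (A.T @ ratio) / col_sums

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```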

Language

English (en)

Chair

Joseph A. O'Sullivan

Committee Members

David G. Politte, Arye Nehorai, Mark Anastasio, Jr-Shin Li

Comments

Permanent URL: https://doi.org/10.7936/K7K35RXN
