Abstract
In a world where data rates are growing faster than computing power, algorithmic acceleration grounded in advances in mathematical optimization plays a crucial role in narrowing the gap between the two. As optimization problems in many fields grow in scale, we need faster optimization methods that not only work well in theory but also perform well in practice by exploiting state-of-the-art computing technology.
In this document, we introduce a unified framework for large-scale convex optimization using Jensen surrogates, an iterative optimization technique that has been used across different fields since the 1970s. After this general treatment, we present a non-asymptotic convergence analysis of this family of methods and the motivation for developing accelerated variants. We then discuss widely used acceleration techniques for convex optimization, investigate those applicable within the Jensen surrogate framework, and propose several novel acceleration methods. Furthermore, we show that the proposed methods perform competitively with, or better than, state-of-the-art algorithms in several applications, including sparse linear regression (image deblurring), positron emission tomography, X-ray transmission tomography, logistic regression, sparse logistic regression, and automatic relevance determination for X-ray transmission tomography.
Committee Chair
Joseph A. O'Sullivan
Committee Members
David G. Politte, Arye Nehorai, Mark Anastasio, Jr-Shin Li
Degree
Doctor of Philosophy (PhD)
Author's Department
Electrical & Systems Engineering
Document Type
Dissertation
Date of Award
Spring 5-15-2016
Language
English (en)
DOI
https://doi.org/10.7936/K7K35RXN
Author's ORCID
https://orcid.org/0000-0003-4838-101X
Recommended Citation
Degirmenci, Soysal, "A General Framework of Large-Scale Convex Optimization Using Jensen Surrogates and Acceleration Techniques" (2016). McKelvey School of Engineering Theses & Dissertations. 157.
The definitive version is available at https://doi.org/10.7936/K7K35RXN