A New Interpolation Approach for Linearly Constrained Convex Optimization

Handle URI:
http://hdl.handle.net/10754/244891
Title:
A New Interpolation Approach for Linearly Constrained Convex Optimization
Authors:
Espinoza, Francisco
Abstract:
In this thesis we propose a new class of linearly constrained convex optimization methods based on a generalization of Shepard's interpolation formula. We prove properties of the resulting surface, such as the interpolation property at the boundary of the feasible region and the convergence of its gradient to the null space of the constraints at the boundary. We explore several descent techniques: steepest descent, two quasi-Newton methods, and Newton's method. We also implement several versions of the method in Matlab, particularly for the case of quadratic programming with bounded variables. Finally, we carry out performance tests against Matlab Optimization Toolbox methods for convex optimization and against implementations of the standard log-barrier and active-set methods. We conclude that steepest descent appears to be the best choice so far for our method and that the method is competitive with standard approaches both in performance and in empirical growth order.
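For reference, the classical Shepard (inverse-distance-weighting) interpolant that the thesis generalizes can be written as

    u(x) = \frac{\sum_i w_i(x)\, u_i}{\sum_i w_i(x)}, \qquad w_i(x) = \lVert x - x_i \rVert^{-p}, \quad p > 0,

where the x_i are the interpolation nodes and the u_i the prescribed values; as x approaches a node x_j, the weight w_j dominates and u(x) tends to u_j, which is the interpolation property in question. According to the abstract, the thesis generalizes this construction so that the surface interpolates at the boundary of the feasible region; the precise form of that generalization is defined in the thesis itself and is not reproduced here.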
Advisors:
Rockwood, Alyn
Committee Member:
Turkiyyah, George; Zhang, Xiangliang (ORCID: 0000-0002-3574-5665)
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Program:
Applied Mathematics and Computational Science
Issue Date:
Aug-2012
Type:
Thesis
Appears in Collections:
Applied Mathematics and Computational Science Program; Theses; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field | Value | Language
dc.contributor.advisor | Rockwood, Alyn | en
dc.contributor.author | Espinoza, Francisco | en
dc.date.accessioned | 2012-09-19T06:45:11Z | -
dc.date.available | 2012-09-19T06:45:11Z | -
dc.date.issued | 2012-08 | en
dc.identifier.uri | http://hdl.handle.net/10754/244891 | en
dc.description.abstract | In this thesis we propose a new class of linearly constrained convex optimization methods based on a generalization of Shepard's interpolation formula. We prove properties of the resulting surface, such as the interpolation property at the boundary of the feasible region and the convergence of its gradient to the null space of the constraints at the boundary. We explore several descent techniques: steepest descent, two quasi-Newton methods, and Newton's method. We also implement several versions of the method in Matlab, particularly for the case of quadratic programming with bounded variables. Finally, we carry out performance tests against Matlab Optimization Toolbox methods for convex optimization and against implementations of the standard log-barrier and active-set methods. We conclude that steepest descent appears to be the best choice so far for our method and that the method is competitive with standard approaches both in performance and in empirical growth order. | en
dc.language.iso | en | en
dc.subject | Optimization | en
dc.subject | Convex | en
dc.subject | Interpolation | en
dc.subject | Shepard's method | en
dc.subject | Linear constraints | en
dc.subject | Quadratic programming | en
dc.title | A New Interpolation Approach for Linearly Constrained Convex Optimization | en
dc.type | Thesis | en
dc.contributor.department | Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division | en
thesis.degree.grantor | King Abdullah University of Science and Technology | en_GB
dc.contributor.committeemember | Turkiyyah, George | en
dc.contributor.committeemember | Zhang, Xiangliang | en
thesis.degree.discipline | Applied Mathematics and Computational Science | en
thesis.degree.name | Master of Science | en
dc.person.id | 113224 | en
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.