Session Detail Information

Cluster: Nonlinear Programming

Session Information: Tuesday Jul 14, 13:10 - 14:40

Title: Algorithms for Large-Scale Nonlinear Optimization
Chair: Joshua Griffin, SAS Institute Inc., 100 SAS Campus Drive, Cary NC, United States of America, Joshua.Griffin@sas.com

Abstract Details

Title: Compact Representations of Quasi-Newton Matrices
 Presenting Author: Roummel Marcia, Associate Professor, University of California, Merced, 5200 N. Lake Road, Merced CA 95343, United States of America, rmarcia@ucmerced.edu
 Co-Author: Jennifer Erway, Associate Professor, Wake Forest University, Winston-Salem NC, United States of America, erwayjb@wfu.edu
 
Abstract: Very large systems of linear equations arising from quasi-Newton methods can be solved efficiently using the compact representation of the quasi-Newton matrices. In this paper, we present a compact formulation for the entire Broyden convex class of updates for limited-memory quasi-Newton methods and show how it can be used to solve large-scale trust-region subproblems with quasi-Newton Hessian approximations.
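For context, the classical compact representation of the BFGS matrix (due to Byrd, Nocedal, and Schnabel) illustrates the structure the abstract refers to; the paper's contribution is a formulation of this kind for the entire Broyden convex class, not just this special case:

```latex
B_k \;=\; B_0 \;-\;
\begin{bmatrix} B_0 S_k & Y_k \end{bmatrix}
\begin{bmatrix} S_k^T B_0 S_k & L_k \\ L_k^T & -D_k \end{bmatrix}^{-1}
\begin{bmatrix} S_k^T B_0 \\ Y_k^T \end{bmatrix},
```

where $S_k = [\,s_0 \cdots s_{k-1}\,]$ and $Y_k = [\,y_0 \cdots y_{k-1}\,]$ collect the stored update pairs, $D_k = \mathrm{diag}(s_i^T y_i)$, and $L_k$ is the strictly lower-triangular part of $S_k^T Y_k$. Because the correction to $B_0$ has rank at most $2k$ with small $k$ in the limited-memory setting, linear systems with $B_k$ can be solved via a small dense inverse rather than forming $B_k$ explicitly.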
  
Title: Extension of the Multi-Start Algorithm to Mixed Integer Nonlinear Programming
 Presenting Author: Tao Huang, SAS Institute Inc., 100 SAS Campus Drive, Cary NC 27513, United States of America, Tao.Huang@sas.com
 
Abstract: We present an implementation of the multi-start algorithm for continuous nonlinear optimization, extended to handle integer variables. Schemes to generate sample points under integer requirements are discussed. In cases where no feasible integer sample point is generated, an algorithm is proposed to seek feasible integer points. The properties of this integer-seeking algorithm are discussed. Our multi-start algorithm exploits parallelism in different phases of the algorithm, and as a result the solution times are drastically reduced. Preliminary numerical results are presented to show its efficacy.
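The talk's actual sampling schemes and integer-seeking algorithm are not given here; the following is a minimal sketch, under assumed details, of the basic multi-start idea with integer rounding of sampled starting points (the `objective`, bounds, and sample count are all illustrative, and a real implementation would launch a local NLP solve from each point rather than a bare function evaluation):

```python
import random

def objective(x):
    # Toy objective with a non-integer unconstrained minimizer,
    # so integer requirements actually matter.
    return (x[0] - 2.3) ** 2 + (x[1] + 1.7) ** 2

def sample_points(n, bounds, int_vars, seed=0):
    """Draw n starting points uniformly within bounds; coordinates
    listed in int_vars are rounded to satisfy integer requirements."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        for i in int_vars:
            p[i] = round(p[i])
        pts.append(p)
    return pts

def multi_start(n, bounds, int_vars):
    """Evaluate every sampled start and keep the best; in a real
    solver each start would seed an independent (parallelizable)
    local optimization with the integer coordinates fixed."""
    best = None
    for p in sample_points(n, bounds, int_vars):
        f = objective(p)
        if best is None or f < best[0]:
            best = (f, p)
    return best
```

The loop over starting points is embarrassingly parallel, which is the source of the speedups the abstract mentions.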
  
Title: Optimization on Riemannian Manifolds: Methods and Applications to Matrix Manifolds
 Presenting Author: Murugiah Muraleetharan, SAS Institute Inc., 100 SAS Campus Drive, Cary NC 27513, United States of America, M.Muraleetharan@sas.com
 
Abstract: We discuss Riemannian optimization methods for optimizing functions over manifolds. Algorithms such as steepest descent, nonlinear conjugate gradients, and Newton-based trust-region methods can be re-derived in the Riemannian setting and consequently applied to constrained optimization problems whose constraints can be interpreted as Grassmann and Stiefel manifolds. These manifolds represent the constraints that arise in such areas as singular value decomposition, matrix completion, and extreme eigenpairs of a symmetric matrix. Riemannian optimization methods lead to practical globally convergent algorithms that scale to large matrix problems while providing a gateway to modifying solution requirements on classical decompositions.
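As a small illustration of the Riemannian steepest-descent idea (not the speaker's implementation), the sketch below minimizes the Rayleigh quotient $x^T A x$ over the unit sphere, which recovers an extreme eigenpair of a symmetric matrix: the Euclidean gradient is projected onto the tangent space at the current iterate, and renormalization serves as the retraction. The step size and iteration count are illustrative choices:

```python
import math

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(x):
    n = math.sqrt(dot(x, x))
    return [xi / n for xi in x]

def rayleigh_descent(A, x0, step=0.1, iters=500):
    """Riemannian steepest descent on the unit sphere for f(x) = x^T A x.
    Minimizing f over the sphere yields the smallest eigenvalue of A."""
    x = normalize(x0)
    for _ in range(iters):
        g = matvec(A, x)                  # half the Euclidean gradient of f
        lam = dot(x, g)                   # Rayleigh quotient at x
        # Project the gradient onto the tangent space {v : x^T v = 0}.
        tg = [gi - lam * xi for gi, xi in zip(g, x)]
        # Take a step and retract back onto the sphere by renormalizing.
        x = normalize([xi - step * ti for xi, ti in zip(x, tg)])
    return x, dot(x, matvec(A, x))
```

Working intrinsically on the manifold in this way, rather than treating the norm constraint with penalties or Lagrange multipliers, is what makes these methods practical at scale.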