Convergence of the steepest descent method with line searches and uniformly convex objective in reflexive Banach spaces

Fernando Andrés Gallego, John Jairo Quintero, Juan Carlos Riano

Abstract


In this paper, we present algorithms for unconstrained convex optimization problems. The development and analysis of these methods are carried out in a Banach space setting. We begin by introducing a general framework for achieving global convergence without the Lipschitz condition on the gradient that is usually assumed in the literature. This paper extends to reflexive Banach spaces the existing convergence analyses of the steepest descent method for convex optimization, most of which are carried out in less general spaces.
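As a purely illustrative sketch (not the algorithm analyzed in the paper), the following Python snippet shows a standard steepest descent iteration with an Armijo backtracking line search in finite-dimensional Euclidean space; the names, tolerances, and the Armijo parameters `c` and `beta` are assumptions for illustration, whereas the paper's setting is a reflexive Banach space.

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Steepest descent with Armijo backtracking line search.

    Illustrative sketch only: this example lives in R^n with the
    Euclidean norm, not in the Banach space setting of the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                      # steepest descent direction
        t = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Example: a uniformly convex quadratic f(x) = 0.5 * ||x||^2
if __name__ == "__main__":
    f = lambda x: 0.5 * x.dot(x)
    grad = lambda x: x
    print(steepest_descent(f, grad, np.array([3.0, -4.0])))
```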

Keywords


uniformly convex functional, descent methods, step-size estimation, metric of the gradient


