Optimization in Banach Spaces

By: Alexander J. Zaslavski (English, Paperback)

The book is devoted to the study of constrained minimization problems on closed convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space, there are many algorithms for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm, each iteration consists of two steps: the first is the calculation of a gradient of the objective function, and the second is the calculation of a projection onto the feasible set. Each of these two steps introduces a computational error. Our recent research shows that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role here. When an optimization problem is considered in a general Banach space, the situation becomes more difficult and less well understood; on the other hand, such problems arise in approximation theory.

The book is of interest to mathematicians working in optimization and can also be useful in preparatory courses for graduate students. Its main feature, which appeals specifically to this audience, is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest to experts in applications of optimization to approximation theory.
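To make the two-step iteration described above concrete, here is a minimal sketch in a finite-dimensional (Euclidean) setting, where the projection is easy to compute. The box constraint, the quadratic objective, the step size, and the error model are illustrative assumptions, not taken from the book.

```python
import numpy as np

def project_onto_box(x, lower, upper):
    # Projection onto the box [lower, upper]^n, a closed convex set
    # whose metric projection has a simple closed form.
    return np.clip(x, lower, upper)

def inexact_gradient_projection(grad, x0, step, lower, upper,
                                n_iters=1000, delta=1e-6, seed=0):
    """Gradient projection with simulated computational errors.

    Both steps of each iteration (gradient evaluation and projection)
    are perturbed by a vector of norm at most `delta`, mimicking the
    bounded computational errors analyzed in the book.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Step 1: gradient step, contaminated by a small error e1.
        e1 = rng.normal(size=x.shape)
        e1 *= delta / max(np.linalg.norm(e1), 1e-12)
        y = x - step * (grad(x) + e1)
        # Step 2: inexact projection onto the feasible set (error e2).
        e2 = rng.normal(size=x.shape)
        e2 *= delta / max(np.linalg.norm(e2), 1e-12)
        x = project_onto_box(y, lower, upper) + e2
    return x

# Example: minimize f(x) = ||x - c||^2 over the box [0, 1]^3.
c = np.array([2.0, -1.0, 0.5])
x_approx = inexact_gradient_projection(lambda x: 2.0 * (x - c),
                                       x0=np.zeros(3), step=0.1,
                                       lower=0.0, upper=1.0)
print(x_approx)  # close to the exact minimizer [1, 0, 0.5]
```

With delta = 0 this reduces to the exact gradient projection method; the theme of the book is that the quality of the approximate solution degrades gracefully as the error bound grows.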

In this book the goal is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first, we discuss the several algorithms studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.



Normal price: 478 kr
Shipping: 39 kr (6-8 business days)
Packaging fee: 20 kr

Product details
Language: English
Pages: 126
ISBN-13: 9783031126437
Binding: Paperback
ISBN-10: 3031126432
Category: Numerical analysis
Publication date: 30 Sep 2022
Width: 155 mm
Height: 235 mm
Publisher: Springer International Publishing AG
Print run date: 30 Sep 2022
Author(s): Alexander J. Zaslavski