Recent Advances in Optimization Theory, Methods, and Applications in Science and Engineering 2021
1Shanghai University of Engineering Science, Shanghai, China
2Loyola University Maryland, Baltimore, USA
3Georgia Southern University, Statesboro, USA
4Nord University, Nesna, Norway
Description
Modern optimization theory and its associated methods have seen significant and rapid progress in recent decades. These advances have had an important impact on the development of many areas of science, engineering, and technology, as well as business and finance. One of the areas of optimization that has seen the strongest development, in both theory and methods, is convex conic optimization. Three major factors have contributed to this development. The first is that convex conic optimization is a unifying framework that contains important optimization problems, such as linear optimization, second-order cone optimization, and semidefinite optimization, as special cases. Second, convex conic optimization has combined Euclidean Jordan algebras and the related symmetric cones with optimization theory, leading to strong and significant research results and a still very active research area. Third, interior-point methods, which have in many ways revolutionized the theory and methods of mathematical programming, have been shown to be efficient algorithms for solving conic optimization problems, both theoretically and practically. Numerous applications in various fields, such as statistics, optimal experiment design, information and communication theory, electrical engineering, portfolio optimization, and combinatorial optimization, can be formulated as conic optimization problems and solved efficiently using appropriate interior-point methods.
The need to solve challenging large-scale optimization problems arising in various areas of science, engineering, and technology has led to breakthrough advances in numerical optimization, including first-order methods and augmented Lagrangian methods. These and other optimization methods have contributed to rapid development in many fields, including operations research, data science, data analytics, machine learning, and artificial intelligence. Significant progress has also been made in solving difficult and previously intractable problems, such as non-convex and/or non-symmetric optimization, nonlinear conic optimization, sparse optimization, and stochastic optimization, with applications in science and engineering. However, many challenges and open questions remain due to the size of these problems and the need to solve them efficiently.
The aim of this Special Issue is to provide a comprehensive collection of cutting-edge research contributions on optimization theory, methods, and applications in science and engineering. We welcome both original research articles and review articles.
Potential topics include but are not limited to the following:
- Optimization theory
- Linear and nonlinear optimization
- Interior-point methods and related topics
- First-order methods and related topics
- Sparse optimization
- Robust optimization
- Stochastic optimization
- Conic optimization
- Complementarity problems and variational inequalities
- Discrete and combinatorial optimization
- Applications of optimization theory and methods