Multi-spectral gradient method via variational technique under log-determinant norm for large-scale optimization
Main Authors: , , ,
Format: Article
Language: English
Published: Malaysian Mathematical Science Society, 2017
Online Access: http://psasir.upm.edu.my/id/eprint/62502/1/SPECTRAL.pdf
Summary: The spectral gradient method is popular because only the gradient of the objective function is required at each iterate. It is also more efficient than the quasi-Newton method, since no approximation of the second derivatives (the Hessian) needs to be stored, which matters especially when the dimension of the problem is large. In this paper, we propose a spectral gradient method derived via a variational technique under a log-determinant measure such that it satisfies the weaker secant equation. The corresponding variational problem is solved, and the Lagrange multiplier is approximated using the Newton-Raphson method within an interior-point scheme associated with the weaker secant relation. An executable code is developed to compare the efficiency of the proposed method with some standard conjugate gradient methods. Numerical results are presented which suggest that the proposed method performs better.
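The record does not reproduce the paper's derivation, but a minimal sketch of a generic spectral (Barzilai-Borwein-type) gradient iteration may help illustrate the point made in the summary that each iterate needs only the gradient and a scalar step length. This is an assumed illustration in Python, not the paper's multi-spectral log-determinant method; the function names and the quadratic test problem are made up for the example.

```python
# Minimal sketch of a spectral (Barzilai-Borwein-type) gradient method.
# Illustrates the generic idea only: each step uses just the gradient and
# a scalar "spectral" step length updated from a secant pair (s, y).
# NOT the paper's multi-spectral log-determinant variant.
import numpy as np

def spectral_gradient(f_grad, x0, max_iter=500, tol=1e-6):
    """Minimise a smooth function given only its gradient f_grad."""
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    alpha = 1.0                          # initial spectral step length
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:      # stop once the gradient is small
            break
        x_new = x - alpha * g            # gradient step scaled by the spectral parameter
        g_new = f_grad(x_new)
        s, y = x_new - x, g_new - g      # secant pair used to update the step
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1.0  # BB1 spectral step, safeguarded
        x, g = x_new, g_new
    return x

# Usage: minimise the convex quadratic f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
solution = spectral_gradient(lambda x: A @ x - b, np.zeros(3))
```

Only gradient evaluations appear in the loop; no Hessian or Hessian approximation is stored, which is the storage advantage over quasi-Newton methods highlighted in the summary.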