Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems

Stochastic gradient descent (SGD) is one of the most common algorithms for solving large unconstrained optimization problems. It builds on the classical gradient descent method, modifying how the gradient is selected: SGD uses random samples or mini-batches of the data set to compute the gradient in solvin...
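The record's full text is not reproduced here, so the paper's exact mean-gradient adaptive stepsize rule is unknown. As an illustrative sketch only, the following mini-batch SGD loop damps a base stepsize by a running mean of past gradient norms; the scaling rule, function names, and parameters are assumptions, not the authors' method.

```python
import numpy as np

def sgd_mean_adaptive(grad_fn, x0, data, batch_size=4, alpha0=0.1,
                      epochs=200, seed=0):
    """Mini-batch SGD whose stepsize is damped by the running mean
    gradient norm. Illustrative only; not the paper's exact scheme."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    mean_norm = 0.0  # running mean of ||g|| over all steps so far
    t = 0
    for _ in range(epochs):
        idx = rng.permutation(len(data))
        for start in range(0, len(data), batch_size):
            batch = data[idx[start:start + batch_size]]
            g = grad_fn(x, batch)
            t += 1
            # incremental update of the mean gradient norm
            mean_norm += (np.linalg.norm(g) - mean_norm) / t
            step = alpha0 / (1.0 + mean_norm)  # damped adaptive stepsize
            x -= step * g
    return x

# Usage: noiseless least squares, minimize mean of (a_i . x - b_i)^2
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 2))
x_true = np.array([2.0, -1.0])
b = A @ x_true
data = np.hstack([A, b[:, None]])  # pack rows as (a_i, b_i)

def grad(x, batch):
    Ab, bb = batch[:, :2], batch[:, 2]
    return 2.0 * Ab.T @ (Ab @ x - bb) / len(bb)

x_hat = sgd_mean_adaptive(grad, np.zeros(2), data)
print(np.round(x_hat, 2))
```

Because the synthetic problem is noiseless, every mini-batch gradient vanishes at the true solution, so the iterates converge to `x_true` despite the stochastic sampling.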


Bibliographic Details
Main Authors: Zulkifli, Munierah, Abd Rahmin, Nor Aliza, Wah, June Leong
Format: Article
Language: English
Published: Persatuan Sains Matematik Malaysia, 2023
Online Access: http://psasir.upm.edu.my/id/eprint/110372/1/document%20%284%29.pdf