Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems

Stochastic gradient descent (SGD) is one of the most common algorithms for solving large unconstrained optimization problems. It builds on the classical gradient descent method with a modification to how the gradient is selected: SGD uses random or batch data sets to compute the gradient in solving...
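The truncated abstract does not reproduce the authors' mean-gradient adaptive stepsize rule, so the sketch below is only a rough illustration of the general idea, not the paper's method: mini-batch SGD in which the stepsize is scaled by a running mean of past gradient norms. All function names, parameters, and the specific adaptation rule are assumptions made for this example.

```python
# Hypothetical sketch only: mini-batch SGD whose stepsize is adapted using a
# running mean of gradient norms. This is NOT the algorithm from the paper,
# whose abstract is truncated above.
import numpy as np

def sgd_mean_grad_stepsize(grad_fn, x0, data, batch_size=32,
                           base_lr=0.1, eps=1e-8, n_epochs=10, seed=0):
    """grad_fn(x, batch) -> gradient of the loss over `batch` at point x."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    mean_norm = 0.0   # running mean of gradient norms seen so far
    t = 0             # number of steps taken
    for _ in range(n_epochs):
        idx = rng.permutation(len(data))
        for start in range(0, len(data), batch_size):
            batch = data[idx[start:start + batch_size]]
            g = grad_fn(x, batch)
            t += 1
            # incremental update of the mean gradient norm
            mean_norm += (np.linalg.norm(g) - mean_norm) / t
            # stepsize shrinks when gradients have been large on average
            lr = base_lr / (mean_norm + eps)
            x -= lr * g
    return x
```

As a usage example, one could pass `grad_fn = lambda x, b: 2 * b.T @ (b @ x)` for a simple least-squares-style objective on row batches `b`; the point is only to show the shape of the interface, not to reproduce any experiment from the article.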


Bibliographic Details
Main Authors: Zulkifli, Munierah, Abd Rahmin, Nor Aliza, Wah, June Leong
Format: Article
Language: English
Published: Persatuan Sains Matematik Malaysia 2023
Online Access: http://psasir.upm.edu.my/id/eprint/110372/1/document%20%284%29.pdf