Details

Author(s) / Contributors
Title
Recent Advances in Stochastic Gradient Descent in Deep Learning
Is part of
  • Mathematics (Basel), 2023-01, Vol.11 (3), p.682
Place / Publisher
Basel: MDPI AG
Year of publication
2023
Source
EZB Electronic Journals Library
Descriptions / Notes
  • In the age of artificial intelligence, how best to handle huge amounts of data is a highly motivating and hard problem. Among machine learning models, stochastic gradient descent (SGD) is not only simple but also very effective. This study provides a detailed analysis of contemporary state-of-the-art deep learning applications, such as natural language processing (NLP), visual data processing, and voice and audio processing. It then introduces several versions of SGD and its variants, which are already implemented in the PyTorch optimizer package, including SGD, Adagrad, Adadelta, RMSprop, Adam, AdamW, and so on. Finally, we propose theoretical conditions under which these methods are applicable and find that a gap remains between the theoretical conditions under which the algorithms converge and their practical application; how to bridge this gap is a question for future work.
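
The optimizers the abstract enumerates (SGD, Adagrad, Adadelta, RMSprop, Adam, AdamW) are all available in PyTorch's torch.optim module. The sketch below is not part of the record or the paper; it merely instantiates each optimizer and performs a single stochastic gradient step. The toy linear model, learning rates, and random mini-batch are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy model standing in for a deep network (illustrative assumption).
    model = nn.Linear(10, 1)

    # The SGD variants named in the abstract, as exposed by torch.optim;
    # the learning rates here are arbitrary example values.
    optimizers = {
        "SGD":      torch.optim.SGD(model.parameters(), lr=0.1),
        "Adagrad":  torch.optim.Adagrad(model.parameters(), lr=0.01),
        "Adadelta": torch.optim.Adadelta(model.parameters()),
        "RMSprop":  torch.optim.RMSprop(model.parameters(), lr=0.01),
        "Adam":     torch.optim.Adam(model.parameters(), lr=1e-3),
        "AdamW":    torch.optim.AdamW(model.parameters(), lr=1e-3,
                                      weight_decay=0.01),
    }

    # One stochastic gradient step; in the plain SGD case this computes
    # theta_{t+1} = theta_t - eta * grad f_i(theta_t) on the mini-batch.
    opt = optimizers["Adam"]
    x, y = torch.randn(32, 10), torch.randn(32, 1)  # random mini-batch
    loss = F.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
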
Language
English
Identifiers
ISSN: 2227-7390
eISSN: 2227-7390
DOI: 10.3390/math11030682
Title ID: cdi_doaj_primary_oai_doaj_org_article_6561ed8b28104f0995d1a2db25d1ea2d
