Published: 2023-07-01

Development of Stochastic Gradient Descent with the Addition of a Fixed Variable

DOI: 10.35870/jtik.v7i3.840


Abstract

Stochastic Gradient Descent (SGD) is one of the most commonly used optimizers in deep learning. In this work, we modify SGD by adding a fixed variable and compare the performance of standard SGD against the modified version. The study proceeded in five phases: (1) optimization analysis, (2) design of the modification, (3) implementation of the modification, (4) testing of the modification, and (5) reporting. The results are intended to show the impact of the added fixed variable on the performance of SGD.
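The abstract does not state where the fixed variable enters the update rule, so the following is only a minimal sketch in Python: it contrasts the standard SGD step, w ← w − lr·∇f(w), with a hypothetical variant that folds a fixed constant c into every step. The helper names sgd_step and sgd_fixed_step, and the placement of c, are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def sgd_step(w, grad, lr):
    # Standard SGD update: w <- w - lr * grad
    return w - lr * grad

def sgd_fixed_step(w, grad, lr, c):
    # Hypothetical modified update with a fixed variable c folded
    # into every step: w <- w - lr * (grad + c). The paper does not
    # specify where the fixed variable enters, so this placement
    # is an assumption for illustration only.
    return w - lr * (grad + c)

# Compare both updates on f(w) = w^2 (gradient 2w) with noisy gradients.
rng = np.random.default_rng(0)
w_std, w_fix = 5.0, 5.0
for _ in range(100):
    noise = rng.normal(scale=0.1)
    w_std = sgd_step(w_std, 2 * w_std + noise, lr=0.05)
    w_fix = sgd_fixed_step(w_fix, 2 * w_fix + noise, lr=0.05, c=0.01)
print(f"standard SGD:  w = {w_std:.4f}")
print(f"SGD + fixed c: w = {w_fix:.4f}")
```

In this sketch the fixed term acts like a constant bias on the gradient, shifting the point the iterates settle around; whether the paper's variant behaves this way depends on its actual formulation.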

Keywords

Stochastic Gradient Descent (SGD); Modification; Performance

