Practice Questions
1 question
Q1. Which optimization algorithm is commonly used to update weights in neural networks?
A. K-means
B. Stochastic Gradient Descent
C. Principal Component Analysis
D. Random Forest
Answer: B. Stochastic Gradient Descent
Explanation: Stochastic Gradient Descent (SGD) is a popular optimization algorithm used to update weights in neural networks based on the gradient of the loss function.
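To make the update rule concrete, here is a minimal, framework-free sketch of SGD: at each step one random sample is drawn, the gradient of the loss for that sample is computed, and the weights move a small step opposite the gradient (w ← w − lr · ∇loss). The linear-regression setup, learning rate, and step count below are illustrative choices, not part of the question.

```python
import numpy as np

# Tiny linear-regression problem (illustrative data, not from the question)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # input features
true_w = np.array([1.5, -2.0, 0.5])            # weights we hope to recover
y = X @ true_w + 0.01 * rng.normal(size=100)   # targets with small noise

w = np.zeros(3)    # initial weights
lr = 0.05          # learning rate (assumed hyperparameter)
for step in range(1000):
    i = rng.integers(len(X))          # "stochastic": one random sample per step
    pred = X[i] @ w                   # model prediction for that sample
    grad = 2 * (pred - y[i]) * X[i]   # gradient of squared error w.r.t. w
    w -= lr * grad                    # SGD update: w <- w - lr * grad

print(np.round(w, 1))  # should be close to true_w
```

In practice, frameworks apply the same rule to every weight in the network, with the gradients supplied by backpropagation rather than computed by hand.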