📉Gradient descent updates the weights of a machine learning model to minimize loss and improve performance.
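The update rule behind this can be sketched on a toy problem. A minimal, self-contained sketch (the quadratic loss and all names here are illustrative, not from the notes above):

```python
# Minimal gradient descent sketch: minimize loss(w) = (w - 3)**2.
# The gradient is d(loss)/dw = 2 * (w - 3); each step moves the
# weight against the gradient, so the loss shrinks.

def gradient_descent(w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        grad = 2 * (w - 3)   # derivative of (w - 3)**2
        w -= lr * grad       # update rule: w <- w - lr * grad
    return w

w_final = gradient_descent()
print(round(w_final, 4))  # converges to 3.0, where the loss is minimal
```

The same `w -= lr * grad` step generalizes to vectors of weights in a real model; only the gradient computation gets more involved.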
✈️Gradient descent can be thought of as an airplane descending step by step to find the optimal landing path.
➗The sigmoid function is a common activation function that squashes outputs into the range (0, 1); during gradient descent, its derivative is used in backpropagation to compute the weight updates.
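The sigmoid and the derivative used in backpropagation can be written in a few lines (a standalone sketch, not tied to any particular framework):

```python
import math

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1 / (1 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative used during backpropagation: s(x) * (1 - s(x)).
    s = sigmoid(x)
    return s * (1 - s)

print(sigmoid(0))       # 0.5
print(sigmoid_grad(0))  # 0.25 (the derivative's maximum)
```

Because the derivative peaks at 0.25 and shrinks toward zero for large |x|, sigmoid gradients can vanish in deep networks, which is one reason the learning-rate choice below matters.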
📉🔄🔀Gradient descent follows the steepest downhill direction of the loss surface, tracing an efficient path to a minimum even when many routes are possible.
🔐By using a small learning rate, gradient descent keeps weight updates gradual, which stabilizes training and prevents overshooting the minimum.
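The effect of the learning rate can be demonstrated on a simple quadratic loss (a hypothetical toy setup; the thresholds are just for illustration):

```python
def run(lr, steps=50, w=5.0):
    # Minimize loss(w) = w**2 (gradient 2*w) with a fixed learning rate.
    for _ in range(steps):
        w -= lr * 2 * w
    return w

stable = run(lr=0.1)    # small step: w shrinks smoothly toward 0
unstable = run(lr=1.1)  # step too large: each update overshoots and grows
print(abs(stable) < 1e-3)   # True: converged
print(abs(unstable) > 1e3)  # True: diverged
```

With the small rate each step multiplies the weight by 0.8, so it decays toward the minimum; with the large rate the multiplier is -1.2, so the weight oscillates with growing magnitude instead of settling.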