📈Linear regression fits a straight line to the data set by minimizing a cost function (typically the mean squared error).
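A minimal sketch of that cost function, assuming the common mean-squared-error form for a line h(x) = θ₀ + θ₁·x (the function name `cost` and the toy data are my own illustration, not from the notes):

```python
def cost(theta0, theta1, xs, ys):
    """Mean squared error cost J(theta0, theta1), with the usual 1/(2m) factor."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

# A line that fits the data exactly (y = 2x) has zero cost.
print(cost(0.0, 2.0, [1, 2, 3], [2, 4, 6]))  # → 0.0
```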
⛰️The gradient descent algorithm iteratively finds a minimum of the cost function.
📉Gradient descent takes repeated steps downhill on the cost function, eventually converging to a local minimum (for linear regression's convex cost, this is also the global minimum).
🎢With a large learning rate, gradient descent may overshoot and oscillate around the minimum, yet it can still reduce the cost function at each step.
🔄Gradient descent updates the parameters (the θ values) simultaneously, subtracting the learning rate times the partial derivative of the cost function with respect to each parameter.
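The update rule above can be sketched as batch gradient descent for the single-feature model θ₀ + θ₁·x; this is my own minimal illustration (the function name, learning rate, and toy data are assumptions, not from the notes):

```python
def gradient_descent(xs, ys, alpha=0.1, steps=1000):
    """Batch gradient descent for the line theta0 + theta1 * x."""
    m = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(steps):
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of the 1/(2m) mean-squared-error cost.
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update: both gradients use the old theta values.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Toy data lying on y = 2x + 1; the fit converges near (1.0, 2.0).
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

Note the simultaneous update: both gradients are computed from the current parameters before either θ is changed, which is what the derivative-based rule requires.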