
Back To Basics, Part Dos: Linear Regression, Cost Function, and Gradient Descent | by Shreya Rao | Feb, 2023



[Figure: the equation of the curve is the equation used to calculate the MSE]
[Figure: in reality, we won’t know what the MSE curve looks like]
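For reference, the cost being minimized here is the Mean Squared Error, MSE = (1/n) Σ (yᵢ − predictedᵢ)². Below is a minimal sketch of computing it for a candidate intercept, assuming a one-feature model predicted = slope · x + intercept; the data points and the fixed slope value are made up purely for illustration, not numbers from the original article:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: the average squared difference
    # between the observed values and the predictions
    return np.mean((y_true - y_pred) ** 2)

# toy data, invented purely for illustration
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.5, 5.1, 6.8])

slope = 1.6       # held fixed in this one-parameter example
intercept = 0.0   # the value gradient descent will optimize
print(mse(y, slope * x + intercept))
```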

Step 1: Start with a random initial guess for the intercept value
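In code, Step 1 is nothing more than picking a starting value. A minimal sketch, assuming only the intercept is being optimized and that the range of the random guess is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(seed=42)   # seeded so the run is reproducible
intercept = rng.uniform(-1.0, 1.0)     # random initial guess for the intercept
print(f"initial guess for the intercept: {intercept:.3f}")
```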

Step 2: Calculate the gradient of the MSE curve at this point

[Figure: the gradient of the MSE curve when intercept = 0]
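A sketch of what this gradient calculation could look like in code: differentiating MSE = (1/n) Σ (yᵢ − (slope · xᵢ + intercept))² with respect to the intercept gives (−2/n) Σ (yᵢ − predictedᵢ). The data and slope are the same illustrative values used above:

```python
import numpy as np

def mse_gradient_wrt_intercept(x, y, slope, intercept):
    # derivative of (1/n) * sum((y - (slope * x + intercept)) ** 2)
    # with respect to the intercept
    predictions = slope * x + intercept
    return -2.0 * np.mean(y - predictions)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.5, 5.1, 6.8])

gradient = mse_gradient_wrt_intercept(x, y, slope=1.6, intercept=0.0)
print(f"gradient of the MSE curve at intercept = 0: {gradient:.3f}")
```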

Step 3: Calculate the Step Size and update the intercept value by using the Gradient and the Learning Rate

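The update itself is a one-liner: the Step Size is the Gradient multiplied by the Learning Rate, and the new intercept is the old intercept minus the Step Size. A tiny sketch with made-up numbers (the learning rate of 0.1 is illustrative, not a value from the original article):

```python
gradient = -3.2        # example gradient value carried over from Step 2 (made up)
intercept = 0.0        # current intercept value
learning_rate = 0.1    # small constant chosen by hand

step_size = gradient * learning_rate       # Step Size = Gradient x Learning Rate
intercept = intercept - step_size          # move opposite to the gradient
print(f"new intercept: {intercept:.2f}")   # 0.0 - (-0.32) = 0.32
```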

Step 4: Repeat steps 2–3

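Putting the four steps together gives the full loop. A sketch under the same assumptions as the snippets above (fixed slope, illustrative data, hand-picked learning rate and stopping threshold):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.5, 5.1, 6.8])
slope = 1.6

rng = np.random.default_rng(seed=42)
intercept = rng.uniform(-1.0, 1.0)     # Step 1: random initial guess
learning_rate = 0.1

for iteration in range(1000):                    # Step 4: repeat steps 2-3
    predictions = slope * x + intercept
    gradient = -2.0 * np.mean(y - predictions)   # Step 2: gradient of the MSE curve
    step_size = gradient * learning_rate         # Step 3: step size
    intercept -= step_size                       # Step 3: update the intercept
    if abs(step_size) < 0.001:                   # stop once the steps become tiny
        break

print(f"intercept after {iteration + 1} iterations: {intercept:.3f}")
```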

Why does finding the gradient actually work?

[Figure: the red line, or the gradient, is sloping downwards => a negative gradient]
[Figure: the red line, or the gradient, is sloping upwards => a positive gradient]
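A worked example of why the sign of the gradient is enough: the update rule new = old − learning rate × gradient always moves the intercept in the direction opposite the gradient, so a negative gradient (curve sloping down to the right) pushes the intercept to the right, toward the minimum, and a positive gradient pushes it to the left. With made-up numbers:

```python
learning_rate = 0.1

# negative gradient => the intercept increases (we move right, downhill)
intercept = 0.0
print(intercept - learning_rate * (-4.0))   # 0.4, moved to the right

# positive gradient => the intercept decreases (we move left, downhill)
intercept = 1.0
print(intercept - learning_rate * 4.0)      # 0.6, moved to the left
```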

How does Gradient Descent know when to stop taking steps?

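A common way to decide is to stop once the Step Size becomes negligibly small (near the minimum the gradient, and therefore the step, approaches zero) or after a maximum number of iterations. A sketch of such a check; the thresholds of 0.001 and 1000 are typical defaults, not values taken from the original article:

```python
def should_stop(step_size, iteration, tolerance=0.001, max_iterations=1000):
    # stop when the steps become negligibly small (gradient near zero)
    # or when the iteration cap is reached
    return abs(step_size) < tolerance or iteration >= max_iterations

print(should_stop(step_size=0.0004, iteration=250))   # True: the steps are tiny
print(should_stop(step_size=0.05, iteration=250))     # False: keep stepping
```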

What if the minimum point is not so easy to identify?


What if we are trying to find more than one optimal value?

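When both the slope and the intercept are being optimized, the same procedure applies, except the gradient becomes a pair of partial derivatives (one per parameter) and both values are updated on every step. A minimal sketch, again with illustrative data and hand-picked hyperparameters:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.5, 5.1, 6.8])

rng = np.random.default_rng(seed=0)
slope, intercept = rng.uniform(-1.0, 1.0, size=2)   # random initial guesses
learning_rate = 0.01

for iteration in range(10_000):
    errors = y - (slope * x + intercept)
    grad_slope = -2.0 * np.mean(errors * x)   # partial derivative w.r.t. the slope
    grad_intercept = -2.0 * np.mean(errors)   # partial derivative w.r.t. the intercept
    slope -= learning_rate * grad_slope
    intercept -= learning_rate * grad_intercept
    if max(abs(grad_slope), abs(grad_intercept)) * learning_rate < 0.0001:
        break

print(f"slope ~ {slope:.3f}, intercept ~ {intercept:.3f}")
```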

