Gradient descent is one of the most widely used algorithms in Machine Learning. It is a first-order iterative optimization algorithm for finding the minimum of a function: starting from an initial point, it repeatedly takes steps toward a local minimum.
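
As a reference, the standard gradient descent update rule can be written as follows, where θ denotes the parameter vector, α the learning rate, and J the cost function (this notation is assumed here, it is not defined elsewhere in this document):

$$\theta := \theta - \alpha \, \nabla_\theta J(\theta)$$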
# Example

Suppose we are at some point on the cost function and we want to minimize it. This can be achieved by taking the derivative of the cost function at that point.
The slope of the tangent gives the direction in which the value decreases, so the algorithm takes steps down the function.
The size of each step is determined by the learning rate.
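
Below is a minimal sketch of this procedure in Python; the function `f(x) = x**2` and its derivative `2*x` are assumed purely for illustration and are not taken from the text.

```python
def gradient_descent(df, x0, learning_rate=0.1, n_steps=100):
    """Repeatedly step against the derivative df, starting from x0."""
    x = x0
    for _ in range(n_steps):
        x = x - learning_rate * df(x)  # step size = learning rate * slope
    return x

# Example: minimize f(x) = x**2, whose derivative is 2*x.
x_min = gradient_descent(df=lambda x: 2 * x, x0=5.0)
print(x_min)  # approaches 0, the minimum of f
```

A smaller learning rate gives smaller, safer steps but needs more iterations; a larger one converges faster but can overshoot the minimum.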

# **Gradient descent for linear regression**
In this model, gradient descent is used to calculate the theta (parameter) vector. Once the parameter vector has been learned from the training examples, we can predict the output for unseen inputs.
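
Here is a minimal sketch in Python, assuming the usual squared-error cost for linear regression; the function names (`fit_linear_regression`, `predict`) and the synthetic data are illustrative, not from the original text.

```python
import numpy as np

def fit_linear_regression(X, y, learning_rate=0.01, n_iters=1000):
    """Learn the theta (parameter) vector with batch gradient descent."""
    m = X.shape[0]
    Xb = np.c_[np.ones(m), X]                   # prepend a bias column of ones
    theta = np.zeros(Xb.shape[1])               # start from all-zero parameters
    for _ in range(n_iters):
        gradient = Xb.T @ (Xb @ theta - y) / m  # derivative of the squared-error cost
        theta -= learning_rate * gradient       # step down the cost surface
    return theta

def predict(X, theta):
    """Predict outputs for new inputs using the learned theta vector."""
    return np.c_[np.ones(X.shape[0]), X] @ theta

# Tiny illustration: recover y = 2*x + 1 from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.1, size=50)
theta = fit_linear_regression(X, y, learning_rate=0.02, n_iters=5000)
print(theta)                               # roughly [1.0, 2.0]
print(predict(np.array([[4.0]]), theta))   # roughly [9.0]
```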