Hello, Hello & Hello.
I am very embarrassed right now!
I am writing this post after forgetting that I, too, have a blog where I once used to share my stories, my life experiences, my honest opinions, and thoughts on how we can make our lives better.
Things don't always go according to our wishes. I stopped writing, and honestly, I never thought I would come back to blogging again, but I believe destiny had other plans.
Well, I am happy to tell you that I am currently pursuing my Bachelor of Technology (B.Tech) in Computer Science and Artificial Intelligence.
So why start writing blogs again?
Well! Two reasons.
First, I want to document what I am doing right now, like coding, sharing my college experience, giving tips as a senior, and connecting with like-minded people in my community.
Second, though I left blogging, blogging hasn't left me. I still have that urge to share what I am going through in my day-to-day life, so I thought, why don't I give this a try and see where it goes.
So I'm not making promises, but I will surely start posting blogs again, and I am thinking of changing the niche from a motivational brand to a personal, opinion-based blog where I will share everything: my college experience, hackathons, coding, AI/ML, and also the stories and motivational content I used to share earlier.
Gradient Descent Algorithm
Parameters Of The Gradient Descent Algorithm
Like many other algorithms, this one revolves around two main parameters, and those are the slope and the intercept.
If you don't know what a slope and an intercept are, here it is in short: the slope tells you how much your line is tilted, and the intercept is the value on the y-axis where the line crosses it.
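A tiny made-up example may help here: for a line y = m*x + b, the slope m controls the tilt and the intercept b is where the line crosses the y-axis.

```python
# Hypothetical line y = m*x + b (values made up for illustration)
m, b = 2.0, 1.0  # slope 2, intercept 1

def line(x):
    return m * x + b

print(line(0))  # at x = 0 the line sits exactly at the intercept: 1.0
print(line(3))  # 2*3 + 1 = 7.0
```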
Have a look at the picture below.
Optimization Steps
So basically, in the gradient descent algorithm we do the following:
1. First, we take random values of 'm' and 'b'.
2. We then calculate the slopes (gradients) with respect to 'm' and 'b' respectively.
3. After calculating the slopes, we calculate the step size, which is equal to the product of the learning rate and the slope value.
4. Then we calculate the new values of 'm' and 'b', given by the formulas:
b_new = b_old - Step Size
m_new = m_old - Step Size
5. There are two things we have to define ourselves, and those are the learning rate and the epoch value. Let's discuss both of them a little.
The learning rate plays a vital role in the gradient descent algorithm: it balances the step size and leads us to the correct values of 'b' and 'm'. An inappropriate learning rate may make the steps too big, and the values will never converge.
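You can see this effect with a toy demo (the function and numbers here are made up just for illustration): minimizing f(x) = x², whose slope at any point x is 2*x. A small learning rate shrinks x toward the minimum; a too-large one overshoots and blows up.

```python
# Made-up demo: gradient descent on f(x) = x**2, slope = 2*x
def descend(learning_rate, steps=20, x=5.0):
    for _ in range(steps):
        x = x - learning_rate * 2 * x  # step size = learning rate * slope
    return x

print(descend(0.1))  # small rate: x moves steadily toward the minimum at 0
print(descend(1.1))  # too-large rate: each step overshoots and x diverges
```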
Once the values converge, we are good to go.
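The steps above can be sketched in plain Python. This is only a minimal illustration for simple linear regression with mean squared error; the data points, learning rate, and epoch count are made up for the example.

```python
import random

# Toy data lying roughly on y = 2x + 1 (made up for illustration)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.1, 4.9, 7.2, 9.0]

# Step 5: the two values we define ourselves
learning_rate = 0.01
epochs = 2000

# Step 1: start from random values of m and b
m, b = random.random(), random.random()

n = len(xs)
for _ in range(epochs):
    # Step 2: slopes (gradients) of the mean squared error w.r.t. m and b
    grad_m = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    grad_b = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    # Steps 3 & 4: step size = learning rate * slope, then update
    m = m - learning_rate * grad_m
    b = b - learning_rate * grad_b

print(round(m, 2), round(b, 2))  # lands near slope 2 and intercept 1
```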
Let me show you pictures of the code. I have also attached the link to my GitHub repo so you can view it live yourself and understand the concept in detail.
Check Out the GitHub repo here - Building Gradient Descent From Scratch