What is the difference between SGD and TGD?

Asked by dhanan_7781 in DevOps, Nov 24, 2023

I am currently working on a machine learning project and I am stuck choosing between stochastic gradient descent (SGD) and traditional gradient descent (TGD). What are the differences between them, and when should I choose SGD over TGD?

Answered by Dhananjay Singh

The main reason to choose stochastic gradient descent (SGD) over traditional (batch) gradient descent is that it handles large datasets well. When the size of the data makes full-batch processing impractical, SGD is especially useful. Batch gradient descent computes the gradient over the entire dataset before updating the parameters, whereas SGD updates the parameters using a single randomly chosen sample (or a small random subset) at each iteration.

This makes SGD much cheaper per update and more efficient for training on large datasets. Moreover, its stochastic nature introduces randomness into the updates, which can help the optimizer escape shallow local minima and often leads to better convergence in practice. A sketch contrasting the two update rules is shown below.
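Here is a minimal sketch (not from the original answer) contrasting the two update rules on a toy linear-regression problem in Python with NumPy; the data, learning rates, and function names are illustrative assumptions, not part of any specific library.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))          # 1000 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=1000)

def batch_gradient_descent(X, y, lr=0.1, epochs=100):
    """Traditional (batch) GD: one update per pass over the entire dataset."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        grad = (2.0 / n) * X.T @ (X @ w - y)   # gradient over all n samples
        w -= lr * grad
    return w

def stochastic_gradient_descent(X, y, lr=0.01, epochs=10):
    """SGD: one update per randomly chosen sample, many updates per epoch."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        for i in rng.permutation(n):            # shuffle, then visit one sample at a time
            xi, yi = X[i], y[i]
            grad = 2.0 * xi * (xi @ w - yi)     # gradient from a single sample
            w -= lr * grad
    return w

print("batch GD :", batch_gradient_descent(X, y))
print("SGD      :", stochastic_gradient_descent(X, y))

In practice, most frameworks use a compromise between the two: mini-batch SGD, which averages the gradient over a small random batch (for example 32 to 256 samples) per update.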

Join our data scientist course online to learn more concepts like this.


