What is Gradient Descent

Post by Administrator » Tue Jul 02, 2019 8:47 pm

Gradient descent is a method of loss optimization; for neural networks the gradients it needs are computed with backpropagation. The learning rate controls the size of each descent step and can be difficult to tune. This is how we train a neural network.
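Here is a minimal sketch of the idea in Python on a toy quadratic loss (my own example, not a real network): the update rule is w <- w - lr * dL/dw, where lr is the learning rate mentioned above.

# Toy loss L(w) = (w - 3)^2, minimized at w = 3
def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)   # derivative of the loss w.r.t. w

w = 0.0     # arbitrary starting point
lr = 0.1    # learning rate: too large and it diverges, too small and it crawls

for step in range(50):
    w = w - lr * grad(w)     # one gradient descent step

print(w, loss(w))            # w ends up very close to 3, the minimum

In a real network, w would be the weight tensors and grad(w) would come from backpropagation, but the update step is the same.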

Gradient descent is only guaranteed to reach a local minimum of the loss, not the global minimum. There are other methods that search for a global minimum, but they require much more processing power.
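To see the local-minimum behaviour concretely, here is a small sketch (again a toy example I made up) on a non-convex function with two minima; the starting point decides which one gradient descent falls into.

# f(x) = x^4 - 3x^2 + x has two minima: a deeper one near x = -1.3
# and a shallower (local) one near x = +1.1
def df(x):
    return 4 * x**3 - 6 * x + 1   # derivative of f

def descend(x, lr=0.01, steps=1000):
    for _ in range(steps):
        x -= lr * df(x)           # plain gradient descent
    return x

for start in (-2.0, 2.0):
    x = descend(start)
    print(f"start={start:+.1f} -> converged to x={x:+.3f}")

Starting at -2 it reaches the deeper minimum; starting at +2 it gets stuck in the shallower one, which is exactly the limitation described above.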

In practice, a local minimum is usually good enough.
