An IEEE team proposes AngularGrad, a novel optimization algorithm that exploits both the direction of the gradient and the angle between successive gradients. By tracking how sharply the gradient direction changes from one step to the next, the method dampens the zig-zag effect in the optimization trajectory and speeds up convergence.
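The paper's exact angular coefficient is not reproduced here, but the general idea can be sketched: measure the cosine of the angle between consecutive gradients and shrink the step size when they disagree. The snippet below is a minimal illustration under that assumption; the function name `angular_step` and the mapping from angle to step-size multiplier are placeholders for exposition, not the authors' formulation.

```python
import numpy as np

def angular_step(param, grad, prev_grad, state, lr=1e-3, eps=1e-8):
    """Illustrative angle-aware update (not the paper's exact rule)."""
    # Cosine of the angle between consecutive gradients: near 1 when
    # successive steps agree, negative when the trajectory zig-zags.
    cos_theta = float(np.dot(grad, prev_grad)) / (
        np.linalg.norm(grad) * np.linalg.norm(prev_grad) + eps
    )
    # Map the angle to a damping factor in (0, 1]: large for a smooth
    # trajectory, small for an oscillating one. This particular mapping
    # is a placeholder choice, not the paper's coefficient.
    angular_coef = 0.5 * (1.0 + cos_theta)
    # Smooth the coefficient over time so one noisy gradient does not
    # stall the optimizer.
    state["coef"] = 0.9 * state.get("coef", 1.0) + 0.1 * angular_coef
    return param - lr * state["coef"] * grad

# Toy usage: f(x) = 0.5 * x^T A x with an ill-conditioned A, the
# classic setting where plain gradient descent zig-zags.
A = np.diag([1.0, 25.0])
x, prev_g, state = np.array([1.0, 1.0]), np.ones(2), {}
for _ in range(500):
    g = A @ x
    x = angular_step(x, g, prev_g, state, lr=0.03)
    prev_g = g
print(x)  # approaches the minimum at the origin
```

The design intuition is that a large angle between consecutive gradients signals overshooting across a narrow valley, so scaling the step down in that regime trades a little speed on smooth stretches for stability where the trajectory oscillates.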

Here is a quick read: New IEEE Research Equips Gradient Descent with Angular Information to Boost DNN Training.

The paper AngularGrad: A New Optimization Technique for Angular Convergence of Convolutional Neural Networks is on arXiv.


