Exploding Gradient Problem
Meaning ⎊ Training issue where gradients grow exponentially, leading to numerical instability and weight divergence.
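A minimal sketch of the effect, with an illustrative fix: if each backprop step multiplies the gradient by a Jacobian whose norm exceeds 1, the gradient grows exponentially with depth; gradient clipping is a common mitigation. The matrix and constants here are assumptions chosen to make the blow-up visible, not from any specific model.

```python
import numpy as np

# Illustrative setup: each of 30 "layers" multiplies the gradient by a
# Jacobian with spectral norm 1.5 > 1, so the norm grows like 1.5**30.
grad = np.ones(4)
jacobian = 1.5 * np.eye(4)
for _ in range(30):
    grad = jacobian @ grad

# Common mitigation: rescale the gradient to a maximum norm before the
# weight update (gradient norm clipping).
def clip_by_norm(g, max_norm=1.0):
    norm = np.linalg.norm(g)
    return g * (max_norm / norm) if norm > max_norm else g

clipped = clip_by_norm(grad)
```

Clipping caps the update magnitude without changing the gradient's direction, which keeps training numerically stable.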
Vanishing Gradient Problem
Meaning ⎊ Training issue where gradients shrink to near zero, preventing deep network layers from updating their weights.
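A quick numerical sketch of why this happens with saturating activations: backprop multiplies one activation-derivative factor per layer, and the sigmoid's derivative never exceeds 0.25, so even in the best case the product shrinks exponentially with depth. The 20-layer depth here is just an illustrative assumption.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# sigmoid'(x) = s * (1 - s) peaks at 0.25 (at x = 0), so a 20-layer
# chain of these factors shrinks the gradient by at least 0.25**20.
grad = 1.0
for _ in range(20):
    s = sigmoid(0.0)       # best case: derivative is maximal here
    grad *= s * (1.0 - s)  # multiply in one layer's derivative
```

This is one reason non-saturating activations such as ReLU, plus residual connections, are preferred in deep networks.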
Stochastic Gradient Descent
Meaning ⎊ Variant of gradient descent that estimates the gradient from random data subsets (mini-batches), improving computational speed and helping escape shallow local minima.
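A minimal SGD sketch on least-squares regression, assuming a synthetic dataset and illustrative hyperparameters (batch size 16, learning rate 0.1): each step computes the gradient on a random mini-batch rather than the full dataset.

```python
import numpy as np

# Synthetic data: y = X @ true_w + small noise (illustrative assumption).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

w = np.zeros(3)
lr, batch = 0.1, 16
for step in range(500):
    idx = rng.choice(len(X), size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    # Mean-squared-error gradient on the mini-batch only.
    grad = 2.0 / batch * Xb.T @ (Xb @ w - yb)
    w -= lr * grad
```

The mini-batch gradient is a noisy but cheap estimate of the full gradient; that noise is also what lets SGD jump out of shallow local minima.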
Gradient Descent Optimization
Meaning ⎊ Iterative optimization method that minimizes a loss function by repeatedly stepping in the direction opposite its gradient.
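The update rule can be sketched on a one-dimensional toy loss; the function f(x) = (x - 3)**2 and learning rate 0.1 are illustrative assumptions, not from the source.

```python
# Gradient descent on f(x) = (x - 3)**2, whose gradient is 2 * (x - 3);
# the minimum is at x = 3. Each step moves against the gradient.
x = 0.0
lr = 0.1
for _ in range(100):
    x -= lr * 2.0 * (x - 3.0)  # x <- x - lr * f'(x)
```

Each step shrinks the distance to the minimum by a constant factor (here 0.8), so x converges toward 3.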
