-Plot a graph to check whether gradient descent is converging
The number of iterations required for gradient descent to converge varies from model to model and cannot be predicted in advance. Instead, we can plot the value of the cost function against the number of iterations to observe when the algorithm starts to converge.
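A minimal sketch of this idea: run batch gradient descent for linear regression on toy data, record the cost at every iteration, and then plot the recorded history. The data, function names, and hyperparameters here are illustrative assumptions, not part of the original notes.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, num_iters=50):
    """Batch gradient descent for linear regression.
    Records the cost J(theta) at every iteration so it can be plotted."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    cost_history = []
    for _ in range(num_iters):
        # Gradient step: theta := theta - (alpha/m) * X^T (X theta - y)
        theta -= (alpha / m) * (X.T @ (X @ theta - y))
        # Squared-error cost J(theta) = (1/2m) * sum((X theta - y)^2)
        cost = (1 / (2 * m)) * np.sum((X @ theta - y) ** 2)
        cost_history.append(cost)
    return theta, cost_history

# Toy data: y = 2x, with a bias column of ones.
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 2 * np.arange(5.0)
theta, costs = gradient_descent(X, y)
# To draw the graph described above, e.g. with matplotlib:
#   import matplotlib.pyplot as plt
#   plt.plot(range(1, len(costs) + 1), costs)
#   plt.xlabel("iteration"); plt.ylabel("cost J"); plt.show()
```

If the learning rate is appropriate, the plotted curve decreases at every iteration and flattens out as the algorithm converges.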
-Automatic testing methods
There are also automatic convergence tests, such as declaring convergence when the decrease in the cost function in one iteration falls below some small threshold (for example 0.001, meaning the cost is essentially no longer changing), but in practice it is usually better to look at a graph like the one described above.
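Such an automatic test can be sketched as follows: stop as soon as the per-iteration decrease in cost drops below a threshold epsilon. The function name, data, and default values are illustrative assumptions.

```python
import numpy as np

def gradient_descent_auto(X, y, alpha=0.1, epsilon=1e-3, max_iters=10000):
    """Gradient descent with an automatic convergence test:
    stop when the cost decreases by less than `epsilon` in one iteration."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    prev_cost = float("inf")
    for i in range(max_iters):
        theta -= (alpha / m) * (X.T @ (X @ theta - y))
        cost = (1 / (2 * m)) * np.sum((X @ theta - y) ** 2)
        if prev_cost - cost < epsilon:  # change ~0: declare convergence
            return theta, i + 1
        prev_cost = cost
    return theta, max_iters

X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 2 * np.arange(5.0)
theta, iters = gradient_descent_auto(X, y)
```

The difficulty the notes allude to is choosing epsilon: too large and the test fires before real convergence, too small and it may never fire, which is why inspecting the graph is usually preferred.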
-Summary
Usually you can try several candidate learning rates, compare how the cost function behaves for each, and tune gradually:
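One way to carry out this tuning is to run a short gradient descent for each candidate rate and compare the final costs. The candidate values below (spaced roughly 3x apart) follow a common heuristic from Andrew Ng's course, but the exact list, data, and function names here are illustrative assumptions.

```python
import numpy as np

def final_cost(X, y, alpha, num_iters=100):
    """Run `num_iters` gradient-descent steps and return the final cost."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        theta -= (alpha / m) * (X.T @ (X @ theta - y))
    return (1 / (2 * m)) * np.sum((X @ theta - y) ** 2)

X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 2 * np.arange(5.0)

# Candidate learning rates spaced roughly 3x apart.
for alpha in [0.001, 0.003, 0.01, 0.03, 0.1, 0.3]:
    print(f"alpha={alpha:>6}: final cost {final_cost(X, y, alpha):.6f}")
```

Too small a rate converges very slowly (cost barely moves), while too large a rate can make the cost oscillate or increase; a rate in between reaches a low cost quickly.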
Reference:
Andrew Ng (Wu Enda), Machine Learning course notes